Mark,

I have fixed a few things in the solver, and it is tested against the current master.
Can you write an MWE to reproduce the issue? Which versions of CUDA and CUSPARSE
are you using?
I was planning to reorganize the factorization code in AIJCUSPARSE in the next few days.
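If it helps to get started, below is a minimal sketch of the kind of reproducer I
have in mind (essentially a stripped-down src/ksp/ksp/examples/tutorials/ex1.c,
written from memory and untested here, so treat it as a starting point rather than
a verified reproducer): it assembles the 1-D Laplacian, manufactures a right-hand
side from an exact solution of all ones, solves, and prints the error norm, so a
cusparse run can be compared directly against plain aij.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b, u;
  KSP            ksp;
  PetscReal      norm;
  PetscInt       i, col[3], n = 10;
  PetscScalar    value[3];
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* 1-D Laplacian; -mat_type seqaijcusparse switches the storage at runtime */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  value[0] = -1.0; value[1] = 2.0; value[2] = -1.0;
  for (i = 1; i < n - 1; i++) {
    col[0] = i - 1; col[1] = i; col[2] = i + 1;
    ierr = MatSetValues(A, 1, &i, 3, col, value, INSERT_VALUES);CHKERRQ(ierr);
  }
  i = n - 1; col[0] = n - 2; col[1] = n - 1;
  ierr = MatSetValues(A, 1, &i, 2, col, value, INSERT_VALUES);CHKERRQ(ierr);
  i = 0; col[0] = 0; col[1] = 1; value[0] = 2.0; value[1] = -1.0;
  ierr = MatSetValues(A, 1, &i, 2, col, value, INSERT_VALUES);CHKERRQ(ierr);
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* exact solution u = 1, right-hand side b = A*u */
  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr); /* vecs inherit the mat's type */
  ierr = VecDuplicate(x, &u);CHKERRQ(ierr);
  ierr = VecSet(u, 1.0);CHKERRQ(ierr);
  ierr = MatMult(A, u, b);CHKERRQ(ierr);

  /* picks up -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type cusparse */
  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  /* error with respect to the exact solution */
  ierr = VecAXPY(x, -1.0, u);CHKERRQ(ierr);
  ierr = VecNorm(x, NORM_2, &norm);CHKERRQ(ierr);
  ierr = PetscPrintf(PETSC_COMM_WORLD, "Error norm %g\n", (double)norm);CHKERRQ(ierr);

  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = VecDestroy(&u);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}

Run it once with -ksp_type preonly -pc_type lu, and once adding -mat_type
seqaijcusparse -pc_factor_mat_solver_type cusparse; the two error norms should
agree to rounding. No -vec_type option should be needed, since the vectors are
created from the matrix and inherit its type.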

kl-18967:petsc zampins$ git grep "solver_type cusparse"
src/ksp/ksp/examples/tests/ex43.c:      args: -f ${DATAFILESPATH}/matrices/cfd.2.10 -mat_type seqaijcusparse -pc_factor_mat_solver_type cusparse -mat_cusparse_storage_format ell -vec_type cuda -pc_type ilu
src/ksp/ksp/examples/tests/ex43.c:      args: -f ${DATAFILESPATH}/matrices/shallow_water1 -mat_type seqaijcusparse -pc_factor_mat_solver_type cusparse -mat_cusparse_storage_format hyb -vec_type cuda -ksp_type cg -pc_type icc
src/ksp/ksp/examples/tests/ex43.c:      args: -f ${DATAFILESPATH}/matrices/cfd.2.10 -mat_type seqaijcusparse -pc_factor_mat_solver_type cusparse -mat_cusparse_storage_format csr -vec_type cuda -ksp_type bicg -pc_type ilu
src/ksp/ksp/examples/tests/ex43.c:      args: -f ${DATAFILESPATH}/matrices/cfd.2.10 -mat_type seqaijcusparse -pc_factor_mat_solver_type cusparse -mat_cusparse_storage_format csr -vec_type cuda -ksp_type bicg -pc_type ilu -pc_factor_mat_ordering_type nd
src/ksp/ksp/examples/tutorials/ex46.c:      args: -dm_mat_type aijcusparse -dm_vec_type cuda -random_exact_sol -pc_type ilu -pc_factor_mat_solver_type cusparse
src/ksp/ksp/examples/tutorials/ex59.c:     args: -subdomain_mat_type aijcusparse -physical_pc_bddc_dirichlet_pc_factor_mat_solver_type cusparse
src/ksp/ksp/examples/tutorials/ex7.c:      args: -ksp_monitor_short -mat_type aijcusparse -sub_pc_factor_mat_solver_type cusparse -vec_type cuda -sub_ksp_type preonly -sub_pc_type ilu
src/ksp/ksp/examples/tutorials/ex7.c:      args: -ksp_monitor_short -mat_type aijcusparse -sub_pc_factor_mat_solver_type cusparse -vec_type cuda -sub_ksp_type preonly -sub_pc_type ilu
src/ksp/ksp/examples/tutorials/ex7.c:      args: -ksp_monitor_short -mat_type aijcusparse -sub_pc_factor_mat_solver_type cusparse -vec_type cuda
src/ksp/ksp/examples/tutorials/ex7.c:      args: -ksp_monitor_short -mat_type aijcusparse -sub_pc_factor_mat_solver_type cusparse -vec_type cuda
src/ksp/ksp/examples/tutorials/ex71.c:   args: -pde_type Poisson -cells 7,9,8 -dim 3 -ksp_view -pc_bddc_coarse_redundant_pc_type svd -ksp_error_if_not_converged -pc_bddc_dirichlet_pc_type cholesky -pc_bddc_dirichlet_pc_factor_mat_solver_type cusparse -pc_bddc_dirichlet_pc_factor_mat_ordering_type nd -pc_bddc_neumann_pc_type cholesky -pc_bddc_neumann_pc_factor_mat_solver_type cusparse -pc_bddc_neumann_pc_factor_mat_ordering_type nd -matis_localmat_type aijcusparse
src/ksp/ksp/examples/tutorials/ex72.c:      args: -f0 ${DATAFILESPATH}/matrices/medium -ksp_monitor_short -ksp_view -mat_view ascii::ascii_info -mat_type aijcusparse -pc_factor_mat_solver_type cusparse -pc_type ilu -vec_type cuda
src/snes/examples/tutorials/ex12.c:      args: -matis_localmat_type aijcusparse -pc_bddc_dirichlet_pc_factor_mat_solver_type cusparse -pc_bddc_neumann_pc_factor_mat_solver_type cusparse

> On Apr 15, 2020, at 2:20 PM, Mark Adams <mfad...@lbl.gov> wrote:
> 
> I tried using a serial direct solver in cusparse and got bad numerics:
> 
> -vector_type cuda -mat_type aijcusparse -pc_factor_mat_solver_type cusparse 
> 
> Before I start debugging this I wanted to see if there are any known issues 
> that I should be aware of.
> 
> Thanks,
