Thanks a lot, Stefano. Yes, I am using 2nd-order Nedelec elements. Adding -pc_bddc_use_local_mat_graph 0 makes the code run, and I am now testing with more CPUs. I am running the code with

petsc-3.21.1/petsc/arch-linux-c-opt/bin/mpirun -n 4 ./app -pc_type bddc -pc_bddc_coarse_redundant_pc_type svd -pc_bddc_use_vertices -ksp_error_if_not_converged -mat_type is -ksp_monitor -ksp_converged_reason -ksp_rtol 1e-8 -ksp_gmres_restart 2000 -ksp_view -malloc_view -pc_bddc_use_local_mat_graph 0 -pc_bddc_neumann_pc_type gamg -pc_bddc_neumann_pc_gamg_esteig_ksp_max_it 10 -pc_bddc_neumann_approximate -pc_bddc_dirichlet_pc_type gamg -pc_bddc_dirichlet_pc_gamg_esteig_ksp_max_it 10 -pc_bddc_dirichlet_approximate
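For context: the operator is assembled as a MATIS matrix, which is what -mat_type is selects and what PCBDDC expects. A stripped-down sketch of that setup, with placeholder names such as l2g and N_global rather than my actual code, is:

#include <petscksp.h>

/* Sketch only, with placeholder names: assemble the operator as a MATIS
   matrix so that -pc_type bddc can be used. l2g is the local-to-global
   mapping of the subdomain dofs, N_global the global dof count. */
static PetscErrorCode AssembleOperatorMATIS(MPI_Comm comm, ISLocalToGlobalMapping l2g,
                                            PetscInt N_global, Mat *A)
{
  PetscFunctionBeginUser;
  PetscCall(MatCreateIS(comm, 1, PETSC_DECIDE, PETSC_DECIDE, N_global, N_global, l2g, l2g, A));
  /* element loop goes here, e.g.
     MatSetValuesLocal(*A, nr, rows, nc, cols, vals, ADD_VALUES) */
  PetscCall(MatAssemblyBegin(*A, MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(*A, MAT_FINAL_ASSEMBLY));
  PetscFunctionReturn(PETSC_SUCCESS);
}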
The residual dropped to 6e-5 very quickly and then continued to decrease very slowly. Do you have any suggestions to improve this? Will it be necessary to change the basis for BDDC in order to accelerate the convergence?
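From the reference you linked, my understanding is that the change of basis is driven by the discrete gradient operator, so presumably I would pass it to the preconditioner with something like the sketch below (G being a discrete gradient matrix I would have to build from my discretization; the flag values are my guesses and would need checking). Is that the intended usage, and should it be called before KSPSetUp?

#include <petscksp.h>

/* Sketch only: attach a user-built discrete gradient G (edge dofs x vertex
   dofs) to PCBDDC so it can perform the Nedelec change of basis. */
static PetscErrorCode AttachDiscreteGradient(KSP ksp, Mat G)
{
  PC pc;

  PetscFunctionBeginUser;
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCBDDC));
  /* arguments after G: Nedelec order (2 for my 2nd-order elements), field id
     (0, single field), and two flags for the global ordering of G and for a
     conforming discretization -- the PETSC_TRUE values here are assumptions */
  PetscCall(PCBDDCSetDiscreteGradient(pc, G, 2, 0, PETSC_TRUE, PETSC_TRUE));
  PetscFunctionReturn(PETSC_SUCCESS);
}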
In addition, I tried -pc_bddc_use_deluxe_scaling, but it errors out: deluxe scaling apparently ends up requesting a far larger global size (Global size overflow 3051678564) than my problem has. The full output is below.

Thanks,
Xiaodong

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Overflow in integer operation: https://petsc.org/release/faq/#64-bit-indices
[0]PETSC ERROR: Global size overflow 3051678564. You may consider ./configure PETSc with --with-64-bit-indices for the case you are running
[0]PETSC ERROR: WARNING! There are unused option(s) set! Could be the program crashed before usage or a spelling mistake, etc!
[0]PETSC ERROR: Option left: name:-ksp_converged_reason (no value) source: command line
[0]PETSC ERROR: Option left: name:-pc_bddc_coarse_redundant_pc_type value: svd source: command line
[0]PETSC ERROR: Option left: name:-pc_bddc_neumann_pc_gamg_esteig_ksp_max_it value: 10 source: command line
[0]PETSC ERROR: Option left: name:-pc_bddc_neumann_pc_type value: gamg source: command line
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.21.1, unknown
[0]PETSC ERROR: Configure options --with-cc=gcc --with-fc=gfortran --with-cxx=g++ --download-fblaslapack --download-mpich --with-scalar-type=complex --download-triangle --with-debugging=no
[0]PETSC ERROR: #1 PetscSplitOwnership() at /Documents/petsc-3.21.1/petsc/src/sys/utils/psplit.c:86
[0]PETSC ERROR: #2 PetscLayoutSetUp() at /Documents/petsc-3.21.1/petsc/src/vec/is/utils/pmap.c:244
[0]PETSC ERROR: #3 PetscLayoutCreateFromSizes() at /Documents/petsc-3.21.1/petsc/src/vec/is/utils/pmap.c:107
[0]PETSC ERROR: #4 ISGeneralSetIndices_General() at /Documents/petsc-3.21.1/petsc/src/vec/is/is/impls/general/general.c:569
[0]PETSC ERROR: #5 ISGeneralSetIndices() at /Documents/petsc-3.21.1/petsc/src/vec/is/is/impls/general/general.c:559
[0]PETSC ERROR: #6 ISCreateGeneral() at /Documents/petsc-3.21.1/petsc/src/vec/is/is/impls/general/general.c:530
[0]PETSC ERROR: #7 ISRenumber() at /Documents/petsc-3.21.1/petsc/src/vec/is/is/interface/index.c:198
[0]PETSC ERROR: #8 PCBDDCSubSchursSetUp() at /Documents/petsc-3.21.1/petsc/src/ksp/pc/impls/bddc/bddcschurs.c:646
[0]PETSC ERROR: #9 PCBDDCSetUpSubSchurs() at /Documents/petsc-3.21.1/petsc/src/ksp/pc/impls/bddc/bddcprivate.c:9348
[0]PETSC ERROR: #10 PCSetUp_BDDC() at /Documents/petsc-3.21.1/petsc/src/ksp/pc/impls/bddc/bddc.c:1564
[0]PETSC ERROR: #11 PCSetUp() at /Documents/petsc-3.21.1/petsc/src/ksp/pc/interface/precon.c:1079
[0]PETSC ERROR: #12 KSPSetUp() at /Documents/petsc-3.21.1/petsc/src/ksp/ksp/interface/itfunc.c:415
[0]PETSC ERROR: #13 KSPSolve_Private() at /Documents/petsc-3.21.1/petsc/src/ksp/ksp/interface/itfunc.c:831
[0]PETSC ERROR: #14 KSPSolve() at /Documents/petsc-3.21.1/petsc/src/ksp/ksp/interface/itfunc.c:1078

On Wed, Aug 14, 2024 at 11:54 AM Stefano Zampini <stefano.zamp...@gmail.com> wrote:

> Ok, the problem is that the default algorithm for detecting the connected
> components of the interface finds a lot of disconnected dofs.
> What discretization is this? Nedelec elements? Can you try using
> -pc_bddc_use_local_mat_graph 0?
> Also, you are using -pc_bddc_monolithic, but that flag aggregates different
> fields and you only have one field.
> Note that with Nedelec elements, you need a special change of basis for
> BDDC to work, see e.g.
> https://www.osti.gov/servlets/purl/1377770
>
> Il giorno mer 14 ago 2024 alle ore 05:15 neil liu <liufi...@gmail.com> ha
> scritto:
>
>> Hi, Stefano,
>>
>> Please see the attached for the smaller case (successful with BDDC),
>> and Error_largerMesh shows the error with the large mesh using the PETSc
>> debug mode.
>>
>> Thanks a lot,
>>
>> Xiaodong
>>
>> On Tue, Aug 13, 2024 at 5:47 PM Stefano Zampini <
>> stefano.zamp...@gmail.com> wrote:
>>
>>> Can you run the same options and add "-ksp_view -pc_bddc_check_level 1"
>>> for the smaller case? Also, can you send the full stack trace of the
>>> out-of-memory error using a debug version of PETSc?
>>> A note aside: you should not need -pc_bddc_use_vertices (which is on by
>>> default).
>>>
>>> Il giorno mar 13 ago 2024 alle ore 23:17 neil liu <liufi...@gmail.com>
>>> ha scritto:
>>>
>>>> Dear PETSc developers,
>>>>
>>>> I am testing PCBDDC for my vector-based FEM solver (complex system).
>>>> It works well on a coarse mesh (tetrahedral cells: 6,108; dofs: 39,596).
>>>> Then I tried a finer mesh (tetrahedral cells: 32,036; dofs: 206,362).
>>>> ASM works well with
>>>>
>>>> petsc-3.21.1/petsc/arch-linux-c-opt/bin/mpirun -n 4 ./app -pc_type asm
>>>> -ksp_converged_reason -ksp_monitor -ksp_gmres_restart 100 -ksp_rtol 1e-4
>>>> -pc_asm_overlap 4 -sub_pc_type ilu -malloc_view
>>>>
>>>> while PCBDDC eats up the memory (61 GB) when I try
>>>>
>>>> petsc-3.21.1/petsc/arch-linux-c-opt/bin/mpirun -n 4 ./app -pc_type bddc
>>>> -pc_bddc_coarse_redundant_pc_type ilu -pc_bddc_use_vertices
>>>> -ksp_error_if_not_converged -mat_type is -ksp_monitor -ksp_rtol 1e-8
>>>> -ksp_gmres_restart 30 -ksp_view -malloc_view -pc_bddc_monolithic
>>>> -pc_bddc_neumann_pc_type ilu -pc_bddc_dirichlet_pc_type ilu
>>>>
>>>> The errors below came out with BDDC. The memory usage reported for
>>>> PCBDDC (different from PCASM) is also listed (I am assuming the unit is
>>>> bytes, right?). Although BDDC requires more memory, it still seems
>>>> normal, right?
>>>>
>>>> [0]PETSC ERROR: --------------------- Error Message
>>>> --------------------------------------------------------------
>>>> [0]PETSC ERROR: Out of memory. This could be due to allocating
>>>> [0]PETSC ERROR: too large an object or bleeding by not properly
>>>> [0]PETSC ERROR: destroying unneeded objects.
>>>> [0] Maximum memory PetscMalloc()ed 30829727808 maximum size of entire
>>>> process 16899194880
>>>> [0] Memory usage sorted by function
>>>> ....
>>>> [0] 1 240 PCBDDCGraphCreate()
>>>> [0] 1 3551136 PCBDDCGraphInit()
>>>> [0] 2045 32720 PCBDDCGraphSetUp()
>>>> [0] 2 8345696 PCBDDCSetLocalAdjacencyGraph_BDDC()
>>>> [0] 1 784 PCCreate()
>>>> [0] 1 1216 PCCreate_BDDC()
>>>> ....
>>>>
>>>> Thanks for your help.
>>>>
>>>> Xiaodong
>>>>
>>>
>>> --
>>> Stefano
>>>
>>
>
> --
> Stefano