@Junchao: yes, all with my ex2f.F90 variation on two or three cores.
@Barry: it's really puzzling that you cannot reproduce. Can you try running it a dozen times in a row, and look at the report_performance.xml file? When it hangs I see some nan's, for instance here in the VecAXPY event:
  <events>
    <event>
      <name>VecAXPY</name>
      <time>
        <avgvalue>0.00610203</avgvalue>
        <minvalue>0.</minvalue>
        <maxvalue>0.0122041</maxvalue>
        <minloc>1</minloc>
        <maxloc>0</maxloc>
      </time>
      <ncalls>
        <avgvalue>0.5</avgvalue>
        <minvalue>0.</minvalue>
        <maxvalue>1.</maxvalue>
        <minloc>1</minloc>
        <maxloc>0</maxloc>
      </ncalls>
    </event>
    <event>
      <name>self</name>
      <time>
        <value>-nan.</value>
      </time>

This is what I did in my latest attempt on the login node of our Rocky Linux 9 cluster:
1) download petsc-3.23.4.tar.gz from the petsc website
2) ./configure -prefix=~/petsc/install --with-cxx=0 --with-debugging=0 --with-mpi-dir=/cm/shared/apps/mpich/ge/gcc/64/3.4.2
3) adjust my example to this version of petsc (file is attached)
4) make ex2f-cklaij-dbg-v2
5) mpirun -n 2 ./ex2f-cklaij-dbg-v2

So the exact versions are: petsc-3.23.4, system mpich 3.4.2, system gcc 11.5.0

________________________________________
From: Barry Smith <bsm...@petsc.dev>
Sent: Friday, July 11, 2025 11:22 PM
To: Klaij, Christiaan
Cc: Junchao Zhang; PETSc users list
Subject: Re: [petsc-users] problem with nested logging, standalone example

   And yet we cannot reproduce. Please tell us the exact PETSc version and MPI implementation versions. And reattach your reproducing example. And exactly how you run it. Can you reproduce it on an "ordinary" machine, say a Mac or Linux laptop.

   Barry

   If I could reproduce the problem here is how I would debug. I would use -start_in_debugger and then put break points in places which seem problematic. Presumably I would end up with a hang with each MPI process in a "different place" and from that I may be able to determine how that happened.

> On Jul 11, 2025, at 7:58 AM, Klaij, Christiaan <c.kl...@marin.nl> wrote:
>
> In summary for future reference:
> - tested 3 different machines, two at Marin, one at the national HPC
> - tested 3 different mpi implementations (intelmpi, openmpi and mpich)
> - tested openmpi in both release and debug
> - tested 2 different compilers (intel and gnu), both older and very recent versions
> - tested with the most basic config (./configure --with-cxx=0 --with-debugging=0 --download-mpich)
>
> All of these tests either segfault, hang, or error-out at the call to PetscLogView.
>
> Chris
>
> ________________________________________
> From: Klaij, Christiaan <c.kl...@marin.nl>
> Sent: Friday, July 11, 2025 10:10 AM
> To: Barry Smith; Junchao Zhang
> Cc: PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
> @Matt: no MPI errors indeed. I've tried with MPICH and I get the same hanging.
> @Barry: both stack traces aren't exactly the same, see a sample with MPICH below.
>
> If it cannot be reproduced at your side, I'm afraid this is another dead end.
> Thanks anyway, I really appreciate all your help.
>
> Chris
>
> (gdb) bt
> #0  0x000015555033bc2e in MPIDI_POSIX_mpi_release_gather_gather.constprop.0 () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #1  0x000015555033db8a in MPIDI_POSIX_mpi_allreduce_release_gather () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #2  0x000015555033e70f in MPIR_Allreduce () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #3  0x000015555033f22e in PMPI_Allreduce () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #4  0x0000155553f85d69 in MPIU_Allreduce_Count (comm=-2080374782, op=1476395020, dtype=1275072547, count=1, outbuf=0x7fffffffac70, inbuf=0x7fffffffac60) at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1839
> #5  MPIU_Allreduce_Private (inbuf=inbuf@entry=0x7fffffffac60, outbuf=outbuf@entry=0x7fffffffac70, count=count@entry=1, dtype=dtype@entry=1275072547, op=op@entry=1476395020, comm=-2080374782) at /home/cklaij/petsc/petsc-3.23.4/src/sys/objects/pinit.c:1869
> #6  0x0000155553f33dbe in PetscPrintXMLNestedLinePerfResults (viewer=viewer@entry=0x458890, name=name@entry=0x155554ef6a0d 'mbps\000', value=<optimized out>, minthreshold=minthreshold@entry=0, maxthreshold=maxthreshold@entry=0.01, minmaxtreshold=minmaxtreshold@entry=1.05) at /home/cklaij/petsc/petsc-3.23.4/src/sys/logging/handler/impls/nested/xmlviewer.c:255
>
> (gdb) bt
> #0  0x000015554fed3b17 in clock_gettime@GLIBC_2.2.5 () from /lib64/libc.so.6
> #1  0x0000155550b0de71 in ofi_gettime_ns () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #2  0x0000155550b0dec9 in ofi_gettime_ms () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #3  0x0000155550b2fab5 in sock_cq_sreadfrom () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #4  0x00001555505ca6f7 in MPIDI_OFI_progress () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #5  0x0000155550591fe9 in progress_test () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #6  0x00001555505924a3 in MPID_Progress_wait () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #7  0x000015555043463e in MPIR_Wait_state () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #8  0x000015555052ec49 in MPIC_Wait () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #9  0x000015555053093e in MPIC_Sendrecv () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #10 0x00001555504bf674 in MPIR_Allreduce_intra_recursive_doubling () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
> #11 0x00001555505b61de in MPIDI_OFI_mpi_finalize_hook () from /cm/shared/apps/mpich/ge/gcc/64/3.4.2/lib/libmpi.so.12
>
> ________________________________________
> From: Barry Smith <bsm...@petsc.dev>
> Sent: Thursday, July 10, 2025 11:10 PM
> To: Junchao Zhang
> Cc: Klaij, Christiaan; PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
>    I cannot reproduce
>
> On Jul 10, 2025, at 3:46 PM, Junchao Zhang <junchao.zh...@gmail.com> wrote:
>
> Adding -mca coll_hcoll_enable 0 didn't change anything at my end. Strange.
>
> --Junchao Zhang
>
>
> On Thu, Jul 10, 2025 at 3:39 AM Klaij, Christiaan <c.kl...@marin.nl> wrote:
> An additional clue perhaps: with the option OMPI_MCA_coll_hcoll_enable=0, the code does not hang but gives the error below.
>
> Chris
>
>
> $ mpirun -mca coll_hcoll_enable 0 -n 2 ./ex2f-cklaij-dbg -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always
> 0 KSP Residual norm 1.11803
> 1 KSP Residual norm 0.591608
> 2 KSP Residual norm 0.316228
> 3 KSP Residual norm < 1.e-11
> 0 KSP Residual norm 0.707107
> 1 KSP Residual norm 0.408248
> 2 KSP Residual norm < 1.e-11
> Norm of error < 1.e-12 iterations 3
> [1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [1]PETSC ERROR: General MPI error
> [1]PETSC ERROR: MPI error 1 MPI_ERR_BUFFER: invalid buffer pointer
> [1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
> [1]PETSC ERROR: Petsc Release Version 3.22.4, Mar 01, 2025
> [1]PETSC ERROR: ./ex2f-cklaij-dbg with 2 MPI process(es) and PETSC_ARCH on login1 by cklaij Thu Jul 10 10:33:33 2025
> [1]PETSC ERROR: Configure options: --prefix=/home/cklaij/ReFRESCO/trunk/install/extLibs --with-mpi-dir=/cm/shared/apps/openmpi/gcc/5.0.6-debug --with-x=0 --with-mpe=0 --with-debugging=0 --download-superlu_dist=https://updates.marin.nl/refresco/libs/superlu_dist-8.1.2.tar.gz --with-blaslapack-dir=/cm/shared/apps/oneapi/2024.2.1/mkl/2024.2 --download-parmetis=https://updates.marin.nl/refresco/libs/parmetis-4.0.3-p9.tar.gz --download-metis=https://updates.marin.nl/refresco/libs/metis-5.1.0-p11.tar.gz --with-packages-build-dir=/home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"
>
> [1]PETSC ERROR: #1 PetscLogNestedTreePrintLine() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:289
> [1]PETSC ERROR: #2 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:377
> [1]PETSC ERROR: #3 PetscLogNestedTreePrint() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:384
> [1]PETSC ERROR: #4 PetscLogNestedTreePrintTop() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:420
> [1]PETSC ERROR: #5 PetscLogHandlerView_Nested_XML() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/xmlviewer.c:443
> [1]PETSC ERROR: #6 PetscLogHandlerView_Nested() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/impls/nested/lognested.c:405
> [1]PETSC ERROR: #7 PetscLogHandlerView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/handler/interface/loghandler.c:342
> [1]PETSC ERROR: #8 PetscLogView() at /home/cklaij/ReFRESCO/trunk/build-extlibs/superbuild/petsc/src/src/sys/logging/plog.c:2040
> [1]PETSC ERROR: #9 ex2f-cklaij-dbg.F90:301
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
>   Proc: [[55228,1],1]
>   Errorcode: 98
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> --------------------------------------------------------------------------
> prterun has exited due to process rank 1 with PID 0 on node login1 calling
> "abort". This may have caused other processes in the application to be
> terminated by signals sent by prterun (as reported here).
> --------------------------------------------------------------------------
>
> ________________________________________
> dr. ir. Christiaan Klaij | senior researcher
> Research & Development | CFD Development
> T +31 317 49 33 44 | http://www.marin.nl
>
>
> From: Klaij, Christiaan <c.kl...@marin.nl>
> Sent: Thursday, July 10, 2025 10:15 AM
> To: Junchao Zhang
> Cc: PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
> Hi Junchao,
>
> Thanks for testing. I've fixed the error, but unfortunately that doesn't change the behavior: the code still hangs as before, with the same stack trace...
>
> Chris
>
> ________________________________________
> From: Junchao Zhang <junchao.zh...@gmail.com>
> Sent: Tuesday, July 8, 2025 10:58 PM
> To: Klaij, Christiaan
> Cc: PETSc users list
> Subject: Re: [petsc-users] problem with nested logging, standalone example
>
> Hi, Chris,
>   First, I had to fix an error in your test by adding "PetscCallA(MatSetFromOptions(AA,ierr))" at line 254.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Object is in wrong state
> [0]PETSC ERROR: Mat object's type is not set: Argument # 1
> ...
> [0]PETSC ERROR: #1 MatSetValues() at /scratch/jczhang/petsc/src/mat/interface/matrix.c:1503
> [0]PETSC ERROR: #2 ex2f.F90:258
>
> Then I could run the test without problems
> mpirun -n 2 ./ex2f -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always
> 0 KSP Residual norm 1.11803
> 1 KSP Residual norm 0.591608
> 2 KSP Residual norm 0.316228
> 3 KSP Residual norm < 1.e-11
> 0 KSP Residual norm 0.707107
> 1 KSP Residual norm 0.408248
> 2 KSP Residual norm < 1.e-11
> Norm of error < 1.e-12 iterations 3
>
> I used petsc-3.22.4, gcc-11.3, openmpi-5.0.6 and configured with
> ./configure --with-cc=gcc --with-cxx=g++ --with-fc=gfortran --download-openmpi --with-ssl=0 --with-shared-libraries=1 CFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " COPTFLAGS="-std=gnu11 -Wall -funroll-all-loops -O3 -DNDEBUG" CXXOPTFLAGS="-std=gnu++14 -Wall -funroll-all-loops -O3 -DNDEBUG " FCFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" F90FLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG" FOPTFLAGS="-Wall -funroll-all-loops -ffree-line-length-0 -Wno-maybe-uninitialized -Wno-target-lifetime -Wno-unused-function -O3 -DNDEBUG"
>
> Could you fix the error and retry?
>
> --Junchao Zhang
>
>
> On Sun, Jul 6, 2025 at 12:57 PM Klaij, Christiaan via petsc-users <petsc-users@mcs.anl.gov> wrote:
> Attached is a standalone example of the issue described in the earlier thread "problem with nested logging". The issue appeared somewhere between petsc 3.19.4 and 3.23.4.
>
> The example is a variation of ../ksp/tutorials/ex2f.F90, where I've added the nested log viewer with one event as well as the solution of a small system on rank zero.
>
> When running on multiple procs the example hangs during PetscLogView with the backtrace below. The configure.log is also attached in the hope that you can replicate the issue.
>
> Chris
>
>
> #0  0x000015554c84ea9e in mca_pml_ucx_recv (buf=0x7fffffff9e30, count=1, datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, src=1, tag=-12, comm=0x7f1e30, mpi_status=0x0) at pml_ucx.c:700
> #1  0x000015554c65baff in ompi_coll_base_allreduce_intra_recursivedoubling (sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630) at base/coll_base_allreduce.c:247
> #2  0x000015554c6a7e40 in ompi_coll_tuned_allreduce_intra_do_this (sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630, algorithm=3, faninout=0, segsize=0) at coll_tuned_allreduce_decision.c:142
> #3  0x000015554c6a054f in ompi_coll_tuned_allreduce_intra_dec_fixed (sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaec630) at coll_tuned_decision_fixed.c:216
> #4  0x000015554c68e160 in mca_coll_hcoll_allreduce (sbuf=0x7fffffff9e20, rbuf=0x7fffffff9e30, count=1, dtype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30, module=0xaecb80) at coll_hcoll_ops.c:217
> #5  0x000015554c59811a in PMPI_Allreduce (sendbuf=0x7fffffff9e20, recvbuf=0x7fffffff9e30, count=1, datatype=0x15554c9ef900 <ompi_mpi_2dblprec>, op=0x15554ca28980 <ompi_mpi_op_maxloc>, comm=0x7f1e30) at allreduce.c:123
> #6  0x0000155553eabede in MPIU_Allreduce_Private () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #7  0x0000155553e50d08 in PetscPrintXMLNestedLinePerfResults () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #8  0x0000155553e5123e in PetscLogNestedTreePrintLine () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #9  0x0000155553e51f3a in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #10 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #11 0x0000155553e51e96 in PetscLogNestedTreePrint () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #12 0x0000155553e52142 in PetscLogNestedTreePrintTop () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #13 0x0000155553e5257b in PetscLogHandlerView_Nested_XML () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #14 0x0000155553e4e5a0 in PetscLogHandlerView_Nested () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #15 0x0000155553e56232 in PetscLogHandlerView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #16 0x0000155553e588c3 in PetscLogView () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #17 0x0000155553e40eb5 in petsclogview_ () from /home/cklaij/ReFRESCO/trunk/install/extLibs/lib/libpetsc.so.3.22
> #18 0x0000000000402c8b in MAIN__ ()
> #19 0x00000000004023df in main ()
> dr. ir. Christiaan Klaij | senior researcher
> Research & Development | CFD Development
> T +31 317 49 33 44 | http://www.marin.nl
>
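
For orientation before the full attached source below, here is a condensed sketch (not the attachment itself) of just the nested-logging additions it makes on top of the stock ex2f.F90: PetscLogNestedBegin plus one registered event, then PetscLogView to an XML viewer. The include, module, declarations and call sequence mirror the attachment; the event body is left empty here, whereas the attachment times KSPSolve inside it.

      program mini_nested_log
#include <petsc/finclude/petscksp.h>
      use petscksp
      implicit none

      PetscErrorCode ierr
      PetscClassId   classid
      PetscInt       petscEventNo(1)
      PetscViewer    viewer

      PetscCallA(PetscInitialize(PETSC_NULL_CHARACTER,ierr))

!  Switch on nested logging and register one event,
!  with classid and petscEventNo declared as in the attached example
      PetscCallA(PetscLogNestedBegin(ierr))
      PetscCallA(PetscLogEventRegister("myFirstEvent",classid,petscEventNo(1),ierr))

      PetscCallA(PetscLogEventBegin(petscEventNo(1),ierr))
!     ... work to be timed goes here (KSPSolve in the attached example) ...
      PetscCallA(PetscLogEventEnd(petscEventNo(1),ierr))

!  Write the nested log in XML form, as the attachment does just before cleanup;
!  this is the PetscLogView call where the reported hang/error occurs
      PetscCallA(PetscViewerASCIIOpen(PETSC_COMM_WORLD,'report_performance.xml',viewer,ierr))
      PetscCallA(PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_XML,ierr))
      PetscCallA(PetscLogView(viewer,ierr))
      PetscCallA(PetscViewerDestroy(viewer,ierr))

      PetscCallA(PetscFinalize(ierr))
      end program mini_nested_log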
!
!  Description: Solves a linear system in parallel with KSP (Fortran code).
!               Also shows how to set a user-defined monitoring routine.
!
! -----------------------------------------------------------------------

      program main
#include <petsc/finclude/petscksp.h>
      use petscksp
      implicit none
!
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!                   Variable declarations
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!
!  Variables:
!     ksp      - linear solver context
!     ksp      - Krylov subspace method context
!     pc       - preconditioner context
!     x, b, u  - approx solution, right-hand side, exact solution vectors
!     A        - matrix that defines linear system
!     its      - iterations for convergence
!     norm     - norm of error in solution
!     rctx     - random number generator context
!
!  Note that vectors are declared as PETSc "Vec" objects.  These vectors
!  are mathematical objects that contain more than just an array of
!  double precision numbers. I.e., vectors in PETSc are not just
!        double precision x(*).
!  However, local vector data can be easily accessed via VecGetArray().
!  See the Fortran section of the PETSc users manual for details.
!
      PetscReal      norm
      PetscInt       i,j,II,JJ,m,n,its,petscEventNo(1)
      PetscInt       Istart,Iend,ione,col(3)
      PetscErrorCode ierr
      PetscMPIInt    rank,size
      PetscBool      flg
      PetscScalar    v,one,neg_one,val(3)
      Vec            x,b,u, xx, bb, uu
      Mat            A, AA
      KSP            ksp, kksp
      PetscRandom    rctx
      PetscViewerAndFormat vzero
!      PetscViewerAndFormat vf
      PetscClassId   classid
      PetscViewer    viewer

!  These variables are not currently used.
!      PC          pc
!      PCType      ptype
!      PetscReal   tol

!  Note: Any user-defined Fortran routines (such as MyKSPMonitor)
!  MUST be declared as external.
      external MyKSPMonitor,MyKSPConverged

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!                   Beginning of program
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

      PetscCallA(PetscInitialize(PETSC_NULL_CHARACTER,ierr))
      m = 3
      n = 3
      one     = 1.0
      neg_one = -1.0
      ione    = 1

      PetscCallA(PetscLogNestedBegin(ierr))
      PetscCallA(PetscLogEventRegister("myFirstEvent",classid,petscEventNo(1),ierr))

      PetscCallA(PetscOptionsGetInt(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-m',m,flg,ierr))
      PetscCallA(PetscOptionsGetInt(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-n',n,flg,ierr))
      PetscCallMPIA(MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr))
      PetscCallMPIA(MPI_Comm_size(PETSC_COMM_WORLD,size,ierr))

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!      Compute the matrix and right-hand-side vector that define
!      the linear system, Ax = b.
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

!  Create parallel matrix, specifying only its global dimensions.
!  When using MatCreate(), the matrix format can be specified at
!  runtime. Also, the parallel partitioning of the matrix is
!  determined by PETSc at runtime.

      PetscCallA(MatCreate(PETSC_COMM_WORLD,A,ierr))
      PetscCallA(MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,m*n,m*n,ierr))
      PetscCallA(MatSetFromOptions(A,ierr))
      PetscCallA(MatSetUp(A,ierr))

!  Currently, all PETSc parallel matrix formats are partitioned by
!  contiguous chunks of rows across the processors.  Determine which
!  rows of the matrix are locally owned.

      PetscCallA(MatGetOwnershipRange(A,Istart,Iend,ierr))

!  Set matrix elements for the 2-D, five-point stencil in parallel.
!   - Each processor needs to insert only elements that it owns
!     locally (but any non-local elements will be sent to the
!     appropriate processor during matrix assembly).
!   - Always specify global row and columns of matrix entries.
!   - Note that MatSetValues() uses 0-based row and column numbers
!     in Fortran as well as in C.

!     Note: this uses the less common natural ordering that orders first
!     all the unknowns for x = h then for x = 2h etc; Hence you see JH = II +- n
!     instead of JJ = II +- m as you might expect. The more standard ordering
!     would first do all variables for y = h, then y = 2h etc.

      do 10, II=Istart,Iend-1
        v = -1.0
        i = II/n
        j = II - i*n
        if (i.gt.0) then
          JJ = II - n
          PetscCallA(MatSetValues(A,ione,[II],ione,[JJ],[v],INSERT_VALUES,ierr))
        endif
        if (i.lt.m-1) then
          JJ = II + n
          PetscCallA(MatSetValues(A,ione,[II],ione,[JJ],[v],INSERT_VALUES,ierr))
        endif
        if (j.gt.0) then
          JJ = II - 1
          PetscCallA(MatSetValues(A,ione,[II],ione,[JJ],[v],INSERT_VALUES,ierr))
        endif
        if (j.lt.n-1) then
          JJ = II + 1
          PetscCallA(MatSetValues(A,ione,[II],ione,[JJ],[v],INSERT_VALUES,ierr))
        endif
        v = 4.0
        PetscCallA(MatSetValues(A,ione,[II],ione,[II],[v],INSERT_VALUES,ierr))
 10   continue

!  Assemble matrix, using the 2-step process:
!       MatAssemblyBegin(), MatAssemblyEnd()
!  Computations can be done while messages are in transition,
!  by placing code between these two statements.

      PetscCallA(MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY,ierr))
      PetscCallA(MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY,ierr))

!  Create parallel vectors.
!   - Here, the parallel partitioning of the vector is determined by
!     PETSc at runtime. We could also specify the local dimensions
!     if desired -- or use the more general routine VecCreate().
!   - When solving a linear system, the vectors and matrices MUST
!     be partitioned accordingly.  PETSc automatically generates
!     appropriately partitioned matrices and vectors when MatCreate()
!     and VecCreate() are used with the same communicator.
!   - Note: We form 1 vector from scratch and then duplicate as needed.

      PetscCallA(VecCreateFromOptions(PETSC_COMM_WORLD,PETSC_NULL_CHARACTER,ione,PETSC_DECIDE,m*n,u,ierr))
      PetscCallA(VecSetFromOptions(u,ierr))
      PetscCallA(VecDuplicate(u,b,ierr))
      PetscCallA(VecDuplicate(b,x,ierr))

!  Set exact solution; then compute right-hand-side vector.
!  By default we use an exact solution of a vector with all
!  elements of 1.0;  Alternatively, using the runtime option
!  -random_sol forms a solution vector with random components.

      PetscCallA(PetscOptionsHasName(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-random_exact_sol',flg,ierr))
      if (flg) then
        PetscCallA(PetscRandomCreate(PETSC_COMM_WORLD,rctx,ierr))
        PetscCallA(PetscRandomSetFromOptions(rctx,ierr))
        PetscCallA(VecSetRandom(u,rctx,ierr))
        PetscCallA(PetscRandomDestroy(rctx,ierr))
      else
        PetscCallA(VecSet(u,one,ierr))
      endif
      PetscCallA(MatMult(A,u,b,ierr))

!  View the exact solution vector if desired

      PetscCallA(PetscOptionsHasName(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-view_exact_sol',flg,ierr))
      if (flg) then
        PetscCallA(VecView(u,PETSC_VIEWER_STDOUT_WORLD,ierr))
      endif

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!         Create the linear solver and set various options
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

!  Create linear solver context

      PetscCallA(KSPCreate(PETSC_COMM_WORLD,ksp,ierr))

!  Set operators. Here the matrix that defines the linear system
!  also serves as the matrix from which the preconditioner is constructed.

      PetscCallA(KSPSetOperators(ksp,A,A,ierr))

!  Set linear solver defaults for this problem (optional).
!   - By extracting the KSP and PC contexts from the KSP context,
!     we can then directly call any KSP and PC routines
!     to set various options.
!   - The following four statements are optional; all of these
!     parameters could alternatively be specified at runtime via
!     KSPSetFromOptions(). All of these defaults can be
!     overridden at runtime, as indicated below.

!     We comment out this section of code since the Jacobi
!     preconditioner is not a good general default.

!      PetscCallA(KSPGetPC(ksp,pc,ierr))
!      ptype = PCJACOBI
!      PetscCallA(PCSetType(pc,ptype,ierr))
!      tol = 1.e-7
!      PetscCallA(KSPSetTolerances(ksp,tol,PETSC_CURRENT_REAL,PETSC_CURRENT_REAL,PETSC_CURRENT_INTEGER,ierr))

!  Set user-defined monitoring routine if desired

      PetscCallA(PetscOptionsHasName(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-my_ksp_monitor',flg,ierr))
      if (flg) then
        vzero = 0
        PetscCallA(KSPMonitorSet(ksp,MyKSPMonitor,vzero,PETSC_NULL_FUNCTION,ierr))
!
!     Cannot also use the default KSP monitor routine showing how it may be used from Fortran
!     since the Fortran compiler thinks the calling arguments are different in the two cases
!
!        PetscCallA(PetscViewerAndFormatCreate(PETSC_VIEWER_STDOUT_WORLD,PETSC_VIEWER_DEFAULT,vf,ierr))
!        PetscCallA(KSPMonitorSet(ksp,KSPMonitorResidual,vf,PetscViewerAndFormatDestroy,ierr))
      endif

!  Set runtime options, e.g.,
!      -ksp_type <type> -pc_type <type> -ksp_monitor -ksp_rtol <rtol>
!  These options will override those specified above as long as
!  KSPSetFromOptions() is called _after_ any other customization
!  routines.

      PetscCallA(KSPSetFromOptions(ksp,ierr))

!  Set convergence test routine if desired

      PetscCallA(PetscOptionsHasName(PETSC_NULL_OPTIONS,PETSC_NULL_CHARACTER,'-my_ksp_convergence',flg,ierr))
      if (flg) then
        PetscCallA(KSPSetConvergenceTest(ksp,MyKSPConverged,0,PETSC_NULL_FUNCTION,ierr))
      endif
!
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!                      Solve the linear system
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

      PetscCallA(PetscLogEventBegin(petscEventNo(1),ierr))
      PetscCallA(KSPSolve(ksp,b,x,ierr))
      PetscCallA(PetscLogEventEnd(petscEventNo(1),ierr))

!  Solve small system on master

      if (rank .eq. 0) then
        PetscCallA(MatCreate(PETSC_COMM_SELF,AA,ierr))
        PetscCallA(MatSetSizes(AA,PETSC_DECIDE,PETSC_DECIDE,m,m,ierr))
        PetscCallA(MatSetFromOptions(AA,ierr))
        val = [-1.0, 2.0, -1.0]
        PetscCallA(MatSetValues(AA,1,[0],2,[0,1],val(2:3),INSERT_VALUES,ierr))
        do i=1,m-2
          col = [i-1, i, i+1]
          PetscCallA(MatSetValues(AA,1,[i],3,col,val,INSERT_VALUES,ierr))
        end do
        PetscCallA(MatSetValues(AA,1,[m-1],2,[m-2,m-1],val(1:2),INSERT_VALUES,ierr))
        PetscCallA(MatAssemblyBegin(AA,MAT_FINAL_ASSEMBLY,ierr))
        PetscCallA(MatAssemblyEnd(AA,MAT_FINAL_ASSEMBLY,ierr))
        PetscCallA(VecCreate(PETSC_COMM_SELF,xx,ierr))
        PetscCallA(VecSetSizes(xx,PETSC_DECIDE,m,ierr))
        PetscCallA(VecSetFromOptions(xx,ierr))
        PetscCallA(VecDuplicate(xx,bb,ierr))
        PetscCallA(VecDuplicate(xx,uu,ierr))
        PetscCallA(VecSet(uu,one,ierr))
        PetscCallA(MatMult(AA,uu,bb,ierr))
        PetscCallA(KSPCreate(PETSC_COMM_SELF,kksp,ierr))
        PetscCallA(KSPSetOperators(kksp,AA,AA,ierr))
        PetscCallA(KSPSetFromOptions(kksp,ierr))
        PetscCallA(KSPSolve(kksp,bb,xx,ierr))
      end if

!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
!                     Check solution and clean up
!  - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

!  Check the error
      PetscCallA(VecAXPY(x,neg_one,u,ierr))
      PetscCallA(VecNorm(x,NORM_2,norm,ierr))
      PetscCallA(KSPGetIterationNumber(ksp,its,ierr))
      if (rank .eq. 0) then
        if (norm .gt. 1.e-12) then
          write(6,100) norm,its
        else
          write(6,110) its
        endif
      endif
 100  format('Norm of error ',e11.4,' iterations ',i5)
 110  format('Norm of error < 1.e-12 iterations ',i5)
!  nested log view
      PetscCallA(PetscViewerASCIIOpen(PETSC_COMM_WORLD,'report_performance.xml',viewer,ierr))
      PetscCallA(PetscViewerPushFormat(viewer,PETSC_VIEWER_ASCII_XML,ierr))
      PetscCallA(PetscLogView(viewer,ierr))
      PetscCallA(PetscViewerDestroy(viewer,ierr))

!  Free work space.  All PETSc objects should be destroyed when they
!  are no longer needed.

      PetscCallA(KSPDestroy(ksp,ierr))
      PetscCallA(VecDestroy(u,ierr))
      PetscCallA(VecDestroy(x,ierr))
      PetscCallA(VecDestroy(b,ierr))
      PetscCallA(MatDestroy(A,ierr))
      if (rank .eq. 0) then
        PetscCallA(KSPDestroy(kksp,ierr))
        PetscCallA(VecDestroy(uu,ierr))
        PetscCallA(VecDestroy(xx,ierr))
        PetscCallA(VecDestroy(bb,ierr))
        PetscCallA(MatDestroy(AA,ierr))
      end if

!  Always call PetscFinalize() before exiting a program.  This routine
!    - finalizes the PETSc libraries as well as MPI
!    - provides summary and diagnostic information if certain runtime
!      options are chosen (e.g., -log_view).  See PetscFinalize()
!      manpage for more information.

      PetscCallA(PetscFinalize(ierr))
      end

! --------------------------------------------------------------
!
!  MyKSPMonitor - This is a user-defined routine for monitoring
!  the KSP iterative solvers.
!
!  Input Parameters:
!    ksp   - iterative context
!    n     - iteration number
!    rnorm - 2-norm (preconditioned) residual value (may be estimated)
!    dummy - optional user-defined monitor context (unused here)
!
      subroutine MyKSPMonitor(ksp,n,rnorm,dummy,ierr)
      use petscksp
      implicit none

      KSP            ksp
      Vec            x
      PetscErrorCode ierr
      PetscInt       n,dummy
      PetscMPIInt    rank
      PetscReal      rnorm

!  Build the solution vector
      PetscCallA(KSPBuildSolution(ksp,PETSC_NULL_VEC,x,ierr))

!  Write the solution vector and residual norm to stdout
!  Since the Fortran IO may be flushed differently than C
!  cannot reliably print both together in CI
      PetscCallMPIA(MPI_Comm_rank(PETSC_COMM_WORLD,rank,ierr))
      if (rank .eq. 0) write(6,100) n
!      PetscCallA(VecView(x,PETSC_VIEWER_STDOUT_WORLD,ierr))
      if (rank .eq. 0) write(6,200) n,rnorm

 100  format('iteration ',i5,' solution vector:')
 200  format('iteration ',i5,' residual norm ',e11.4)
      ierr = 0
      end

! --------------------------------------------------------------
!
!  MyKSPConverged - This is a user-defined routine for testing
!  convergence of the KSP iterative solvers.
!
!  Input Parameters:
!    ksp   - iterative context
!    n     - iteration number
!    rnorm - 2-norm (preconditioned) residual value (may be estimated)
!    dummy - optional user-defined monitor context (unused here)
!
      subroutine MyKSPConverged(ksp,n,rnorm,flag,dummy,ierr)
      use petscksp
      implicit none

      KSP                ksp
      PetscErrorCode     ierr
      PetscInt           n,dummy
      KSPConvergedReason flag
      PetscReal          rnorm

      if (rnorm .le. .05) then
        flag = KSP_CONVERGED_RTOL_NORMAL
      else
        flag = KSP_CONVERGED_ITERATING
      endif
      ierr = 0

      end

!/*TEST
!
!   test:
!      nsize: 2
!      args: -pc_type jacobi -ksp_monitor_short -ksp_gmres_cgs_refinement_type refine_always
!
!   test:
!      suffix: 2
!      nsize: 2
!      args: -pc_type jacobi -my_ksp_monitor -ksp_gmres_cgs_refinement_type refine_always
!
!TEST*/