Re: [petsc-users] problem with nested logging, standalone example

2025-07-23 Thread Zongze Yang
Thank you for the quick fix — it works well on my end. Best wishes, Zongze From: Junchao Zhang Date: Thursday, July 24, 2025 at 02:55 To: Barry Smith Cc: Zongze Yang, Klaij, Christiaan, PETSc users list Subject: Re: [petsc-users] problem with nested logging, standalone example I think I

Re: [petsc-users] problem with nested logging, standalone example

2025-07-22 Thread Zongze Yang
Hi, I encountered a similar issue with Firedrake when using the -log_view option with XML format on macOS. Below is the error message. The Firedrake code and the shell script used to run it are attached. ``` [0]PETSC ERROR: - Error Message --
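
As background for the error below: nested logging in XML form is requested by giving `-log_view` a viewer of the form `:file.xml:ascii_xml`. A minimal petsc4py sketch, assuming an illustrative output name `perf.xml` (not taken from the attached script):

```python
import sys
import petsc4py
# Request the nested XML log at initialization; it is written to perf.xml
# when PETSc finalizes.
petsc4py.init(sys.argv + ["-log_view", ":perf.xml:ascii_xml"])
from petsc4py import PETSc

x = PETSc.Vec().createSeq(100)  # trivial placeholder workload to be profiled
x.set(1.0)
x.norm()
```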

Re: [petsc-users] Suggestion: Support optional extras for Firedrake (or other python package) when installed via PETSc

2025-07-05 Thread Zongze Yang
Thank you. I recommend including these packages: [check, netgen, slepc]. Since VTK does not have an ARM Linux package, it may not be suitable to include. Best wishes, Zongze From: Barry Smith Date: Saturday, July 5, 2025 at 09:09 To: Zongze Yang Cc: PETSc users list Subject: Re: [petsc

[petsc-users] Suggestion: Support optional extras for Firedrake (or other python package) when installed via PETSc

2025-07-03 Thread Zongze Yang
Hi all, As PETSc can install Firedrake along with it, would it be possible to support optional extras for Firedrake, such as [vtk] and [netgen], during the installation? Is this something that could be considered? Best regards, Zongze

[petsc-users] Issue with tau_exec and PETSc: "perfstubs could not be initialized" on macOS M-series

2025-06-04 Thread Zongze Yang
Dear PETSc team, I’m encountering an issue when running a PETSc-based application with TAU instrumentation on macOS with Apple Silicon (M-series chip). The command I used is: ``` tau_exec ./ex56 -log_perfstubs ``` However, it results in the following error: ``` ❯ tau_exec ./ex56 -log_perf

Re: [petsc-users] ParMmg crashes when run in parallel: Assertion failed: (isfinite(dd)), function PMMG_hashNorver_normals, file analys_pmmg.c, line 1072

2024-10-07 Thread Zongze Yang
w we do it in FreeFEM without DMPlex) and the error > persists, please let me know. > > Thanks, > Pierre > >> On 7 Oct 2024, at 6:12 PM, Zongze Yang wrote: >> >> Hi Pierre, >> >> Thank you for the advice. I will look into implementing it as

Re: [petsc-users] ParMmg crashes when run in parallel: Assertion failed: (isfinite(dd)), function PMMG_hashNorver_normals, file analys_pmmg.c, line 1072

2024-10-07 Thread Zongze Yang
but plain Mmg instead which is > much more robust. > I don’t know how easy it is to do with DMPlex though (gather a DM and a > metric on a single process), especially from the command line. > > Thanks, > Pierre > >> On 7 Oct 2024, at 4:46 PM, Zongze Yang > <mailt

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-04 Thread Zongze Yang
sion > Apple clang version 15.0.0 (clang-1500.3.9.4) > Target: arm64-apple-darwin23.4.0 > Thread model: posix > InstalledDir: /Library/Developer/CommandLineTools/usr/bin > petsc@mpro petsc.x % > > > On Tue, 2 Apr 2024, Zongze Yang wrote: > >> Thank you for th

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-01 Thread Zongze Yang
ish Balay wrote: > > On Mon, 1 Apr 2024, Zongze Yang wrote: > >> >> I noticed this in the config.log of OpenMPI: >> ``` >> configure:30230: checking to see if mpifort compiler needs additional linker >> flags >> configure:30247: gfortran -o conftest -fPI

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-04-01 Thread Zongze Yang
> On 1 Apr 2024, at 23:38, Satish Balay wrote: > > On Sun, 31 Mar 2024, Zongze Yang wrote: >>> --- >>> petsc@npro petsc % ./configure --download-bison --download-chaco --download-ctetgen

Re: [petsc-users] ex19: Segmentation Violation when run with MUMPS on MacOS (arm64)

2024-03-31 Thread Zongze Yang
> On 31 Mar 2024, at 02:59, Satish Balay via petsc-users wrote: > > I'll just note - I can reproduce with: > > petsc@npro petsc.x % ./configure --download-mpich --download-mumps --download-scalapack

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-17 Thread Zongze Yang
The OpenBLAS issue was resolved by this PR https://urldefense.us/v3/__https://github.com/OpenMathLib/OpenBLAS/pull/4565__;!!G_uCfscf7eWS!b09n5clcTFuLceLY_9KfqtSsgmmCIBLFbqciRVCKvnvFw9zTaNF8ssK0MiQlBOXUJe7H88nl-7ExdfhB-cMXLQ2d$ Best wishes, Zongze > On 18 Mar 2024, at 00:50, Zongze Y

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-17 Thread Zongze Yang
>> , so that’s another good argument in favor of -framework Accelerate. >> >> Thanks, >> Pierre >> >> PS: anyone benchmarked those >> https://urldefense.us/v3/__https://developer.apple.com/documentation/accelerate/sparse

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-17 Thread Zongze Yang
m. > > >> On Mar 17, 2024, at 9:58 AM, Zongze Yang wrote: >> >> Adding the flag `--download-openblas-make-options=TARGET=GENERIC` did not >> resolve the is

Re: [petsc-users] Install PETSc with option `--with-shared-libraries=1` failed on MacOS

2024-03-17 Thread Zongze Yang
Adding the flag `--download-openblas-make-options=TARGET=GENERIC` did not resolve the issue. The same error persisted. Best wishes, Zongze > On 17 Mar 2024, at 20:58, Pierre Jolivet wrote: > > > >> On 17 Mar 2024, at 1:04 PM, Zongze Yang wrote: >>

Re: [petsc-users] Help Needed Debugging Installation Issue for PETSc with SLEPc

2024-03-15 Thread Zongze Yang
> , so you need to use an up-to-date release branch of SLEPc. > > Thanks, > Pierre > >> On 15 Mar 2024, at 3:44 PM, Zongze Yang wrote:

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
hes, Zongze Best regards, > > Yann > > On 5/23/2023 at 11:59 AM, Matthew Knepley wrote: > > On Mon, May 22, 2023 at 10:42 PM Zongze Yang > <mailto:yangzon...@gmail.com>> wrote: > > > > On Tue, 23 May 2023 at 05:31, Stefano Zampini > > mailto

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
On Tue, 23 May 2023 at 19:51, Zongze Yang wrote: > Thank you for your suggestion. I solved the problem with SuperLU_DIST, and > it works well. > The problem was solved on four nodes, each equipped with 500 GB of memory. Best wishes, Zongze Best wishes, > Zongze > > > On Tue,

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-23 Thread Zongze Yang
Thank you for your suggestion. I solved the problem with SuperLU_DIST, and it works well. Best wishes, Zongze On Tue, 23 May 2023 at 18:00, Matthew Knepley wrote: > On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote: > >> I have an additional question to ask: Is it poss
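
For readers who want to reproduce the switch, it amounts to changing the factorization package on an LU preconditioner. A hedged sketch, assuming a PETSc build with SuperLU_DIST; the small diagonal system is only a stand-in for the real distributed matrix:

```python
from petsc4py import PETSc

# Toy system standing in for the real large matrix.
n = 10
A = PETSc.Mat().createAIJ([n, n], nnz=1)
for i in range(n):
    A[i, i] = 2.0 + i
A.assemble()
b = A.createVecLeft(); b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType("preonly")
pc = ksp.getPC()
pc.setType("lu")
pc.setFactorSolverType("superlu_dist")  # instead of "mumps"
ksp.solve(b, x)
```

The same switch is available from the command line as `-pc_factor_mat_solver_type superlu_dist`.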

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
I have an additional question to ask: Is it possible for the SuperLU_DIST library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS? Best wishes, Zongze On Tue, 23 May 2023 at 10:41, Zongze Yang wrote: > On Tue, 23 May 2023 at 05:31, Stefano Zampini > wrote: > >&g

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
the problem with 90 processes distributed across three nodes, each equipped with 500 GB of memory. Is this amount of memory sufficient for solving a matrix with approximately 3 million degrees of freedom? Thanks! Zongze On Mon, 22 May 2023 at 20:03, Zongze Yang > wrote

Re: [petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
Thanks! Zongze Matthew Knepley wrote on Tue, 23 May 2023 at 00:09: > On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote: > >> Hi, >> >> I hope this letter finds you well. I am writing to seek guidance >> regarding an error I encountered while solving a matrix using MUMPS on

[petsc-users] MPI_Iprobe Error with MUMPS Solver on Multi-Nodes

2023-05-22 Thread Zongze Yang
Hi, I hope this letter finds you well. I am writing to seek guidance regarding an error I encountered while solving a matrix using MUMPS on multiple nodes: ```bash Abort(1681039) on node 60 (rank 60 in comm 240): Fatal error in PMPI_Iprobe: Other MPI error, error stack: PMPI_Iprobe(124)..

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-15 Thread Zongze Yang
Got it. Thank you for your explanation! Best wishes, Zongze On Mon, 15 May 2023 at 23:28, Matthew Knepley wrote: > On Mon, May 15, 2023 at 9:55 AM Zongze Yang wrote: > >> On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote: >> >>> On Sun, May 14, 2023 at 7:2

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-15 Thread Zongze Yang
On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote: > On Sun, May 14, 2023 at 7:23 PM Zongze Yang wrote: > >> Could you try to project the coordinates into the continuity space by >> enabling the option >> `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-14 Thread Zongze Yang
Could you try to project the coordinates into the continuous space by enabling the option `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`? Best wishes, Zongze On Mon, 15 May 2023 at 04:24, Matthew Knepley wrote: > On Sun, May 14, 2023 at 12:27 PM Zongze Yang wr
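
A hedged sketch of trying the suggestion programmatically; the option must be in the options database before the Gmsh file is read, and `mesh.msh` is an assumed file name:

```python
from petsc4py import PETSc

opts = PETSc.Options()
# The option suggested above: project the Gmsh coordinates into a
# continuous Lagrange space while the mesh is loaded.
opts["dm_plex_gmsh_project_petscdualspace_lagrange_continuity"] = True

dm = PETSc.DMPlex().createFromFile("mesh.msh")  # assumed file name
dm.view()
```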

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-14 Thread Zongze Yang
On Sun, 14 May 2023 at 23:54, Matthew Knepley wrote: > On Sun, May 14, 2023 at 9:21 AM Zongze Yang wrote: > >> Hi, Matt, >> >> The issue has been resolved while testing on the latest version of PETSc. >> It seems that the problem has been fixed in the fol

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-14 Thread Zongze Yang
ith > high order tetrahedral elements (we've been coping with it for months and > someone asked last week) and plan to look at it as soon as possible now > that my semester finished. > > Zongze Yang writes: > > > Hi, Matt, > > > > The issue has been res

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-14 Thread Zongze Yang
tthew Knepley wrote: > On Sat, May 13, 2023 at 6:08 AM Zongze Yang wrote: > >> Hi, Matt, >> >> There seem to be ongoing issues with projecting high-order coordinates >> from a gmsh file to other spaces. I would like to inquire whether there are >> any plans to res

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2023-05-13 Thread Zongze Yang
Yang wrote: > Thank you for your reply. May I ask for some references on the order of > the dofs on PETSc's FE Space (especially high order elements)? > > Thanks, > > Zongze > > Matthew Knepley wrote on Sat, 18 Jun 2022 at 20:02: > >> On Sat, Jun 18, 2022 at 2:16 AM

Re: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant

2023-04-18 Thread Zongze Yang
ffsetof) && defined(__cplusplus) > > Satish > > On Tue, 18 Apr 2023, Jacob Faibussowitsch wrote: > > > This is a bug in GCC 9. Can you try the following: > > > > $ make clean > > $ make CFLAGS+='-std=gnu11' > > > > Best regards, >

Re: [petsc-users] Build error: vecimpl.h:124:98: error: expected declaration specifiers or '...' before string constant

2023-04-18 Thread Zongze Yang
t defined as > macro in assert.h > #include > > > Satish > > On Tue, 18 Apr 2023, Zongze Yang wrote: > > > Hi, I am building petsc using gcc@9.5.0, and found the following error: > > > > ``` > > In file included from /usr/include/alloca.h:25,

Re: [petsc-users] `snes+ksponly` did not update the solution when ksp failed.

2023-03-20 Thread Zongze Yang
Thank you for your clarification. Best wishes, Zongze On Mon, 20 Mar 2023 at 20:00, Matthew Knepley wrote: > On Mon, Mar 20, 2023 at 2:41 AM Zongze Yang wrote: > >> Hi, >> >> Hope this email finds you well. I am using firedrake to solve linear >> problem

[petsc-users] `snes+ksponly` did not update the solution when ksp failed.

2023-03-19 Thread Zongze Yang
Hi, Hope this email finds you well. I am using Firedrake to solve linear problems via SNES with KSPONLY. I found that the solution did not update when the `ksp` failed with DIVERGED_ITS. The macro `SNESCheckKSPSolve` called in `SNESSolve_KSPONLY` makes it return before the solution is updat
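
To make the report concrete, here is a self-contained sketch of the setup being described, with the inner KSP capped at so few iterations that it stops with DIVERGED_ITS; the toy diagonal system and the iteration cap are illustrative, not from the original problem:

```python
from petsc4py import PETSc

n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=1)
for i in range(n):
    A[i, i] = 2.0 + i  # spread-out spectrum so two iterations cannot converge
A.assemble()
b = A.createVecLeft(); b.set(1.0)

def residual(snes, x, f):
    # Linear residual F(x) = A x - b, so SNES+KSPONLY is a single KSP solve.
    A.mult(x, f)
    f.axpy(-1.0, b)

snes = PETSc.SNES().create()
snes.setType("ksponly")
snes.setFunction(residual, A.createVecLeft())
snes.setJacobian(lambda snes, x, J, P: None, A, A)  # constant Jacobian A
ksp = snes.getKSP()
ksp.getPC().setType("none")
ksp.setTolerances(max_it=2)  # force DIVERGED_ITS in the inner solve
x = A.createVecRight()
snes.solve(None, x)
print("SNES reason:", snes.getConvergedReason(),
      "KSP reason:", ksp.getConvergedReason())
```

With this setup the SNES reports a diverged linear solve, and, per the behavior described above, `x` still holds its pre-solve contents.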

Re: [petsc-users] petsc4py did not raise for a second time with option `ksp_error_if_not_converged`

2023-03-06 Thread Zongze Yang
Thank you for your suggestion. Best wishes, Zongze On Mon, 6 Mar 2023 at 02:40, Matthew Knepley wrote: > On Sun, Mar 5, 2023 at 3:14 AM Zongze Yang wrote: > >> >> >> Hello, >> >> I am trying to catch the "not converged" error in a loop with

[petsc-users] petsc4py did not raise for a second time with option `ksp_error_if_not_converged`

2023-03-05 Thread Zongze Yang
Hello, I am trying to catch the "not converged" error in a loop with the `ksp_error_if_not_converged` option on. However, it seems that PETSc only raises the exception once, even though the solver does not converge after that. Is this expected behavior? Can I make it raise an exception every time?
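
A hedged reconstruction of such a loop; `KSP.setErrorIfNotConverged(True)` is the programmatic counterpart of the command-line flag, and the toy system plus the low iteration cap merely force non-convergence:

```python
from petsc4py import PETSc

n = 50
A = PETSc.Mat().createAIJ([n, n], nnz=1)
for i in range(n):
    A[i, i] = 2.0 + i
A.assemble()
b = A.createVecLeft(); b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.getPC().setType("none")
ksp.setTolerances(max_it=2)       # too few iterations to converge
ksp.setErrorIfNotConverged(True)  # same effect as -ksp_error_if_not_converged

for attempt in range(3):
    try:
        ksp.solve(b, x)
    except PETSc.Error as err:
        print(f"attempt {attempt}: raised with ierr={err.ierr}")
```

If the report is right, only the first attempt prints; whether the later solves should raise again is exactly the question posed here.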

Re: [petsc-users] Random Error of mumps: out of memory: INFOG(1)=-9

2023-03-04 Thread Zongze Yang
Thanks, I will give it a try. Best wishes, Zongze On Sat, 4 Mar 2023 at 23:09, Pierre Jolivet wrote: > > > On 4 Mar 2023, at 3:26 PM, Zongze Yang wrote: > >  > > > On Sat, 4 Mar 2023 at 22:03, Pierre Jolivet wrote: > >> >> >> On 4 Mar 2023, a

Re: [petsc-users] Random Error of mumps: out of memory: INFOG(1)=-9

2023-03-04 Thread Zongze Yang
On Sat, 4 Mar 2023 at 22:03, Pierre Jolivet wrote: > > > On 4 Mar 2023, at 2:51 PM, Zongze Yang wrote: > > > > On Sat, 4 Mar 2023 at 21:37, Pierre Jolivet wrote: > >> >> >> > On 4 Mar 2023, at 2:30 PM, Zongze Yang wrote: >> > >>

Re: [petsc-users] Random Error of mumps: out of memory: INFOG(1)=-9

2023-03-04 Thread Zongze Yang
On Sat, 4 Mar 2023 at 21:37, Pierre Jolivet wrote: > > > > On 4 Mar 2023, at 2:30 PM, Zongze Yang wrote: > > > > Hi, > > > > I am writing to seek your advice regarding a problem I encountered while > using multigrid to solve a certain issue. > > I

[petsc-users] Random Error of mumps: out of memory: INFOG(1)=-9

2023-03-04 Thread Zongze Yang
Hi, I am writing to seek your advice regarding a problem I encountered while using multigrid to solve a certain issue. I am currently using multigrid with the coarse problem solved by PCLU. However, the PC failed randomly with the error below (the value of INFO(2) may differ): ```shell [ 0] Error
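
For reference, a common first mitigation for INFOG(1)=-9 is to enlarge MUMPS's working-space margin, ICNTL(14), which PETSc exposes as `-mat_mumps_icntl_14`. A hedged sketch on a toy system; the value 80 is an arbitrary illustrative increase, and in a multigrid setup the option would also need the coarse-solver prefix:

```python
from petsc4py import PETSc

opts = PETSc.Options()
# Percentage increase of the estimated MUMPS working space; read when the
# factorization is set up. Equivalent to -mat_mumps_icntl_14 80.
opts["mat_mumps_icntl_14"] = 80

n = 10
A = PETSc.Mat().createAIJ([n, n], nnz=1)
for i in range(n):
    A[i, i] = 2.0 + i
A.assemble()
b = A.createVecLeft(); b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType("preonly")
pc = ksp.getPC()
pc.setType("lu")
pc.setFactorSolverType("mumps")
ksp.solve(b, x)
```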

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Zongze Yang
Yes, it seems that Firedrake only works with DMPlex. Thanks. Best wishes, Zongze On Mon, 27 Feb 2023 at 22:53, Matthew Knepley wrote: > On Mon, Feb 27, 2023 at 9:45 AM Zongze Yang wrote: > >> Hi, Matt >> >> Thanks for your clarification. Can I change the type

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Zongze Yang
Knepley wrote: > On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang wrote: > >> Another question on mesh coarsening is about `DMCoarsen` which will fail >> when running in parallel. >> >> I generate a mesh in Firedrake, and then create function space and >> functio

Re: [petsc-users] Inquiry regarding DMAdaptLabel function

2023-02-27 Thread Zongze Yang
Hi, Matt Thanks for your clarification. Can I change the type of DMPlex to DMForest? Best wishes, Zongze On Mon, 27 Feb 2023 at 20:18, Matthew Knepley wrote: > On Sat, Feb 18, 2023 at 2:25 AM Zongze Yang wrote: > >> Dear PETSc Group, >> >> I am writing to in
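
On the question itself, the conversion can be tried with `DMConvert`. A hedged sketch, assuming a PETSc build with p4est; `p4est` is the 2D forest type (`p8est` in 3D), and DMForest wants tensor cells, hence `simplex=False`:

```python
from petsc4py import PETSc

# Quadrilateral box mesh, since p4est forests are built on tensor cells.
plex = PETSc.DMPlex().createBoxMesh([4, 4], simplex=False)
forest = plex.convert("p4est")  # DMConvert: DMPlex -> DMForest
print(forest.getType())
```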

Re: [petsc-users] Save images of ksp_monitor and ksp_view_eigenvalues with user-defined names

2023-02-09 Thread Zongze Yang
a bug in the KSP eigenmonitor viewing that I had > to fix to get this to work so you'll need to checkout the > *barry/2023-02-08/fix-ksp-monitor-eigenvalues-draw* branch of PETSc to > use the option I suggest. > > Barry > > > > > On Feb 8, 2023, at 5:09 AM, Zongze

[petsc-users] Save images of ksp_monitor and ksp_view_eigenvalues with user-defined names

2023-02-08 Thread Zongze Yang
Hi, PETSc group, I was trying to save figures of the residual and eigenvalues with user-defined names rather than the default names. The default name is used when I use `-draw_save .png`. All images are saved. ``` python test.py -N 16 -test1_ksp_type gmres -test1_pc_type jacobi -test1_ksp_view_eigenvalues d

Re: [petsc-users] Build error with slepc: Unable to locate PETSc BAMG dynamic library

2022-11-17 Thread Zongze Yang
uild has the order (hence fails): > > - petsc > > - slepc > > - slepc4py > > - bamg > > > > I guess the alternative is: build slepc4py separately after > petsc/slepc/bamg are built. > > > > Satish > > > > On Thu, 17 Nov 2022, Z

Re: [petsc-users] How to show the X window for cmd `make -f ./gmakefile test ...`?

2022-10-06 Thread Zongze Yang
Thanks! Matthew Knepley wrote on Thu, 6 Oct 2022 at 16:19: > On Thu, Oct 6, 2022 at 9:16 AM Zongze Yang wrote: > >> Hi, everyone, >> >> I am trying to run some test cases with x window, but the x window never >> showed up with command `make -f ./gmakefile test ...`. It seems

[petsc-users] How to show the X window for cmd `make -f ./gmakefile test ...`?

2022-10-06 Thread Zongze Yang
Hi, everyone, I am trying to run some test cases with an X window, but the X window never shows up with the command `make -f ./gmakefile test ...`. It seems a default option `-nox` is set. How can I disable this option for `make test`? An example is shown below: ``` z2yang@ws6:~/repos/petsc$ PETSC_ARCH=ar

Re: [petsc-users] Are the results of `DMAdaptLabel` as expected in `src/dm/impls/plex/tests/ex20.c`

2022-10-04 Thread Zongze Yang
Matthew Knepley wrote on Wed, 5 Oct 2022 at 00:33: > On Tue, Oct 4, 2022 at 3:19 PM Zongze Yang wrote: > >> Hi everyone, >> >> I am learning how to use `DMAdaptLabel` for `DMPlex`, and found the >> example `src/dm/impls/plex/tests/ex20.c`, which labels one cell to refine.

[petsc-users] Are the results of `DMAdaptLabel` as expected in `src/dm/impls/plex/tests/ex20.c`

2022-10-04 Thread Zongze Yang
is causing the crash. Abort(59) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0 ``` Thanks, Zongze Yang [attachments: pre_adapt.pdf, post_adapt.pdf]
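
For readers following the thread, the ex20.c pattern reduces to marking cells in a label and calling DMAdaptLabel. A hedged petsc4py sketch, assuming the binding `DM.adaptLabel` and that the flag value 1 corresponds to DM_ADAPT_REFINE in the PETSc headers:

```python
from petsc4py import PETSc

dm = PETSc.DMPlex().createBoxMesh([2, 2], simplex=True)
dm.createLabel("adapt")
cStart, cEnd = dm.getHeightStratum(0)  # height 0 = cells
dm.setLabelValue("adapt", cStart, 1)   # assumed: 1 == DM_ADAPT_REFINE
adapted = dm.adaptLabel("adapt")
aStart, aEnd = adapted.getHeightStratum(0)
print("cells before/after:", cEnd - cStart, aEnd - aStart)
```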

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2022-06-18 Thread Zongze Yang
Thank you for your reply. May I ask for some references on the order of the dofs on PETSc's FE Space (especially high order elements)? Thanks, Zongze Matthew Knepley wrote on Sat, 18 Jun 2022 at 20:02: > On Sat, Jun 18, 2022 at 2:16 AM Zongze Yang wrote: >> In order to check if I mad

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2022-06-16 Thread Zongze Yang
relation of the closure and the order of the dofs for the cell? Thanks! Zongze Matthew Knepley wrote on Fri, 17 Jun 2022 at 01:11: > On Thu, Jun 16, 2022 at 12:06 PM Zongze Yang wrote: >> >> On 16 Jun 2022, at 23:22, Matthew Knepley wrote: >> >> On Thu, J

Re: [petsc-users] How to find the map between the high order coordinates of DMPlex and vertex numbering?

2022-06-16 Thread Zongze Yang
> On 16 Jun 2022, at 23:22, Matthew Knepley wrote: > >> On Thu, Jun 16, 2022 at 11:11 AM Zongze Yang wrote: > >> Hi, if I load a `gmsh` file with second-order elements, the coordinates will >> be stored in a DG-P2 space. After obtaining the coordinates of a cell, how
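
A hedged sketch of inspecting those per-cell coordinate dofs through the coordinate section, which is where such DG-P2 coordinates land after loading; `mesh.msh` is an assumed file name, and the returned ordering is PETSc's closure ordering for the cell, which is the mapping the thread is after:

```python
from petsc4py import PETSc

dm = PETSc.DMPlex().createFromFile("mesh.msh")  # assumed second-order Gmsh file
sec = dm.getCoordinateSection()
coords = dm.getCoordinatesLocal()
cStart, cEnd = dm.getHeightStratum(0)
# All coordinate dofs attached to the closure of the first cell.
cellCoords = dm.vecGetClosure(sec, coords, cStart)
print(cellCoords)
```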