Thank you for the quick fix — it works well on my end.
Best wishes,
Zongze
From: Junchao Zhang
Date: Thursday, July 24, 2025 at 02:55
To: Barry Smith
Cc: Zongze Yang, Klaij, Christiaan, PETSc users list
Subject: Re: [petsc-users] problem with nested logging, standalone example
I think I
Hi,
I encountered a similar issue with Firedrake when using the -log_view option
with XML format on macOS. Below is the error message. The Firedrake code and
the shell script used to run it are attached.
```
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
```
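For reference, the XML (nested) log output is normally requested with a viewer-style option such as `-log_view :profile.xml:ascii_xml`. Below is a minimal petsc4py sketch, independent of Firedrake, that can be used to exercise that code path; the script name, the small tridiagonal system, and the output filename are placeholders of mine, not taken from the attached files.
```python
# log_view_xml_demo.py -- minimal petsc4py sketch (not the attached Firedrake
# script) to exercise nested logging, e.g.:
#   mpiexec -n 2 python log_view_xml_demo.py -log_view :profile.xml:ascii_xml
import sys
import petsc4py
petsc4py.init(sys.argv)          # pass command-line options (e.g. -log_view) to PETSc
from petsc4py import PETSc

n = 100
A = PETSc.Mat().createAIJ([n, n])
A.setPreallocationNNZ((3, 1))
rstart, rend = A.getOwnershipRange()
for i in range(rstart, rend):    # simple 1D Laplacian as a stand-in problem
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()             # honours -ksp_* and the logging options from argv
ksp.solve(b, x)
```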
Thank you. I recommend including these packages: [check, netgen, slepc]. Since
VTK does not have an ARM Linux package, it may not be suitable to include.
Best wishes,
Zongze
From: Barry Smith
Date: Saturday, July 5, 2025 at 09:09
To: Zongze Yang
Cc: PETSc users list
Subject: Re: [petsc
Hi all,
Since PETSc can install Firedrake as part of its own installation, would it be
possible to support optional extras for Firedrake—such as [vtk] and [netgen]—during
the installation? Is this something that could be considered?
Best regards,
Zongze
Dear PETSc team,
I’m encountering an issue when running a PETSc-based application with TAU
instrumentation on macOS with Apple Silicon (M-series chip). The command I used
is:
```
tau_exec ./ex56 -log_perfstubs
```
However, it results in the following error:
```
❯ tau_exec ./ex56 -log_perfstubs
```
how we do it in FreeFEM without DMPlex) and the error
> persists, please let me know.
>
> Thanks,
> Pierre
>
>> On 7 Oct 2024, at 6:12 PM, Zongze Yang wrote:
>>
>>
>> Hi Pierre,
>>
>> Thank you for the advice. I will look into implementing it as
but plain Mmg instead which is
> much more robust.
> I don’t know how easy it is to do with DMPlex though (gather a DM and a
> metric on a single process), especially from the command line.
>
> Thanks,
> Pierre
>
>> On 7 Oct 2024, at 4:46 PM, Zongze Yang wrote:
sion
> Apple clang version 15.0.0 (clang-1500.3.9.4)
> Target: arm64-apple-darwin23.4.0
> Thread model: posix
> InstalledDir: /Library/Developer/CommandLineTools/usr/bin
> petsc@mpro petsc.x %
>
>
> On Tue, 2 Apr 2024, Zongze Yang wrote:
>
>> Thank you for th
Satish Balay wrote:
>
> On Mon, 1 Apr 2024, Zongze Yang wrote:
>
>>
>> I noticed this in the config.log of OpenMPI:
>> ```
>> configure:30230: checking to see if mpifort compiler needs additional linker
>> flags
>> configure:30247: gfortran -o conftest -fPI
> On 1 Apr 2024, at 23:38, Satish Balay wrote:
>
> > On Sun, 31 Mar 2024, Zongze Yang wrote:
> >> ---
> >> petsc@npro petsc % ./configure --download-bison --download-chaco --download-ctetgen
> On 31 Mar 2024, at 02:59, Satish Balay via petsc-users wrote:
>
> I'll just note - I can reproduce with:
>
> petsc@npro petsc.x % ./configure --download-mpich --download-mumps --download-scalapack
The OpenBLAS issue was resolved by this PR:
https://github.com/OpenMathLib/OpenBLAS/pull/4565
Best wishes,
Zongze
> On 18 Mar 2024, at 00:50, Zongze Y
>> , so that’s another good argument in favor of -framework Accelerate.
>>
>> Thanks,
>> Pierre
>>
>> PS: anyone benchmarked those
>> https://developer.apple.com/documentation/accelerate/sparse
m.
>
>
>> On Mar 17, 2024, at 9:58 AM, Zongze Yang wrote:
>>
>> Adding the flag `--download-openblas-make-options=TARGET=GENERIC` did not
>> resolve the issue.
Adding the flag `--download-openblas-make-options=TARGET=GENERIC` did not
resolve the issue. The same error persisted.
Best wishes,
Zongze
> On 17 Mar 2024, at 20:58, Pierre Jolivet wrote:
>
>
>
>> On 17 Mar 2024, at 1:04 PM, Zongze Yang wrote:
>>
>>
> , so you need to use an up-to-date release branch of SLEPc.
>
> Thanks,
> Pierre
>
>> On 15 Mar 2024, at 3:44 PM, Zongze Yang wrote:
>>
Best wishes,
Zongze
Best regards,
>
> Yann
>
> On 5/23/2023 at 11:59 AM, Matthew Knepley wrote:
> > On Mon, May 22, 2023 at 10:42 PM Zongze Yang wrote:
> >
> > On Tue, 23 May 2023 at 05:31, Stefano Zampini wrote:
On Tue, 23 May 2023 at 19:51, Zongze Yang wrote:
> Thank you for your suggestion. I solved the problem with SuperLU_DIST, and
> it works well.
>
This was solved with four nodes, each equipped with 500 GB of memory.
Best wishes,
Zongze
Best wishes,
> Zongze
>
>
> On Tue,
Thank you for your suggestion. I solved the problem with SuperLU_DIST, and
it works well.
Best wishes,
Zongze
On Tue, 23 May 2023 at 18:00, Matthew Knepley wrote:
> On Mon, May 22, 2023 at 10:46 PM Zongze Yang wrote:
>
>> I have an additional question to ask: Is it poss
I have an additional question to ask: Is it possible for the SuperLU_DIST
library to encounter the same MPI problem (PMPI_Iprobe failed) as MUMPS?
Best wishes,
Zongze
On Tue, 23 May 2023 at 10:41, Zongze Yang wrote:
> On Tue, 23 May 2023 at 05:31, Stefano Zampini
> wrote:
>
>>
the problem with 90 processes distributed
across three nodes, each equipped with 500 GB of memory. Is this amount of
memory sufficient for solving a matrix with approximately 3 million
degrees of freedom?
Thanks!
Zongze
On Mon, 22 May 2023 at 20:03, Zongze Yang wrote:
Thanks!
Zongze
On Tue, 23 May 2023 at 00:09, Matthew Knepley wrote:
> On Mon, May 22, 2023 at 11:07 AM Zongze Yang wrote:
>
>> Hi,
>>
>> I hope this letter finds you well. I am writing to seek guidance
>> regarding an error I encountered while solving a matrix using MUMPS on
>>
Hi,
I hope this letter finds you well. I am writing to seek guidance regarding
an error I encountered while solving a matrix using MUMPS on multiple nodes:
```bash
Abort(1681039) on node 60 (rank 60 in comm 240): Fatal error in
PMPI_Iprobe: Other MPI error, error stack:
PMPI_Iprobe(124)..
```
Got it. Thank you for your explanation!
Best wishes,
Zongze
On Mon, 15 May 2023 at 23:28, Matthew Knepley wrote:
> On Mon, May 15, 2023 at 9:55 AM Zongze Yang wrote:
>
>> On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote:
>>
>>> On Sun, May 14, 2023 at 7:2
On Mon, 15 May 2023 at 17:24, Matthew Knepley wrote:
> On Sun, May 14, 2023 at 7:23 PM Zongze Yang wrote:
>
>> Could you try to project the coordinates into the continuity space by
>> enabling the option
>> `-dm_plex_gmsh_project_petscdualspace_lagrange_continuity
Could you try to project the coordinates into a continuous space by
enabling the option
`-dm_plex_gmsh_project_petscdualspace_lagrange_continuity true`?
Best wishes,
Zongze
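For anyone who prefers to try this from petsc4py rather than the command line, here is a small sketch that sets the same option programmatically before reading a high-order Gmsh mesh. The filename `mesh_p2.msh` is a placeholder, and the companion `-dm_plex_gmsh_project` flag (which enables the coordinate projection itself) is my assumption about how the continuity option is meant to be combined.
```python
# Sketch only: set the Gmsh coordinate-projection options before loading a
# second-order mesh. "mesh_p2.msh" is a placeholder filename.
import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

opts = PETSc.Options()
opts["dm_plex_gmsh_project"] = "true"   # project coordinates into a PetscFE (assumed companion flag)
opts["dm_plex_gmsh_project_petscdualspace_lagrange_continuity"] = "true"

dm = PETSc.DMPlex().createFromFile("mesh_p2.msh")
dm.view()                               # inspect the resulting DM and coordinate field
```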
On Mon, 15 May 2023 at 04:24, Matthew Knepley wrote:
> On Sun, May 14, 2023 at 12:27 PM Zongze Yang wr
On Sun, 14 May 2023 at 23:54, Matthew Knepley wrote:
> On Sun, May 14, 2023 at 9:21 AM Zongze Yang wrote:
>
>> Hi, Matt,
>>
>> The issue has been resolved while testing on the latest version of PETSc.
>> It seems that the problem has been fixed in the fol
ith
> high order tetrahedral elements (we've been coping with it for months and
> someone asked last week) and plan to look at it as soon as possible now
> that my semester finished.
>
> Zongze Yang writes:
>
> > Hi, Matt,
> >
> > The issue has been res
Matthew Knepley wrote:
> On Sat, May 13, 2023 at 6:08 AM Zongze Yang wrote:
>
>> Hi, Matt,
>>
>> There seem to be ongoing issues with projecting high-order coordinates
>> from a gmsh file to other spaces. I would like to inquire whether there are
>> any plans to res
Zongze Yang wrote:
> Thank you for your reply. May I ask for some references on the order of
> the dofs on PETSc's FE Space (especially high order elements)?
>
> Thanks,
>
> Zongze
>
> On Sat, 18 Jun 2022 at 20:02, Matthew Knepley wrote:
>
>> On Sat, Jun 18, 2022 at 2:16 AM
ffsetof) && defined(__cplusplus)
>
> Satish
>
> On Tue, 18 Apr 2023, Jacob Faibussowitsch wrote:
>
> > This is a bug in GCC 9. Can you try the following:
> >
> > $ make clean
> > $ make CFLAGS+='-std=gnu11'
> >
> > Best regards,
> >
t defined as
> macro in assert.h
> #include
>
>
> Satish
>
> On Tue, 18 Apr 2023, Zongze Yang wrote:
>
> > Hi, I am building petsc using gcc@9.5.0, and found the following error:
> >
> > ```
> > In file included from /usr/include/alloca.h:25,
> >
Thank you for your clarification.
Best wishes,
Zongze
On Mon, 20 Mar 2023 at 20:00, Matthew Knepley wrote:
> On Mon, Mar 20, 2023 at 2:41 AM Zongze Yang wrote:
>
>> Hi,
>>
>> Hope this email finds you well. I am using firedrake to solve linear
>> problem
Hi,
Hope this email finds you well. I am using Firedrake to solve linear
problems, which use SNES with KSPONLY.
I found that the solution did not update when the `ksp` failed with
DIVERGED_ITS.
The macro `SNESCheckKSPSolve`, called in `SNESSolve_KSPONLY`, makes it return
before the solution is updated.
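A minimal petsc4py sketch of the situation described above, under the assumption of a linear problem wrapped in a SNES of type `ksponly`: the inner KSP is capped at one iteration with no preconditioner so it ends with DIVERGED_ITS, and the SNES/KSP converged reasons are inspected after the solve. The tiny 1D Laplacian setup is a placeholder of mine, not the Firedrake code.
```python
# Sketch: reproduce a DIVERGED_ITS linear solve inside SNESKSPONLY and inspect
# the converged reasons afterwards. The small system is a placeholder.
import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

n = 50
A = PETSc.Mat().createAIJ([n, n])
A.setPreallocationNNZ((3, 1))
rs, re = A.getOwnershipRange()
for i in range(rs, re):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()
b = A.createVecLeft()
b.set(1.0)

def residual(snes, x, F):
    # F(x) = A x - b, so SNESKSPONLY performs exactly one linear solve
    A.mult(x, F)
    F.axpy(-1.0, b)

def jacobian(snes, x, J, P):
    # A is constant; nothing to recompute
    pass

snes = PETSc.SNES().create()
snes.setType(PETSc.SNES.Type.KSPONLY)
snes.setFunction(residual, A.createVecLeft())
snes.setJacobian(jacobian, A)
snes.getKSP().setType(PETSc.KSP.Type.CG)
snes.getKSP().getPC().setType(PETSc.PC.Type.NONE)
snes.getKSP().setTolerances(max_it=1)    # force the KSP to give up early
snes.setFromOptions()

x = A.createVecRight()
snes.solve(None, x)
PETSc.Sys.Print("SNES reason:", snes.getConvergedReason(),
                "KSP reason:", snes.getKSP().getConvergedReason())
```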
Thank you for your suggestion.
Best wishes,
Zongze
On Mon, 6 Mar 2023 at 02:40, Matthew Knepley wrote:
> On Sun, Mar 5, 2023 at 3:14 AM Zongze Yang wrote:
>
>>
>>
>> Hello,
>>
>> I am trying to catch the "not converged" error in a loop with
Hello,
I am trying to catch the "not converged" error in a loop with the
`ksp_error_if_not_converged` option on. However, it seems that PETSc only
raises the exception once, even though the solver does not converge after
that. Is this expected behavior? Can I make it raise an exception every
time?
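As a workaround sketch (not an answer from this thread, whose reply is truncated above), one can skip `-ksp_error_if_not_converged` and instead test the converged reason after every solve in the loop, raising manually each time. The helper `solve_or_raise` and the identity test system below are placeholders of mine.
```python
# Sketch: raise on every non-converged solve by checking the reason manually,
# instead of relying on -ksp_error_if_not_converged.
import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

def solve_or_raise(ksp, b, x):
    ksp.solve(b, x)
    reason = ksp.getConvergedReason()
    if reason < 0:                       # any negative reason means divergence
        raise RuntimeError(f"KSP diverged with reason {reason}")
    return x

# Placeholder system (identity matrix) just to make the sketch runnable.
n = 10
A = PETSc.Mat().createAIJ([n, n])
A.setPreallocationNNZ(1)
rs, re = A.getOwnershipRange()
for i in range(rs, re):
    A.setValue(i, i, 1.0)
A.assemble()
b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()
for step in range(3):                    # e.g. a time-stepping or continuation loop
    solve_or_raise(ksp, b, x)
```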
Thanks, I will give it a try.
Best wishes,
Zongze
On Sat, 4 Mar 2023 at 23:09, Pierre Jolivet wrote:
>
>
> On 4 Mar 2023, at 3:26 PM, Zongze Yang wrote:
>
>
>
>
> On Sat, 4 Mar 2023 at 22:03, Pierre Jolivet wrote:
>
>>
>>
>> On 4 Mar 2023, a
On Sat, 4 Mar 2023 at 22:03, Pierre Jolivet wrote:
>
>
> On 4 Mar 2023, at 2:51 PM, Zongze Yang wrote:
>
>
>
> On Sat, 4 Mar 2023 at 21:37, Pierre Jolivet wrote:
>
>>
>>
>> > On 4 Mar 2023, at 2:30 PM, Zongze Yang wrote:
>> >
>>
On Sat, 4 Mar 2023 at 21:37, Pierre Jolivet wrote:
>
>
> > On 4 Mar 2023, at 2:30 PM, Zongze Yang wrote:
> >
> > Hi,
> >
> > I am writing to seek your advice regarding a problem I encountered while
> using multigrid to solve a certain issue.
> > I
Hi,
I am writing to seek your advice regarding a problem I encountered while
using multigrid to solve a certain issue.
I am currently using multigrid with the coarse problem solved by PCLU.
However, the PC failed randomly with the error below (the value of INFO(2)
may differ):
```shell
[ 0] Error
```
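As reported elsewhere in this thread, switching the direct solver to SuperLU_DIST resolved these random failures. Below is a hedged petsc4py/options sketch of one way to make that switch for the coarse solve; this is my reading of the fix, since the exact options used are not shown. The `mg_coarse_` prefix assumes a PCMG/PCGAMG hierarchy, PETSc must have been configured with SuperLU_DIST (e.g. `--download-superlu_dist`), and the small Laplacian is a placeholder problem.
```python
# Sketch: multigrid with a direct coarse solve, using SuperLU_DIST for the
# coarse factorization instead of the package that was failing.
import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

opts = PETSc.Options()
opts["pc_type"] = "gamg"                                   # assumed multigrid variant
opts["mg_coarse_pc_type"] = "lu"
opts["mg_coarse_pc_factor_mat_solver_type"] = "superlu_dist"

n = 100
A = PETSc.Mat().createAIJ([n, n])
A.setPreallocationNNZ((3, 1))
rs, re = A.getOwnershipRange()
for i in range(rs, re):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()
b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setFromOptions()    # picks up the pc_type / mg_coarse_* options set above
ksp.solve(b, x)
```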
Yes, it seems that Firedrake only works with DMPlex. Thanks.
Best wishes,
Zongze
On Mon, 27 Feb 2023 at 22:53, Matthew Knepley wrote:
> On Mon, Feb 27, 2023 at 9:45 AM Zongze Yang wrote:
>
>> Hi, Matt
>>
>> Thanks for your clarification. Can I change the type
Knepley wrote:
> On Sat, Feb 18, 2023 at 6:41 AM Zongze Yang wrote:
>
>> Another question on mesh coarsening is about `DMCoarsen` which will fail
>> when running in parallel.
>>
>> I generate a mesh in Firedrake, and then create function space and
>> functio
Hi, Matt
Thanks for your clarification. Can I change the type of DMPlex to DMForest?
Best wishes,
Zongze
On Mon, 27 Feb 2023 at 20:18, Matthew Knepley wrote:
> On Sat, Feb 18, 2023 at 2:25 AM Zongze Yang wrote:
>
>> Dear PETSc Group,
>>
>> I am writing to in
a bug in the KSP eigenmonitor viewing that I had
> to fix to get this to work, so you'll need to check out the
> `barry/2023-02-08/fix-ksp-monitor-eigenvalues-draw` branch of PETSc to
> use the option I suggest.
>
> Barry
>
>
>
>
> On Feb 8, 2023, at 5:09 AM, Zongze
Hi, PETSc group,
I was trying to save figures of the residual and the eigenvalues under custom
names rather than the default names.
The default names are used when I pass `-draw_save .png`, and all images are saved.
```
python test.py -N 16 -test1_ksp_type gmres -test1_pc_type jacobi
-test1_ksp_view_eigenvalues d
```
build has the order (hence fails):
> > - petsc
> > - slepc
> > - slepc4py
> > - bamg
> >
> > I guess the alternative is: build slepc4py separately after
> petsc/slepc/bamg are built.
> >
> > Satish
> >
> > On Thu, 17 Nov 2022, Z
Thanks!
On Thu, 6 Oct 2022 at 16:19, Matthew Knepley wrote:
> On Thu, Oct 6, 2022 at 9:16 AM Zongze Yang wrote:
>
>> Hi, everyone,
>>
>> I am trying to run some test cases with x window, but the x window never
>> showed up with command `make -f ./gmakefile test ...`. It seems
Hi, everyone,
I am trying to run some test cases with an X window, but the X window never
shows up when running `make -f ./gmakefile test ...`. It seems a default
option `-nox` is set. How can I disable this option for `make test`?
An example is shown below:
```
z2yang@ws6:~/repos/petsc$ PETSC_ARCH=ar
```
On Wed, 5 Oct 2022 at 00:33, Matthew Knepley wrote:
> On Tue, Oct 4, 2022 at 3:19 PM Zongze Yang wrote:
>
>> Hi everyone,
>>
>> I am learning how to use the `DMAdaptLabel` for `DMPlex`, and found the
>> example `src/dm/impls/plex/tests/ex20.c` which labels one cell to refine.
>>
is
causing the crash.
```
Abort(59) on node 0 (rank 0 in comm 0): application called
MPI_Abort(MPI_COMM_WORLD, 59) - process 0
```
Thanks,
Zongze Yang
pre_adapt.pdf
Description: Adobe PDF document
post_adapt.pdf
Description: Adobe PDF document
Thank you for your reply. May I ask for some references on the ordering of the
dofs in PETSc's FE space (especially for high-order elements)?
Thanks,
Zongze
On Sat, 18 Jun 2022 at 20:02, Matthew Knepley wrote:
> On Sat, Jun 18, 2022 at 2:16 AM Zongze Yang wrote:
>
>> In order to check if I mad
relation between the closure and the order of the dofs for the cell?
Thanks!
Zongze
On Fri, 17 Jun 2022 at 01:11, Matthew Knepley wrote:
> On Thu, Jun 16, 2022 at 12:06 PM Zongze Yang wrote:
>
>>
>>
>> On 16 Jun 2022, at 23:22, Matthew Knepley wrote:
>>
>>
>> On Thu, J
> On 16 Jun 2022, at 23:22, Matthew Knepley wrote:
>
>
>> On Thu, Jun 16, 2022 at 11:11 AM Zongze Yang wrote:
>
>> Hi, if I load a `gmsh` file with second-order elements, the coordinates will
>> be stored in a DG-P2 space. After obtaining the coordinates of a cell, how
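Related to the closure-ordering questions above, here is a small petsc4py sketch that walks the transitive closure of one cell and prints, for each closure point, the coordinate dofs stored in the coordinate section, which is one way to see how the dof order relates to the closure order. The filename `mesh_p2.msh` is a placeholder; depending on the PETSc version, discontinuous Gmsh coordinates may instead live in a separate cell-coordinate section, so this assumes the standard (continuous) coordinate section.
```python
# Sketch: inspect how coordinate dofs are laid out over the closure of a cell.
# "mesh_p2.msh" is a placeholder for a second-order Gmsh mesh.
import sys
import petsc4py
petsc4py.init(sys.argv)
from petsc4py import PETSc

dm = PETSc.DMPlex().createFromFile("mesh_p2.msh")
coords = dm.getCoordinatesLocal()        # Vec holding the coordinate dofs
csec = dm.getCoordinateSection()         # PetscSection describing their layout
carr = coords.getArray(readonly=True)

cstart, cend = dm.getHeightStratum(0)    # height-0 points are the cells
if cend > cstart:
    cell = cstart                        # look at the first local cell only
    closure, orientations = dm.getTransitiveClosure(cell)
    for p in closure:
        dof = csec.getDof(p)             # number of coordinate dofs on this point
        off = csec.getOffset(p)          # their offset into the coordinate array
        if dof > 0:
            print(f"point {p}: coords {carr[off:off + dof]}")
```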