Fabian,
That is indeed a typo. Thanks very much for pointing it out.
Cheers,
David
On 25/11/2024 08:45, Fabian.Jakub via petsc-users wrote:
test_configuration_options.F90:l.55
max_msg_length is quite large; I guess the pow() is a typo.
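For what it's worth, here is a purely hypothetical sketch of the effect (the value and usage below are invented; only the name max_msg_length comes from the thread): an exponentiation where a multiplication was meant makes a per-rank buffer huge, and since every rank allocates one, the job's memory grows linearly with the number of ranks while PETSc's own malloc counters stay tiny.

program msg_buffer_demo
  implicit none
  ! Invented numbers: suppose 4096*2 (8 KB) was intended but 4096**2 (16 MB) was written.
  integer, parameter :: max_msg_length = 4096**2
  character(len=:), allocatable :: msg

  ! One such buffer on every MPI rank; none of it goes through PetscMalloc.
  allocate(character(len=max_msg_length) :: msg)
  print *, 'buffer length:', len(msg), 'characters'
end program msg_buffer_demo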
Cheers,
Fabian
On 11/25/24 09:32, David Scott wrote:
I'll have a look at heaptrack.
The code that I am looking at the moment does not create a mesh. All it
does is read a petscrc file.
Thanks,
David
On 25/11/2024 05:27, Jed Brown wrote:
You're clearly doing almost all your allocation *not* using PetscMalloc (so not
in a Vec or Mat). If you're managing your own mesh, you might be allocating a
global amount on each rank, instead of strictly using scalable data structures
(i.e., always partitioned).
My favorite tool for this is heaptrack.
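If you want to see that split from inside the code, here is a minimal sketch (assuming the Fortran bindings for PetscMemoryGetCurrentUsage and PetscMallocGetCurrentUsage are available in your PETSc build; in an optimized build the PetscMalloc figure may need malloc tracking enabled, e.g. -malloc_debug) that prints the process resident set size next to what has gone through PetscMalloc on each rank:

program mem_report
#include <petsc/finclude/petscsys.h>
  use petscsys
  implicit none

  PetscErrorCode :: ierr
  PetscLogDouble :: rss, malloced
  PetscMPIInt    :: rank

  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)
  call MPI_Comm_rank(PETSC_COMM_WORLD, rank, ierr)

  ! Resident set size of the whole process vs. memory obtained via PetscMalloc.
  call PetscMemoryGetCurrentUsage(rss, ierr)
  call PetscMallocGetCurrentUsage(malloced, ierr)

  print '(A,I4,A,F14.0,A,F14.0,A)', '[', rank, '] process: ', rss, &
        ' bytes, PetscMalloc: ', malloced, ' bytes'

  call PetscFinalize(ierr)
end program mem_report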
OK.
I had started to wonder if that was the case. I'll do some further
investigation.
Thanks,
David
On 22/11/2024 22:10, Matthew Knepley wrote:
On Fri, Nov 22, 2024 at 12:57 PM David Scott wrote:
> Matt,
>
> Thanks for the quick response.
>
> Yes 1) is trivially true.
>
> With regard to 2), from the SLURM output:
> [0] Maximum memory PetscMalloc()ed 29552 maximum size of entire process
> 4312375296
> [1] Maximum memory PetscMalloc()ed 29552 maximum size of entire process
> 4311990272
Yes, only 29KB was malloced.
On Fri, Nov 22, 2024 at 11:36 AM David Scott wrote:
> Hello,
>
> I am using the options mechanism of PETSc to configure my CFD code. I
> have introduced options describing the size of the domain etc. I have
> noticed that this consumes a lot of memory. I have found that the amount
> of memory used scales linearly with the number of MPI processes used.
> This re
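For reference, here is a minimal sketch of a program that does nothing but initialize PETSc and read a couple of options, which is roughly what the test does. The option names -domain_lx and -domain_nx are invented for this sketch and are not the real code's options; PETSc picks up options from ~/.petscrc and ./.petscrc automatically, or from a file passed with -options_file.

program read_config
#include <petsc/finclude/petscsys.h>
  use petscsys
  implicit none

  PetscErrorCode :: ierr
  PetscReal      :: lx
  PetscInt       :: nx
  PetscBool      :: set

  ! Defaults, overridden by the options database (command line, petscrc, -options_file).
  lx = 1.0
  nx = 64

  call PetscInitialize(PETSC_NULL_CHARACTER, ierr)

  call PetscOptionsGetReal(PETSC_NULL_OPTIONS, PETSC_NULL_CHARACTER, '-domain_lx', lx, set, ierr)
  call PetscOptionsGetInt(PETSC_NULL_OPTIONS, PETSC_NULL_CHARACTER, '-domain_nx', nx, set, ierr)

  print *, 'domain length:', lx, ' cells:', nx

  call PetscFinalize(ierr)
end program read_config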