Junchao,
It won't be feasible to share the code, but I will run a similar test to
the one you have done (large problem); I will
try with both MPICH and OpenMPI. I also agree that deltas are not ideal,
as they do not account for latency in the freeing of memory,
etc. But I will note when we have th
On Mon, Jun 3, 2019 at 6:56 PM Zhang, Junchao via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> On Mon, Jun 3, 2019 at 5:23 PM Stefano Zampini <stefano.zamp...@gmail.com>
> wrote:
>
>> On Jun 4, 2019, at 1:17 AM, Zhang, Junchao via petsc-users <
>> petsc-users@mcs.anl.gov> wrote:
>>
>> Sanjay & Barry,
>> Sorry, I made a mistake in saying that I could reproduce Sanjay's
>> experiments. I found that 1) to correctly use
>> PetscMallocGetCurrentUsage() when PETSc is configured without debugging,
>> I have to add -malloc to run the program, and 2) I have to instrument
>> the code outside of KSPSolve(). In
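
For reference, instrumenting around (rather than inside) KSPSolve() might look roughly like the sketch below. This is not anyone's actual code from the thread: the function name SolveAndReportMemory and the assumption that ksp, b, and x are already set up are mine; the PETSc calls (PetscMallocGetCurrentUsage, KSPSolve, PetscPrintf) are real API.

```c
/* Minimal sketch (hypothetical, not the code discussed in the thread) of
 * measuring PETSc heap usage across a solve. Assumes 'ksp', 'b', and 'x'
 * are created and configured elsewhere. With a non-debug PETSc build, run
 * with -malloc so that PetscMalloc usage tracking is enabled. */
#include <petscksp.h>

PetscErrorCode SolveAndReportMemory(KSP ksp, Vec b, Vec x)
{
  PetscErrorCode ierr;
  PetscLogDouble before, after;

  ierr = PetscMallocGetCurrentUsage(&before);CHKERRQ(ierr);
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);
  ierr = PetscMallocGetCurrentUsage(&after);CHKERRQ(ierr);
  /* As noted above, the delta can be misleading: freed memory may be
   * returned to the allocator lazily, so it under/over-counts. */
  ierr = PetscPrintf(PETSC_COMM_WORLD,
                     "malloc usage: before %g, after %g, delta %g bytes\n",
                     before, after, after - before);CHKERRQ(ierr);
  return 0;
}
```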