Re: [petsc-users] Read/Write large dense matrix

2024-08-06 Thread Barry Smith
I have removed an unnecessary PetscMPIIntCast() on MPI rank zero that was causing your test code to fail. See https://gitlab.com/petsc/petsc/-/merge_requests/7747

Re: [petsc-users] Read/Write large dense matrix

2024-08-05 Thread Sreeram R Venkat
Here's an example code that should replicate the error: https://github.com/s769/petsc-test/tree/master . I tried using PETSC_FORMAT_NATIVE, but I still get the error …

Re: [petsc-users] Read/Write large dense matrix

2024-08-05 Thread Barry Smith
By default, MatView() to a PETSc binary viewer uses the "standard" compressed sparse storage format. This is not efficient (or reasonable) for dense matrices and produces issues with integer overflow. To store a dense matrix as dense on disk, use the PetscViewerFormat PETSC_VIEWER_NATIVE.
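A minimal sketch of what Barry describes, assuming an already-assembled dense Mat A and a hypothetical file name; the key step is pushing PETSC_VIEWER_NATIVE onto the binary viewer before calling MatView():

#include <petscmat.h>

PetscErrorCode write_dense_native(Mat A, const char *filename)
{
  PetscViewer viewer;

  PetscFunctionBeginUser;
  PetscCall(PetscViewerBinaryOpen(PetscObjectComm((PetscObject)A), filename, FILE_MODE_WRITE, &viewer));
  /* Keep the dense layout on disk instead of the default compressed sparse format */
  PetscCall(PetscViewerPushFormat(viewer, PETSC_VIEWER_NATIVE));
  PetscCall(MatView(A, viewer));
  PetscCall(PetscViewerPopFormat(viewer));
  PetscCall(PetscViewerDestroy(&viewer));
  PetscFunctionReturn(PETSC_SUCCESS);
}

Reading the file back is then the usual MatCreate()/MatSetType(A, MATDENSE)/MatLoad() sequence with a binary viewer.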

Re: [petsc-users] Read/Write large dense matrix

2024-08-05 Thread Matthew Knepley
On Mon, Aug 5, 2024 at 1:26 PM Sreeram R Venkat wrote: > I do have 64-bit indices turned on. The problem I think is that the PetscMPIInt is always a 32-bit int, and that's what's overflowing. We should be using the large count support from MPI. However, it appears we forgot somewhere. Would i…
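For reference, a sketch of what MPI's large count support looks like at the MPI level (this is not the PETSc fix itself, and the function name and buffer are illustrative assumptions): the "_c" variants added in MPI 4.0 take an MPI_Count, so counts above 2^31-1 need no narrowing cast to int.

#include <mpi.h>

/* Send n doubles using the MPI 4.0 large-count variant of MPI_Send.
   MPI_Count is a 64-bit type, so n may exceed 2^31 - 1. */
void send_large(const double *buf, long long n, int dest, MPI_Comm comm)
{
  MPI_Send_c(buf, (MPI_Count)n, MPI_DOUBLE, dest, 0 /* tag */, comm);
}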

Re: [petsc-users] Read/Write large dense matrix

2024-08-05 Thread Sreeram R Venkat
I do have 64-bit indices turned on. The problem I think is that the PetscMPIInt is always a 32-bit int, and that's what's overflowing. On Mon, Aug 5, 2024 at 12:25 PM Matthew Knepley wrote: > On Mon, Aug 5, 2024 at 1:10 PM Sreeram R Venkat wrote: >> I have a large dense matrix (size ranging …
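A short sketch of where the overflow comes from, assuming a build with --with-64-bit-indices (so PetscInt is 64-bit) while PetscMPIInt remains a 32-bit int; the count and function name here are made up for illustration:

#include <petscsys.h>

static PetscErrorCode show_overflow(void)
{
  PetscInt    nnz = (PetscInt)60000 * 60000; /* ~3.6e9, well above 2^31 - 1 */
  PetscMPIInt n;

  PetscFunctionBeginUser;
  /* PetscMPIIntCast() errors out here because nnz does not fit in a 32-bit int;
     this is the kind of overflow hit when MPI counts are built from PetscInt values */
  PetscCall(PetscMPIIntCast(nnz, &n));
  PetscFunctionReturn(PETSC_SUCCESS);
}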

Re: [petsc-users] Read/Write large dense matrix

2024-08-05 Thread Matthew Knepley
On Mon, Aug 5, 2024 at 1:10 PM Sreeram R Venkat wrote: > I have a large dense matrix (size ranging from 5e4 to 1e5) that arises as a result of doing MatComputeOperator() on a MatShell. When the total number of nonzeros exceeds the 32-bit integer limit, I get an error (MPI buffer size too big) …

[petsc-users] Read/Write large dense matrix

2024-08-05 Thread Sreeram R Venkat
I have a large dense matrix (size ranging from 5e4 to 1e5) that arises as a result of doing MatComputeOperator() on a MatShell. When the total number of nonzeros exceeds the 32-bit integer limit, I get an error (MPI buffer size too big) when trying to do MatView() on this to save to binary. Is there …
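A minimal sketch of the workflow being described, assuming a hypothetical MatShell named shell and an illustrative helper name; MatComputeOperator() forms the explicit dense matrix, and the plain MatView() to a binary viewer is where the reported overflow appears:

#include <petscmat.h>

PetscErrorCode dump_operator(Mat shell, const char *filename)
{
  Mat         A;
  PetscViewer viewer;

  PetscFunctionBeginUser;
  /* Build the explicit dense operator from the shell matrix */
  PetscCall(MatComputeOperator(shell, MATDENSE, &A));
  PetscCall(PetscViewerBinaryOpen(PetscObjectComm((PetscObject)A), filename, FILE_MODE_WRITE, &viewer));
  PetscCall(MatView(A, viewer)); /* default (sparse) binary format: overflows once nnz > 2^31 - 1 */
  PetscCall(PetscViewerDestroy(&viewer));
  PetscCall(MatDestroy(&A));
  PetscFunctionReturn(PETSC_SUCCESS);
}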