Re: [petsc-users] Incoherent data entries in array from a dense sub matrix

2025-02-07 Thread medane.tchako...@univ-fcomte.fr
Re:
Please find below the output from the previous code, running on only one 
processor.

Mat Object: 1 MPI process
  type: seqdense
7.2003197397953400e-01 3.980919128602e-01 9.8405227390177075e-01 1.4405427480322786e-01
6.1793966542126100e-02 7.3036588248200474e-02 7.3851607000756303e-01 9.9650445216117589e-01
1.0022337819588500e-02 1.0386628927366459e-01 4.0114727059134836e-01 1.0677308875937896e-01
1.4463931936456476e-01 2.5078039364333193e-01 5.2764865382548720e-01 9.8905332488367748e-01

buffer[0] = 7.200320e-01
buffer[1] = 6.179397e-02
buffer[2] = 1.002234e-02
buffer[3] = 1.446393e-01
buffer[4] = 0.00e+00
buffer[5] = 0.00e+00
buffer[6] = 0.00e+00
buffer[7] = 0.00e+00
buffer[8] = 3.98e-01
buffer[9] = 7.303659e-02
buffer[10] = 1.038663e-01
buffer[11] = 2.507804e-01
buffer[12] = 0.00e+00
buffer[13] = 0.00e+00
buffer[14] = 0.00e+00
buffer[15] = 0.00e+00

Mat Object: 1 MPI process
  type: seqdense
7.2003197397953400e-01 3.980919128602e-01 9.8405227390177075e-01 1.4405427480322786e-01
6.1793966542126100e-02 7.3036588248200474e-02 7.3851607000756303e-01 9.9650445216117589e-01
1.0022337819588500e-02 1.0386628927366459e-01 4.0114727059134836e-01 1.0677308875937896e-01
1.4463931936456476e-01 2.5078039364333193e-01 5.2764865382548720e-01 9.8905332488367748e-01
0.e+00 0.e+00 0.e+00 0.e+00
0.e+00 0.e+00 0.e+00 0.e+00
0.e+00 0.e+00 0.e+00 0.e+00
0.e+00 0.e+00 0.e+00 0.e+00


I was expecting “buffer” to contain only the data entries from R_part. Please 
let me know whether this is the expected behavior and I am missing something.

Thanks,
Medane



> On 7 Feb 2025, at 11:34, Pierre Jolivet  wrote:
> 
> 
> 
>> On 7 Feb 2025, at 11:05 AM, medane.tchako...@univ-fcomte.fr wrote:
>> 
>> 
>> Dear all,
>> 
>> I have been getting incoherent data entries from the code below when 
>> printing the array. Maybe I am doing something wrong.
> 
> What is incoherent?
> Everything looks OK to me.
> 
> Thanks,
> Pierre
> 
>> 
>> 
>>PetscInt nlines = 8; // lines
>>   PetscInt ncols = 4;  // columns
>>   PetscMPIInt rank;
>>   PetscMPIInt size;
>> 
>>   // Initialize PETSc
>>   PetscCall(PetscInitialize(&argc, &args, NULL, NULL));
>>   PetscCallMPI(MPI_Comm_rank(MPI_COMM_WORLD, &rank));
>>   PetscCallMPI(MPI_Comm_size(MPI_COMM_WORLD, &size));
>> 
>>   Mat R_full;
>>   Mat R_part;
>>   PetscInt idx_first_row = 0;
>>   PetscInt idx_one_plus_last_row = nlines / 2;
>>   PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 
>> nlines, ncols, NULL, &R_full));
>> 
>>   // Get sub matrix
>>   PetscCall(MatDenseGetSubMatrix(R_full, idx_first_row, 
>> idx_one_plus_last_row, PETSC_DECIDE, PETSC_DECIDE, &R_part));
>>   // Add entries to sub matrix
>>   MatSetRandom(R_part, NULL);
>>   //View sub matrix
>>   PetscCall(MatView(R_part, PETSC_VIEWER_STDOUT_WORLD));
>> 
>>   // Get array from sub matrix and print entries
>>   PetscScalar *buffer;
>>   PetscCall(MatDenseGetArray(R_part, &buffer));
>>   PetscInt idx_end = (nlines/2) * ncols;
>> 
>>   for (int i = 0; i < idx_end; i++)
>>   {
>>   PetscPrintf(PETSC_COMM_SELF, "buffer[%d] = %e \n", i, buffer[i]);
>>   }
>> 
>>   //Restore array to sub matrix
>>   PetscCall(MatDenseRestoreArray(R_part, &buffer));
>>   // Restore sub matrix
>>   PetscCall(MatDenseRestoreSubMatrix(R_full, &R_part));
>>   // View the initial matrix
>>   PetscCall(MatView(R_full, PETSC_VIEWER_STDOUT_WORLD));
>> 
>>   PetscCall(MatDestroy(&R_full));
>> 
>>   PetscCall(PetscFinalize());
>>   return 0;
>> 
>> 
>> 
>> 
>> Thanks
>> Medane
> 
> 



[petsc-users] Incoherent data entries in array from a dense sub matrix

2025-02-07 Thread medane.tchako...@univ-fcomte.fr


Dear all,

I have been getting incoherent data entries from the code below when printing 
the array. Maybe I am doing something wrong.



 PetscInt nlines = 8; // lines
PetscInt ncols = 4;  // columns
PetscMPIInt rank;
PetscMPIInt size;

// Initialize PETSc
PetscCall(PetscInitialize(&argc, &args, NULL, NULL));
PetscCallMPI(MPI_Comm_rank(MPI_COMM_WORLD, &rank));
PetscCallMPI(MPI_Comm_size(MPI_COMM_WORLD, &size));

Mat R_full;
Mat R_part;
PetscInt idx_first_row = 0;
PetscInt idx_one_plus_last_row = nlines / 2;
PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 
nlines, ncols, NULL, &R_full));

// Get sub matrix
PetscCall(MatDenseGetSubMatrix(R_full, idx_first_row, 
idx_one_plus_last_row, PETSC_DECIDE, PETSC_DECIDE, &R_part));
// Add entries to sub matrix
MatSetRandom(R_part, NULL);
//View sub matrix
PetscCall(MatView(R_part, PETSC_VIEWER_STDOUT_WORLD));

// Get array from sub matrix and print entries
PetscScalar *buffer;
PetscCall(MatDenseGetArray(R_part, &buffer));
PetscInt idx_end = (nlines/2) * ncols;

for (int i = 0; i < idx_end; i++)
{
PetscPrintf(PETSC_COMM_SELF, "buffer[%d] = %e \n", i, buffer[i]);
}

//Restore array to sub matrix
PetscCall(MatDenseRestoreArray(R_part, &buffer));
// Restore sub matrix
PetscCall(MatDenseRestoreSubMatrix(R_full, &R_part));
// View the initial matrix
PetscCall(MatView(R_full, PETSC_VIEWER_STDOUT_WORLD));

PetscCall(MatDestroy(&R_full));

PetscCall(PetscFinalize());
return 0;




Thanks
Medane

[petsc-users] Copy dense matrix into half part of another dense matrix

2025-01-27 Thread medane.tchako...@univ-fcomte.fr
Dear PETSc users,

I hope this message finds you well. I don’t know if my question is relevant, 
but I am currently working with DENSE-type matrices and would like to copy one 
matrix R_part [n/2 x m] (the result of a MatMatMult operation) into another 
dense matrix R_full [n x m].
Both matrices live on the same communicator, and I would like to efficiently 
copy R_part into the first half of R_full.
I have been using MatSetValues, but for large matrices the subsequent assembly 
operation is costly.
Could you please suggest some strategies or functions to do this efficiently?

Thank you for your time and assistance.

Best regards,
Tchakorom Medane
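
One MatSetValues-free approach, sketched below on the assumption that R_part 
already shares the parallel row distribution of the first n/2 rows of R_full 
(the reply below covers redistributing it when the layouts differ): take a 
dense sub-matrix view of R_full and MatCopy into it. R_top is a hypothetical 
name introduced here; R_full, R_part and n are the names from the question.

    Mat R_top; /* hypothetical name: view of the first n/2 rows of R_full */
    PetscCall(MatDenseGetSubMatrix(R_full, 0, n / 2, PETSC_DECIDE, PETSC_DECIDE, &R_top));
    PetscCall(MatCopy(R_part, R_top, SAME_NONZERO_PATTERN)); /* row/column layouts must match */
    PetscCall(MatDenseRestoreSubMatrix(R_full, &R_top));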



Re: [petsc-users] Copy dense matrix into half part of another dense matrix

2025-01-28 Thread medane.tchako...@univ-fcomte.fr

Re:

Thank you, Pierre, I really appreciate it. I am testing it right now to assess 
the improvements.

BR,
Medane


> On 27 Jan 2025, at 20:19, Pierre Jolivet  wrote:
> 
> Please always keep the list in copy.
> The way you create A is not correct, I’ve attached a fixed code.
> If you want to keep your own distribution for A (and not the one associated 
> to R_part), you’ll need to first call 
> https://urldefense.us/v3/__https://petsc.org/main/manualpages/Mat/MatCreateSubMatrix/__;!!G_uCfscf7eWS!Ye7A4fqD5xLobOPjgvYkh9cj1-JExIqX_EJIHFm-NHw5rEk2PU5kvs3GfKlJd2TZPorWhvb0Jh7eTcKii9t7Z7tgYSIoeSHTchrF1snH$
>   to redistribute A and then do a MatCopy() of the resulting Mat into R_part
> 
> Thanks,
> Pierre
> 
> $ /Volumes/Data/repositories/petsc/arch-darwin-c-debug-real/bin/mpirun -n 4 
> ./ex1234
> Mat Object: 4 MPI processes
>   type: mpidense
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
> Mat Object: 4 MPI processes
>   type: mpidense
>   2.6219599187040323e+00   1.9661197867318445e+00   1.5218640363910978e+00   
>   3.5202261875977947e+00   3.6311893358251384e+00   2.2279492868785069e+00   
>   2.7505403755038014e+00   3.1546072728892756e+00   1.8416294994524489e+00   
>   2.4676055638467314e+00   2.3185625557889602e+00   2.0401666986599833e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.e+00   0.e+00   
>   0.e+00   0.0000e+00   0.e+00   
> 
> 
> 
> 
> 
>> On 27 Jan 2025, at 6:53 PM, medane.tchako...@univ-fcomte.fr wrote:
>> 
>> Re:
>> 
>> This is a small reproducible example using MatDenseGetSubMatrix
>> 
>> 
>> Command: petscmpiexec -n 4 ./example
>> 
>> ==
>> 
>>PetscInt nlines = 8;   // lines
>> PetscInt ncolumns = 3; // columns
>> PetscInt random_size = 12;
>> PetscInt rank;
>> PetscInt size;
>> 
>> // Initialize PETSc
>> PetscInitialize(&argc, &args, NULL, NULL);
>> 
>> MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>> MPI_Comm_size(MPI_COMM_WORLD, &size);
>> 
>> // R_full with all values to zero
>> Mat R_full;
>> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, nlines, 
>> ncolumns, NULL, &R_full);
>> MatZeroEntries(R_full);
>> MatView(R_full, PETSC_VIEWER_STDOUT_WORLD);
>> 
>> // Creating and setting A and S to rand values
>> Mat A, S;
>> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, nlines / 2, 
>> random_size, NULL, &A);
>> MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 
>> random_size, ncolumns, NULL, &S);
>> MatSetRandom(A, NULL);
>> MatSetRandom(S, NULL);
>> 
>> // Computing R_part
>> Mat R_part;
>> MatDenseGetSubMatrix(R_full, PETSC_DECIDE, nlines / 2, PETSC_DECIDE, 
>> PETSC_DECIDE, &R_part);
>> MatMatMult(A, S, MAT_REUSE_MATRIX, PETSC_DECIDE, &R_part);
>> 
>> // Visualizing R_full
>> MatDenseRestoreSubMatrix(R_full, &R_part);
>> MatView(R_full, PETSC_VIEWER_STDOUT_WORLD);
>> 
>> // Destroying matrices
>> MatDestroy(&R_part);
>> MatDestroy(&R_full);
>> 
>> PetscFinalize();
>> return 0;
>> 
>> ==
>> 
>> 
>> Part of the error output contains….:
>> 
>> "Cannot change/reset row sizes to 1 local 4 global after previously setting 
>> them to 2 local 4 global ….”
>> 
>> 
>> 
>> 
>> ==
>> 
>> PetscInt nlines = 8;   // lines
>> PetscInt ncolumns = 3; // columns
>> PetscInt random_size = 12;
>> PetscInt rank;
>> PetscInt size;
>> 
>>
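
A minimal sketch of the distribution fix Pierre describes (his attached file 
is not in the archive): query the local row size of R_part and create A with 
that same local size, so that MatMatMult with MAT_REUSE_MATRIX can write 
straight into the sub-matrix instead of trying to change its row layout. 
Variable names follow the example above; mlocal is introduced here, and it is 
assumed that MAT_REUSE_MATRIX is accepted for a user-provided dense product 
matrix.

    Mat      R_part, A, S;
    PetscInt mlocal;
    /* View of the first nlines/2 rows of R_full; its row layout is inherited from R_full */
    PetscCall(MatDenseGetSubMatrix(R_full, PETSC_DECIDE, nlines / 2, PETSC_DECIDE, PETSC_DECIDE, &R_part));
    PetscCall(MatGetLocalSize(R_part, &mlocal, NULL));
    /* Create A with the same local row size as R_part, instead of PETSC_DECIDE */
    PetscCall(MatCreateDense(PETSC_COMM_WORLD, mlocal, PETSC_DECIDE, PETSC_DETERMINE, random_size, NULL, &A));
    PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, random_size, ncolumns, NULL, &S));
    PetscCall(MatSetRandom(A, NULL));
    PetscCall(MatSetRandom(S, NULL));
    PetscCall(MatMatMult(A, S, MAT_REUSE_MATRIX, PETSC_DEFAULT, &R_part)); /* fills the top half of R_full */
    PetscCall(MatDenseRestoreSubMatrix(R_full, &R_part));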

Re: [petsc-users] Incoherent data entries in array from a dense sub matrix

2025-02-08 Thread medane.tchako...@univ-fcomte.fr

Dear petsc team,

Thank you for all your answers. I really appreciate.

Best regards,
Medane


> On 7 Feb 2025, at 15:22, Matthew Knepley  wrote:
> 
> On Fri, Feb 7, 2025 at 8:20 AM medane.tchako...@univ-fcomte.fr wrote:
>> Re:
>> Please find below the output from the previous code, running on only one 
>> processor.
>> 
>> Mat Object: 1 MPI process
>>   type: seqdense
>> 7.2003197397953400e-01 3.980919128602e-01 9.8405227390177075e-01 1.4405427480322786e-01
>> 6.1793966542126100e-02 7.3036588248200474e-02 7.3851607000756303e-01 9.9650445216117589e-01
>> 1.0022337819588500e-02 1.0386628927366459e-01 4.0114727059134836e-01 1.0677308875937896e-01
>> 1.4463931936456476e-01 2.5078039364333193e-01 5.2764865382548720e-01 9.8905332488367748e-01
>> 
>> buffer[0] = 7.200320e-01
>> buffer[1] = 6.179397e-02
>> buffer[2] = 1.002234e-02
>> buffer[3] = 1.446393e-01
>> buffer[4] = 0.00e+00
>> buffer[5] = 0.00e+00
>> buffer[6] = 0.00e+00
>> buffer[7] = 0.00e+00
>> buffer[8] = 3.98e-01
>> buffer[9] = 7.303659e-02
>> buffer[10] = 1.038663e-01
>> buffer[11] = 2.507804e-01
>> buffer[12] = 0.00e+00
>> buffer[13] = 0.00e+00
>> buffer[14] = 0.00e+00
>> buffer[15] = 0.00e+00
>> 
>> Mat Object: 1 MPI process
>>   type: seqdense
>> 7.2003197397953400e-01 3.980919128602e-01 9.8405227390177075e-01 1.4405427480322786e-01
>> 6.1793966542126100e-02 7.3036588248200474e-02 7.3851607000756303e-01 9.9650445216117589e-01
>> 1.0022337819588500e-02 1.0386628927366459e-01 4.0114727059134836e-01 1.0677308875937896e-01
>> 1.4463931936456476e-01 2.5078039364333193e-01 5.2764865382548720e-01 9.8905332488367748e-01
>> 0.e+00 0.e+00 0.e+00 0.e+00
>> 0.e+00 0.e+00 0.e+00 0.e+00
>> 0.e+00 0.e+00 0.e+00 0.e+00
>> 0.e+00 0.e+00 0.e+00 0.e+00
>> 
>> 
>> I was expecting “buffer” to contain only the data entries from R_part. 
>> Please let me know whether this is the expected behavior and I am missing 
>> something.
> 
> As Jose already pointed out, SubMatrix() does not copy. It gives you a Mat 
> front end to the same data, but with changed sizes. In this case, the LDA is 
> 4, not 2, so when you iterate over the values, you skip over the ones you 
> don't want.
> 
>   Thanks,
> 
> Matt
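
For reference, a minimal sketch (not part of the original exchange) of a loop 
that respects the leading dimension Matt mentions: MatDenseGetLDA() returns 
the stride between consecutive columns of the parent's column-major storage, 
so indexing buffer[i + j*lda] touches only the sub-matrix entries. 
PetscRealPart() is used only so the value prints with %g in both real and 
complex builds; the variable names follow the code quoted above.

    const PetscScalar *buffer;
    PetscInt           mloc, N, lda;
    PetscCall(MatGetLocalSize(R_part, &mloc, NULL)); /* locally owned rows of the sub-matrix */
    PetscCall(MatGetSize(R_part, NULL, &N));         /* columns (all stored locally for dense) */
    PetscCall(MatDenseGetLDA(R_part, &lda));         /* column stride of the underlying storage */
    PetscCall(MatDenseGetArrayRead(R_part, &buffer));
    for (PetscInt j = 0; j < N; j++)
      for (PetscInt i = 0; i < mloc; i++)
        PetscCall(PetscPrintf(PETSC_COMM_SELF, "R_part(%" PetscInt_FMT ",%" PetscInt_FMT ") = %g\n", i, j, (double)PetscRealPart(buffer[i + j * lda])));
    PetscCall(MatDenseRestoreArrayRead(R_part, &buffer));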
>  
>> Thanks,
>> Medane
>> 
>> 
>> 
>> > On 7 Feb 2025, at 11:34, Pierre Jolivet wrote:
>> > 
>> > 
>> >> On 7 Feb 2025, at 11:05 AM, medane.tchako...@univ-fcomte.fr wrote:
>> >> 
>> >> 
>> >> Dear all,
>> >> 
>> >> I have been getting incoherent data entries from the code below 
>> >> when printing the array. Maybe I am doing something wrong.
>> > 
>> > What is incoherent?
>> > Everything looks OK to me.
>> > 
>> > Thanks,
>> > Pierre
>> > 
>> >> 
>> >> 
>> >>PetscInt nlines = 8; // lines
>> >>   PetscInt ncols = 4;  // columns
>> >>   PetscMPIInt rank;
>> >>   PetscMPIInt size;
>> >> 
>> >>   // Initialize PETSc
>> >>   PetscCall(PetscInitialize(&argc, &args, NULL, NULL));
>> >>   PetscCallMPI(MPI_Comm_rank(MPI_COMM_WORLD, &rank));
>> >>   PetscCallMPI(MPI_Comm_size(MPI_COMM_WORLD, &size));
>> >> 
>> >>   Mat R_full;
>> >>   Mat R_part;
>> >>   PetscInt idx_first_row = 0;
>> >>   PetscInt idx_one_plus_last_row = nlines / 2;
>> >>   PetscCall(MatCreateDense(PETSC_COMM_WORLD, PETSC_DECIDE, PETSC_DECIDE, 
>> >> nlines, ncols, NULL, &R_full));
>> >> 
>> >>   // Get sub matrix
>> >>   PetscCall(MatDenseGetSubMatrix(R_full, idx_first_row, 
>> >> idx_one_plus_last_row, PETSC_DECIDE, PETSC_DECIDE, &R_part));
>> >>   // Add entries to sub matrix
>> >>   MatSetRandom(R_part, NULL);
>> >>   //View su