Where does global_vec come from, and are you sure that all MPI processes 
that share global_vec enter this code region?
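For reference, VecScatterCreate() is collective on the communicator of global_vec, so every rank in that communicator must reach the call, even a rank that needs no values (such a rank can pass an empty IS). Below is a minimal sketch of that pattern; the vector size and the indices requested are illustrative, not taken from your code:

```c
/* Sketch: scatter n entries of a distributed vector into a sequential
 * vector on every rank. All calls marked "collective" must be reached
 * by every rank sharing global_vec's communicator, or the program hangs. */
#include <petscvec.h>

int main(int argc, char **argv)
{
    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

    const PetscInt n = 12;  /* entries each rank wants (illustrative) */
    Vec            global_vec, local_vec;

    /* Collective: create and fill a distributed vector of global size 100 */
    PetscCall(VecCreate(PETSC_COMM_WORLD, &global_vec));
    PetscCall(VecSetSizes(global_vec, PETSC_DECIDE, 100));
    PetscCall(VecSetFromOptions(global_vec));
    PetscCall(VecSet(global_vec, 1.0));

    /* Each rank asks for the first n global entries (illustrative choice) */
    PetscInt idx_global[12], idx_local[12];
    for (PetscInt i = 0; i < n; i++) { idx_global[i] = i; idx_local[i] = i; }

    IS is_source, is_dest;
    PetscCall(ISCreateGeneral(PETSC_COMM_SELF, n, idx_global, PETSC_COPY_VALUES, &is_source));
    PetscCall(ISCreateGeneral(PETSC_COMM_SELF, n, idx_local, PETSC_COPY_VALUES, &is_dest));

    /* Sequential destination vector, one per rank */
    PetscCall(VecCreateSeq(PETSC_COMM_SELF, n, &local_vec));

    /* Collective on global_vec's communicator: ALL ranks must call this,
     * including ranks that request zero entries (use an empty IS there). */
    VecScatter scat;
    PetscCall(VecScatterCreate(global_vec, is_source, local_vec, is_dest, &scat));
    PetscCall(VecScatterBegin(scat, global_vec, local_vec, INSERT_VALUES, SCATTER_FORWARD));
    PetscCall(VecScatterEnd(scat, global_vec, local_vec, INSERT_VALUES, SCATTER_FORWARD));

    PetscCall(ISDestroy(&is_source));
    PetscCall(ISDestroy(&is_dest));
    PetscCall(VecScatterDestroy(&scat));
    PetscCall(VecDestroy(&local_vec));
    PetscCall(VecDestroy(&global_vec));
    PetscCall(PetscFinalize());
    return 0;
}
```

If only a subset of ranks executes the element loop that reaches this region, the remaining ranks never enter VecScatterCreate() and the ones that do will block, which matches the hang you describe.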



> On Dec 8, 2024, at 4:52 PM, Qiyue Lu <qiyue...@gmail.com> wrote:
> 
> Thank you all for the reply. Here is the code. Since I need to fetch data 
> from local_vec, I set its type to VECSEQ so that I can use VecGetValues(). 
>         std::vector<std::vector<double>> vc(numNodesSurface, std::vector<double>(numDimsProb)); // 2-D matrix storing velocity u, v
>         PetscScalar *ptr_vc = new double[numNodesSurface*numDimsProb]; // memory storing the fetched values
>         // numNodesSurface = 6 and numDimsProb = 2 are the number of nodes per element and the problem dimension
>         PetscInt idx_global[numNodesSurface*numDimsProb]; // for creating the global index set
>         PetscInt idx_local[numNodesSurface*numDimsProb];  // for creating the local index set
>         for (int i = 0; i < numNodesSurface; i++){
>             for (int j = 0; j < numDimsProb; j++){
>                 idx_local[i*numDimsProb + j] = i*numDimsProb + j;       // local index
>                 idx_global[i*numDimsProb + j] = node_el[i]*numDOFs + j; // global index
>             }
>         }
>         // global_vec is distributed and uses MPI_COMM_WORLD, which includes
>         // all processes, as its communicator. I want to fetch 12
>         // (numNodesSurface*numDimsProb) values from global_vec and put them
>         // in a local vector, so I set up local_vec as VECSEQ on PETSC_COMM_SELF.
>         IS is_source, is_dest;
>         ISCreateGeneral(PETSC_COMM_SELF, numNodesSurface*numDimsProb, idx_global, PETSC_COPY_VALUES, &is_source);
>         ISCreateGeneral(PETSC_COMM_SELF, numNodesSurface*numDimsProb, idx_local, PETSC_COPY_VALUES, &is_dest);
>         Vec local_vec;
>         VecCreate(PETSC_COMM_SELF, &local_vec);
>         VecSetSizes(local_vec, PETSC_DECIDE, numNodesSurface*numDimsProb); 
>         VecSetType(local_vec, VECSEQ);
>         VecScatter scat;
>         VecScatterCreate(global_vec, is_source, local_vec, is_dest, &scat); // got stuck here
>         VecScatterBegin(scat, global_vec, local_vec, INSERT_VALUES, SCATTER_FORWARD);
>         VecScatterEnd(scat, global_vec, local_vec, INSERT_VALUES, SCATTER_FORWARD);
>         VecGetValues(local_vec, numNodesSurface*numDimsProb, idx_local, ptr_vc);
>         for (int i = 0; i < numNodesSurface; i++){
>             for (int j = 0; j < numDimsProb; j++){
>                 vc[i][j] = ptr_vc[i*numDimsProb + j]; // From 1-D to 2-D
>             }
>         }
>         ISDestroy(&is_source);
>         ISDestroy(&is_dest);
>         VecDestroy(&local_vec);
>         VecScatterDestroy(&scat);
> 
> On Sun, Dec 8, 2024 at 11:12 AM Barry Smith <bsm...@petsc.dev> wrote:
>> 
>>    You can scatter from a global vector to a local vector; numerous PETSc 
>> examples demonstrate this, so hanging here is surprising. Please show the 
>> entire code so we can see the context of the VecScatterCreate() usage. 
>> Perhaps not all the MPI processes in the global_vec communicator are 
>> involved in the call to VecScatterCreate(). 
>> 
>>   To determine where the hang occurs, you can run with -start_in_debugger, 
>> type c (continue) in each debugger window, and then, after the code has hung 
>> for a long time, press Ctrl-D in the hanging windows and type bt to see 
>> where the code is stuck.
>> 
>>   Barry
>>  
>> 
>>> On Dec 7, 2024, at 9:47 PM, Qiyue Lu <qiyue...@gmail.com> wrote:
>>> 
>>> Hello,
>>> I am trying to fetch 12 entries from a distributed vector global_vec and 
>>> put them into a local vector on each process. 
>>> 
>>>         IS is_source, is_dest;
>>>         ISCreateGeneral(PETSC_COMM_SELF, 12, idx_global, PETSC_COPY_VALUES, &is_source);
>>>         ISCreateGeneral(PETSC_COMM_SELF, 12, idx_local, PETSC_COPY_VALUES, &is_dest);
>>>         Vec local_vec;
>>>         VecCreate(PETSC_COMM_SELF, &local_vec);
>>>         VecSetSizes(local_vec, PETSC_DECIDE, 12); 
>>>         VecSetType(local_vec, VECSEQ);
>>>         VecScatter scat;
>>>         VecScatterCreate(global_vec, is_source, local_vec, is_dest, &scat);
>>> 
>>> 
>>> I create the local vector as sequential. However, the last two lines, 
>>> which create the scatter object, cause more than half of the processes to 
>>> hang, with no error message. 
>>> 
>>> Does the scatter have to be from VECMPI to VECMPI, or can it go from VECMPI to VECSEQ?
>>> 
>>> Thanks,
>>> Qiyue Lu
>> 
