Dear PETSc users,

I would like to use Petsc4Py for a project extension, which consists mainly of:

- Storing data and matrices across several ranks/nodes when they cannot fit on a single node.

- Performing linear algebra in parallel (solving sparse linear systems, for instance; see the sketch after this list).

- Exchanging those data structures (parallel vectors) between non-overlapping MPI communicators, created for instance by splitting MPI_COMM_WORLD.
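
To make the first two items concrete, here is a minimal petsc4py sketch of the kind of thing I have in mind (the tridiagonal test matrix and the sizes are placeholders, not my actual problem):

    from petsc4py import PETSc

    n = 100  # global problem size (placeholder)
    comm = PETSc.COMM_WORLD

    # distributed sparse matrix, rows spread over all ranks
    A = PETSc.Mat().createAIJ([n, n], comm=comm)
    A.setPreallocationNNZ(3)
    rstart, rend = A.getOwnershipRange()
    for i in range(rstart, rend):
        A[i, i] = 2.0
        if i > 0:
            A[i, i - 1] = -1.0
        if i < n - 1:
            A[i, i + 1] = -1.0
    A.assemble()

    b = A.createVecRight()
    b.set(1.0)
    x = A.createVecLeft()

    # parallel Krylov solve, configurable from the command line
    ksp = PETSc.KSP().create(comm=comm)
    ksp.setOperators(A)
    ksp.setFromOptions()
    ksp.solve(b, x)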

While the first two items seem to be well addressed by PETSc, I am wondering
about the last one.

Is it possible to access the data of a vector defined on one communicator from
another, non-overlapping communicator? From what I have seen in the
documentation and in several threads on the user mailing list, I would say no,
but maybe I am missing something? If not, is it possible to transfer a vector
defined on a given communicator to a communicator that is a subset of the
previous one?
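
For the last item, the only workaround I can think of so far is to leave PETSc, move the raw local arrays by hand with mpi4py, and rebuild the Vec on the target sub-communicator. A rough sketch of what I mean (the two-equal-halves split and the matching local layouts are assumptions of the sketch, not something I expect to hold in general):

    from mpi4py import MPI
    from petsc4py import PETSc
    import numpy as np

    world = MPI.COMM_WORLD
    rank, size = world.Get_rank(), world.Get_size()
    assert size % 2 == 0  # sketch assumes two equal halves

    # split MPI_COMM_WORLD into two non-overlapping groups
    color = 0 if rank < size // 2 else 1
    subcomm = world.Split(color, key=rank)

    n = 100  # global vector size (placeholder)

    if color == 0:
        # Vec distributed over the first sub-communicator
        v = PETSc.Vec().createMPI(n, comm=subcomm)
        v.set(float(rank))
        # ship the locally owned block to the matching rank of the other group
        world.Send(v.getArray(readonly=True), dest=rank + size // 2, tag=0)
    else:
        # rebuild a Vec on the second sub-communicator; with equal group
        # sizes PETSC_DECIDE should give the same local sizes on both sides
        w = PETSc.Vec().createMPI(n, comm=subcomm)
        buf = np.empty(w.getLocalSize(), dtype=PETSc.ScalarType)
        world.Recv(buf, source=rank - size // 2, tag=0)
        w.getArray()[:] = buf

This feels fragile, so any pointer to a more PETSc-native mechanism would be very welcome.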

Best regards,
Jean-Christophe

