Hi,

I am new to PETSc and need its nonlinear solvers for my code. I am currently using ParMETIS (outside PETSc) to partition an unstructured mesh element-wise, but the data I work with lives on the vertices of the mesh. Consequently, vertices shared between elements on different MPI processes/ranks are repeated on those ranks. At the solver stage I need to solve for the data on the vertices (the solution vector is defined on the vertices), so I need to create a distributed vector over the vertices of the mesh. However, the distribution of vertices across MPI ranks is not contiguous, since the partitioning is (and has to be) done element-wise. I am trying to figure out:

1. Do I need only a local-to-global IS, or do I need to combine it with an AO?
2. Even at the VecCreateMPI stage, is it possible to tell PETSc that, although rank_i owns n_i components of the vector, those components are not numbered contiguously?
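To make the question concrete, below is a minimal sketch of what I currently have in mind, hard-coded for two ranks with the per-rank vertex lists from the example further down. Please note the assumptions I am making: I am on a recent PETSc where PetscCall()/PetscCallMPI() exist, rank 0 corresponds to v_rank_1 and rank 1 to v_rank_2, and the split of ownership of the shared vertices (2, 3, 8, 9) is my own guess. Whether my application vertex numbers can go directly into the local-to-global mapping, or must first be translated with an AO, is exactly what I am unsure about.

/* Sketch of the vertex-based Vec setup I have in mind; run with: mpiexec -n 2 ./ex
   Vertex lists are the example from this email; the ownership split is an assumption. */
#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec                    v;
  ISLocalToGlobalMapping ltog;
  PetscMPIInt            rank;

  /* All local vertices on each rank (owned + repeated/shared), in my application numbering */
  const PetscInt verts_rank0[] = {2, 3, 4, 8, 9, 7};
  const PetscInt verts_rank1[] = {0, 1, 2, 3, 6, 10, 11, 8, 9, 5};

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));
  PetscCallMPI(MPI_Comm_rank(PETSC_COMM_WORLD, &rank));

  const PetscInt *verts  = rank ? verts_rank1 : verts_rank0;
  PetscInt        nverts = rank ? 10 : 6;
  PetscInt        nowned = 6; /* assumed: rank 0 owns {2,3,4,7,8,9}, rank 1 owns {0,1,5,6,10,11} */

  /* Each rank owns nowned entries; PETSc numbers them contiguously per rank internally,
     which is exactly what my mesh numbering is not. */
  PetscCall(VecCreateMPI(PETSC_COMM_WORLD, nowned, PETSC_DETERMINE, &v));

  /* Question 1: can I put my application vertex numbers straight into this mapping,
     or do I first have to translate them to PETSc's ordering with an AO
     (AOCreateBasic + AOApplicationToPetsc)? */
  PetscCall(ISLocalToGlobalMappingCreate(PETSC_COMM_WORLD, 1, nverts, verts,
                                         PETSC_COPY_VALUES, &ltog));
  PetscCall(VecSetLocalToGlobalMapping(v, ltog));

  /* Assembly would then use local indices via VecSetValuesLocal(),
     followed by VecAssemblyBegin()/VecAssemblyEnd(). */

  PetscCall(ISLocalToGlobalMappingDestroy(&ltog));
  PetscCall(VecDestroy(&v));
  PetscCall(PetscFinalize());
  return 0;
}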
For instance:

global vertex vector v : [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
v_rank_1 : [2, 3, 4, 8, 9, 7]
v_rank_2 : [0, 1, 2, 3, 6, 10, 11, 8, 9, 5]

(Vertices 2, 3, 8, 9 appear on both ranks because they lie on the partition boundary.)

Any help is greatly appreciated. Thank you.

Prateek Gupta