Hi. The HDF5 solution looks good to me, but I now get this error

$ ../src/saveDemo
Creating mesh with (10,10) faces
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Need to call DMCreateDS() before calling DMGetDS()
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Release Version 3.21.5, unknown
[0]PETSC ERROR: ../src/saveDemo on a  named dentdherens by matteo Fri Oct 25 00:00:44 2024
[0]PETSC ERROR: Configure options --COPTFLAGS="-O3 -march=native -mtune=native -mavx2" --CXXOPTFLAGS="-O3 -march=native -mtune=native -mavx2" --FOPTFLAGS="-O3 -march=native -mtune=native -mavx2" --PETSC_ARCH=opt --with-strict-petscerrorcode --download-hdf5 --prefix=/home/matteo/software/petsc/3.21-opt/ --with-debugging=0 --with-gmsh --with-metis --with-parmetis --with-triangle PETSC_DIR=/home/matteo/software/petsc --force
[0]PETSC ERROR: #1 DMGetDS() at /home/matteo/software/petsc/src/dm/interface/dm.c:5525
[0]PETSC ERROR: #2 DMPlexInsertBoundaryValues_Plex() at /home/matteo/software/petsc/src/dm/impls/plex/plexfem.c:1136
[0]PETSC ERROR: #3 DMPlexInsertBoundaryValues() at /home/matteo/software/petsc/src/dm/impls/plex/plexfem.c:1274
[0]PETSC ERROR: #4 VecView_Plex_HDF5_Internal() at /home/matteo/software/petsc/src/dm/impls/plex/plexhdf5.c:477
[0]PETSC ERROR: #5 VecView_Plex() at /home/matteo/software/petsc/src/dm/impls/plex/plex.c:656
[0]PETSC ERROR: #6 VecView() at /home/matteo/software/petsc/src/vec/vec/interface/vector.c:806
[0]PETSC ERROR: #7 main() at saveDemo.cpp:123
[0]PETSC ERROR: No PETSc Option Table entries
[0]PETSC ERROR: ----------------End of Error Message -------send entire error message to petsc-ma...@mcs.anl.gov----------
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_SELF
with errorcode 73.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------

I attach the modified sample code that produced the above error.
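
From the message it sounds like the viewer wants a PetscDS attached to the DM; my guess (not verified on my side) is that a call along these lines, placed after DMSetLocalSection(), is what it is asking for:

  /* hypothetical, just my reading of the error message: build the PetscDS
     from the fields/section already attached to the DM before VecView() */
  PetscCall(DMCreateDS(dmMesh));

Is that the right thing to do here?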

Thanks

    Matteo

On 24/10/24 22:20, Matthew Knepley wrote:
I just looked at the code. The VTK code is very old, and does not check for cell overlap.

We have been recommending that people use either HDF5 or CGNS, both of which I believe work in this case. I can fix VTK if that is what you want, but it might take me a little while, as things are very busy at
work right now. However, if you output HDF5, then you can run

  ./lib/petsc/bin/petsc_gen_xdmf.py mesh.h5

and it will generate an XDMF file so you can load it into ParaView. Or you can output CGNS which I think
ParaView understands.
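
In code, the HDF5 output would look roughly like this (a sketch; the viewer variable, vector, and file name are placeholders):

  PetscViewer viewer;
  PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "mesh.h5", FILE_MODE_WRITE, &viewer));
  PetscCall(DMView(dm, viewer));   /* write the mesh, which I believe petsc_gen_xdmf.py needs */
  PetscCall(VecView(sol, viewer)); /* write the field attached to dm */
  PetscCall(PetscViewerDestroy(&viewer));

and then petsc_gen_xdmf.py mesh.h5 produces the XDMF file that ParaView opens.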

  Thanks,

     Matt

On Thu, Oct 24, 2024 at 4:02 PM Semplice Matteo <matteo.sempl...@uninsubria.it> wrote:

    Hi,
          I tried again today and have (re?)discovered this example:
    https://petsc.org/release/src/dm/impls/plex/tutorials/ex14.c.html
    but I cannot understand whether in my case I should call PetscSFCreateSectionSF
    (https://petsc.org/release/manualpages/PetscSF/PetscSFCreateSectionSF/) and,
    if so, how I should then activate the returned SF.
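
    What I have in mind, copying the pattern from ex14.c, is roughly the following
    (pointSF and localSec stand for my point SF and local section; the final
    DMSetSectionSF() call is exactly the part I am unsure about):

        PetscSF   sectionSF;
        PetscInt *remoteOffsets;
        PetscCall(PetscSFCreateRemoteOffsets(pointSF, localSec, localSec, &remoteOffsets));
        PetscCall(PetscSFCreateSectionSF(pointSF, localSec, remoteOffsets, localSec, &sectionSF));
        PetscCall(PetscFree(remoteOffsets));
        PetscCall(DMSetSectionSF(dm, sectionSF)); /* is this how the SF gets "activated"? */
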
    Matteo
    ------------------------------------------------------------------------
    *From:* Semplice Matteo
    *Sent:* Tuesday, 22 October 2024, 00:24
    *To:* Matthew Knepley <knep...@gmail.com>
    *Cc:* PETSc <petsc-users@mcs.anl.gov>
    *Subject:* Re: [petsc-users] local/global DMPlex Vec output

    Dear Matt,

        I guess you're right: thresholding by rank==0 and rank==1 in
    ParaView reveals that it is indeed the overlap cells that appear
    twice in the output.

    The attached file is not exactly minimal but hopefully short
    enough. If I run it in serial, all is ok, but with

        mpirun -np 2 ./saveDemo

    it creates a 10x10 grid, but I get an "output.vtu" with a total of
    120 cells. However, the pointSF of the DMPlex seems correct.

    Thanks

        Matteo

    On 21/10/24 19:15, Matthew Knepley wrote:
    On Mon, Oct 21, 2024 at 12:22 PM Matteo Semplice via petsc-users
    <petsc-users@mcs.anl.gov> wrote:

        Dear petsc-users,

            I am having issues with output of parallel data attached
        to a DMPlex (or maybe more fundamental ones about DMPlex...).

        So I currently

         1. create a DMPlex (DMPlexCreateGmshFromFile or
            DMPlexCreateBoxMesh)
         2. partition it
         3. and create a section for my data layout with
            DMPlexCreateSection(ctx.dmMesh, NULL, numComp, numDof,
            numBC, NULL, NULL, NULL, NULL, &sUavg)
         4. DMSetLocalSection(ctx.dmMesh, sUavg)
         5. create solLoc and solGlob vectors with
            DMCreateGlobalVector and DMCreateLocalVector
         6. solve ....
         7. VecView(ctx.solGlob, vtkViewer) on a .vtu file

        but when I load the data in ParaView I get more cells than
        expected: it is as if the cells in the halo appear twice
        in the output. (I could create an MWE if the above is not clear.)
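
        In code, steps 3–5 above are roughly the following (sizes as in
        point 3; my understanding is that the global section is then derived
        from the local section plus the point SF, which is why I expect
        step 5 to work):

            PetscSection s;
            PetscCall(DMPlexCreateSection(ctx.dmMesh, NULL, numComp, numDof, numBC, NULL, NULL, NULL, NULL, &s));
            PetscCall(DMSetLocalSection(ctx.dmMesh, s));
            PetscCall(PetscSectionDestroy(&s));
            PetscCall(DMCreateGlobalVector(ctx.dmMesh, &ctx.solGlob));
            PetscCall(DMCreateLocalVector(ctx.dmMesh, &ctx.solLoc));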

    I think we need an MWE here, because from the explanation above,
    it should work.

    However, I can try to guess the problem. When you partition the
    mesh, I am guessing that you have cells in the overlap. These cells
    must be in the point SF in order for the global section to give
    them a unique owner. Perhaps something has gone wrong here.
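
    One quick way to check (a sketch, where dm is your DMPlex) is to view the
    point SF in detail and verify that every overlap cell on the non-owning
    rank appears as a leaf pointing at its owner:

        PetscSF sf;
        PetscCall(DMGetPointSF(dm, &sf));
        PetscCall(PetscViewerPushFormat(PETSC_VIEWER_STDOUT_WORLD, PETSC_VIEWER_ASCII_INFO_DETAIL));
        PetscCall(PetscSFView(sf, PETSC_VIEWER_STDOUT_WORLD));
        PetscCall(PetscViewerPopFormat(PETSC_VIEWER_STDOUT_WORLD));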

      Thanks,

         Matt

        I guess that the culprit is point (4), but if I replace it
        with DMSetGlobalSection then I cannot create the local vector
        at point (5).

        How should I handle this properly? In my code I need to
        create both local and global vectors, to perform at least
        GlobalToLocal and to save the global data.
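
        i.e., with the names from the list above, something like this should
        work (a sketch):

            PetscCall(DMGlobalToLocalBegin(ctx.dmMesh, ctx.solGlob, INSERT_VALUES, ctx.solLoc));
            PetscCall(DMGlobalToLocalEnd(ctx.dmMesh, ctx.solGlob, INSERT_VALUES, ctx.solLoc));
            PetscCall(VecView(ctx.solGlob, vtkViewer));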

        (On a side note, I also tried HDF5, but then it complains
        about the DM not having a DS...; really, any working solution
        that allows the data to be explored with ParaView is fine.)

        Thanks for any advice!

        Matteo Semplice





--
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.
-- Norbert Wiener

https://www.cse.buffalo.edu/~knepley/

--
---
Associate Professor in Numerical Analysis
Dipartimento di Scienza e Alta Tecnologia
Università degli Studi dell'Insubria
Via Valleggio, 11 - Como
#include <cmath>                 // std::ceil
#include <petscdmplex.h>
#include <petscsf.h>
#include <petscviewerhdf5.h>

int main(int argc, char **argv)
{
  PetscFunctionBeginUser;

  PetscCall( PetscInitialize(&argc, &argv, NULL,"") );
  PetscMPIInt rank;
  PetscCallMPI( MPI_Comm_rank(PETSC_COMM_WORLD,&rank) );
 
  PetscReal lower[2]={0.,0.};   /* DMPlexCreateBoxMesh expects PetscReal bounds */
  PetscReal upper[2]={1.,1.};
  DM dmMesh;
  PetscReal dx=0.1;
  PetscInt faces[2];
  DMBoundaryType periodicity[2]={DM_BOUNDARY_NONE,DM_BOUNDARY_NONE};
  PetscBool simplex=PETSC_FALSE;

  PetscCall( PetscOptionsGetBool(NULL,NULL,"-simplex",&simplex,NULL) ); /* read the value of -simplex (default PETSC_FALSE) */
  const PetscBool interpolate=PETSC_TRUE;
  PetscCall( PetscOptionsGetReal(NULL,NULL,"-dx",&dx,NULL) );
  faces[0] = std::ceil( (upper[0] - lower[0]) / dx );
  faces[1] = std::ceil( (upper[1] - lower[1]) / dx );
  PetscCall( PetscPrintf( PETSC_COMM_WORLD, "Creating mesh with (%" PetscInt_FMT ",%" PetscInt_FMT ") faces\n", faces[0], faces[1] ) );
  PetscCall( DMPlexCreateBoxMesh(MPI_COMM_WORLD, 2, simplex, faces, lower, upper, periodicity, interpolate, &(dmMesh)) );
  PetscCall( PetscObjectSetName((PetscObject) dmMesh, "Mesh") );
  PetscCall( DMSetFromOptions(dmMesh) );

  //PARTITION THE MESH
  PetscPartitioner partitioner;
  PetscCall( DMPlexGetPartitioner(dmMesh, &partitioner) );
  //PetscCall(  PetscPartitionerSetType(partitioner, PETSCPARTITIONERPARMETIS) );
  PetscCall(  PetscPartitionerSetFromOptions(partitioner) );
  DM dmDist=NULL;
  PetscCall( DMSetBasicAdjacency(dmMesh, PETSC_TRUE, PETSC_TRUE) );
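  /* Distribute with an overlap of 1 layer of cells, so each rank also holds
     copies of the neighbouring ranks' boundary cells (the halo). */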
  PetscCall( DMPlexDistribute(dmMesh, 1, NULL, &dmDist) );
  if (dmDist) {PetscCall( DMDestroy(&(dmMesh)) ); dmMesh = dmDist; }

  PetscInt     cStart, cEnd;
  PetscCall( DMPlexGetHeightStratum(dmMesh, 0, &cStart, &cEnd) );

  PetscSF sfPoint;
  PetscCall( DMGetPointSF(dmMesh, &sfPoint) );
  //PetscViewerPushFormat(PETSC_VIEWER_STDOUT_WORLD,PETSC_VIEWER_ASCII_INFO_DETAIL);
  //PetscSFView(sfPoint,PETSC_VIEWER_STDOUT_WORLD);
  //PetscViewerPopFormat(PETSC_VIEWER_STDOUT_WORLD);

  PetscInt numLeaves;
  const PetscInt    *localPoints;
  PetscCall( PetscSFGetGraph(sfPoint, NULL, &numLeaves, &localPoints, NULL) );
  DMLabel inghostLabel;
  PetscCall( DMCreateLabel(dmMesh, "inghosts") );
  PetscCall( DMGetLabel(dmMesh, "inghosts", &inghostLabel) );
  /* Leaves of the point SF are points whose owner is another rank: mark them as inner ghosts */
  for (PetscInt i=0; i<numLeaves; i++){
    //PetscPrintf(PETSC_COMM_SELF,"[%d] %" PetscInt_FMT " is inner ghost\n",rank,localPoints[i]);
    PetscCall( DMLabelSetValue(inghostLabel, localPoints[i], 1) );
  }

  PetscInt numRoots;
  const PetscSFNode *remotePoints;
  PetscCall( PetscSFGetGraph(sfPoint, &numRoots, &numLeaves, &localPoints, &remotePoints) );
  /* Print the SF leaves that are cells (cells come first in the point numbering, so point < cEnd) */
  for (PetscInt i=0; i<numLeaves; i++)
    if (localPoints[i]<cEnd)
      PetscCall( PetscPrintf(PETSC_COMM_SELF,"[on rank %d] %" PetscInt_FMT " <- (%" PetscInt_FMT ",%" PetscInt_FMT ")\n",rank,localPoints[i],remotePoints[i].rank,remotePoints[i].index) );

  /* Setup the section for cell averages*/
  PetscSection sUavg;
  {
    const PetscInt numFields  = 3;
    const PetscInt numComp[numFields] = {1,2,1};
    /* numDof[f*(dim+1)+d] = dofs of field f on points of dimension d: all dofs live on cells (d=2), so 1+2+1 = 4 dofs per cell */
    const PetscInt numDof[numFields*3] = { 0,0,1 , 0,0,2, 0,0,1};
    const PetscInt numBC = 0;
    PetscCall(DMSetNumFields(dmMesh, numFields));
    PetscCall(DMPlexCreateSection(dmMesh, NULL, numComp, numDof, numBC, NULL, NULL, NULL, NULL, &sUavg));
    /* Name the Field variables */
    PetscCall(PetscSectionSetFieldName(sUavg, 0, "rho"));
    PetscCall(PetscSectionSetFieldName(sUavg, 1, "M"));
    PetscCall(PetscSectionSetFieldName(sUavg, 2, "E"));
    //PetscCall(PetscSectionView(section, PETSC_VIEWER_STDOUT_WORLD));
  }

  /* Disabled attempt: construct the communication pattern (section SF) for halo
     exchange between local vectors. The point SF says which copies of mesh points
     (cells, vertices, faces, edges) are copies of points on other processes; from
     it and the local section we relate ghost dofs on this process to their
     non-ghost copies elsewhere, and attach the result to the DM:

  PetscSF   sectionSF;
  PetscInt *remoteOffsets;
  PetscCall(PetscSFCreateRemoteOffsets(sfPoint, sUavg, sUavg, &remoteOffsets));
  PetscCall(PetscSFCreateSectionSF(sfPoint, sUavg, remoteOffsets, sUavg, &sectionSF));
  PetscCall(PetscFree(remoteOffsets));
  PetscCall(DMSetSectionSF(dmMesh, sectionSF));
  */

  PetscCall( DMSetLocalSection(dmMesh, sUavg) );
  //maybe we need this? Alas, it does not make a difference...
  PetscCall( PetscSectionDestroy(&sUavg) );
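
  /* NOTE: the HDF5 VecView below aborts with "Need to call DMCreateDS() before
     calling DMGetDS()"; presumably a PetscCall(DMCreateDS(dmMesh)) would belong
     around here, but it is left out so that this file reproduces the reported error. */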

  Vec solGlob, solLoc;  
  PetscCall( DMCreateGlobalVector(dmMesh, &solGlob) );
  PetscCall( PetscObjectSetName((PetscObject) solGlob , "Uavg") );
  PetscCall( DMCreateLocalVector(dmMesh, &solLoc) );

  //Fill in local values and then loc2glob
  PetscScalar *values, *cellVal;
  PetscCall( VecGetArray(solLoc, &values) );
  for (PetscInt c=cStart; c<cEnd; ++c){
    PetscCall( DMPlexPointLocalRef(dmMesh, c, values, &cellVal) );
    for (int m=0; m<4; ++m) cellVal[m] = rank + m/10.; /* 4 = 1+2+1 dofs per cell */
  }
  PetscCall( VecRestoreArray(solLoc, &values) );
  PetscCall( DMLocalToGlobal(dmMesh, solLoc, INSERT_VALUES, solGlob) );

  //output the global solution
  PetscViewer h5Viewer;
  /* PetscViewerHDF5Open() creates the viewer itself; no separate PetscViewerCreate() is needed */
  PetscCall(PetscViewerHDF5Open(PETSC_COMM_WORLD, "output.h5", FILE_MODE_WRITE, &h5Viewer));
  PetscCall(VecView(solGlob, h5Viewer));
  PetscCall(PetscViewerDestroy(&h5Viewer));

  PetscCall( VecDestroy( &solGlob) );
  PetscCall( VecDestroy( &solLoc) );
  PetscCall( DMDestroy( &dmMesh) );
  PetscCall( PetscFinalize() );
}
