On Tue, Jul 21, 2020 at 12:11 PM Pierpaolo Minelli <pierpaolo.mine...@cnr.it> wrote:
> On 21 Jul 2020, at 16:58, Mark Adams <mfad...@lbl.gov> wrote:
>
>> This also looks like it could be some sort of library mismatch. You might
>> try deleting your architecture directory and starting over. This is PETSc's
>> "make realclean".
>
> I hope this is not the case, because I am working on the CINECA HPC Facility
> (Italy), and in this center I need to load a module for each piece of software
> I use. I asked CINECA support to compile a version of PETSc with 64-bit
> integers and all the external packages, and after they had done it I loaded
> this new module directly, so the older version (3.8.x with 32-bit integers)
> is not involved at all.
> At least I hope…

You can run a PETSc test. Download PETSc and go into the PETSc directory. On an
execution node, or in a batch script, run 'make check' with PETSC_ARCH and
PETSC_DIR set as usual (i.e., not $PWD). This should run a few tests, including
a hypre test. If it works, then there may be a problem with how you are linking
your program. If it fails, then ask the Facility to help you.
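For example, something along these lines in a batch script should do it (the
module name and paths below are only placeholders; use whatever CINECA actually
provides, and point PETSC_DIR at the installed prefix, not at $PWD):

    #!/bin/bash
    # load the same PETSc module the application is built against (hypothetical name)
    module load petsc/3.13.3_int64
    # for a --prefix install, PETSC_DIR is the install prefix and PETSC_ARCH is empty
    export PETSC_DIR=/cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary
    export PETSC_ARCH=""
    # run the checks from the downloaded petsc-3.13.3 source tree
    cd $HOME/petsc-3.13.3
    make check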
> Thanks
>
> Pierpaolo
>
> On Tue, Jul 21, 2020 at 10:45 AM Stefano Zampini <stefano.zamp...@gmail.com> wrote:
>
>> On Jul 21, 2020, at 1:32 PM, Pierpaolo Minelli <pierpaolo.mine...@cnr.it> wrote:
>>
>> Hi,
>>
>> I have asked for an updated PETSc version built with 64-bit indices.
>> Now I have version 3.13.3, and these are the configure options used:
>>
>> #!/bin/python
>> if __name__ == '__main__':
>>     import sys
>>     import os
>>     sys.path.insert(0, os.path.abspath('config'))
>>     import configure
>>     configure_options = [
>>         '--CC=mpiicc',
>>         '--CXX=mpiicpc',
>>         '--download-hypre',
>>         '--download-metis',
>>         '--download-mumps=yes',
>>         '--download-parmetis',
>>         '--download-scalapack',
>>         '--download-superlu_dist',
>>         '--known-64-bit-blas-indices',
>>         '--prefix=/cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary',
>>         '--with-64-bit-indices=1',
>>         '--with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl',
>>         '--with-cmake-dir=/cineca/prod/opt/tools/cmake/3.12.0/none',
>>         '--with-debugging=0',
>>         '--with-fortran-interfaces=1',
>>         '--with-fortran=1',
>>         'FC=mpiifort',
>>         'PETSC_ARCH=arch-linux2-c-opt',
>>     ]
>>     configure.petsc_configure(configure_options)
>>
>> Now I receive an error on hypre:
>>
>> forrtl: error (78): process killed (SIGTERM)
>> Image              PC                Routine            Line        Source
>> libHYPRE-2.18.2.s  00002B33CF465D3F  for__signal_handl  Unknown     Unknown
>> libpthread-2.17.s  00002B33D5BFD370  Unknown            Unknown     Unknown
>> libpthread-2.17.s  00002B33D5BF96D3  pthread_cond_wait  Unknown     Unknown
>> libiomp5.so        00002B33DBA14E07  Unknown            Unknown     Unknown
>> libiomp5.so        00002B33DB98810C  Unknown            Unknown     Unknown
>> libiomp5.so        00002B33DB990578  Unknown            Unknown     Unknown
>> libiomp5.so        00002B33DB9D9659  Unknown            Unknown     Unknown
>> libiomp5.so        00002B33DB9D8C39  Unknown            Unknown     Unknown
>> libiomp5.so        00002B33DB993BCE  __kmpc_fork_call   Unknown     Unknown
>> PIC_3D             00000000004071C0  Unknown            Unknown     Unknown
>> PIC_3D             0000000000490299  Unknown            Unknown     Unknown
>> PIC_3D             0000000000492C17  Unknown            Unknown     Unknown
>> PIC_3D             000000000040562E  Unknown            Unknown     Unknown
>> libc-2.17.so       00002B33DC5BEB35  __libc_start_main  Unknown     Unknown
>> PIC_3D             0000000000405539  Unknown            Unknown     Unknown
>>
>> Is it possible that I also need to ask them to compile hypre with an option
>> for 64-bit indices?
>>
>> These configure options compile hypre with 64-bit indices support.
>> It should work just fine. Can you run a very small case of your code to
>> confirm?
>>
>> Is it possible to instruct this inside the PETSc configure?
>> Alternatively, is it possible to use a different multigrid PC inside PETSc
>> that accepts 64-bit indices?
>>
>> Thanks in advance
>>
>> Pierpaolo
>>
>> On 27 May 2020, at 11:26, Stefano Zampini <stefano.zamp...@gmail.com> wrote:
>>
>> You need a version of PETSc compiled with 64-bit indices, since the message
>> indicates that the number of dofs in this case is larger than INT_MAX
>> (2^31 - 1 = 2147483647): 2501 x 3401 x 1601 = 13617947501.
>>
>> I also suggest you upgrade to a newer version; 3.8.3 is quite old, as the
>> error message reports.
>>
>> On Wed, 27 May 2020 at 11:50, Pierpaolo Minelli <pierpaolo.mine...@cnr.it> wrote:
>>
>>> Hi,
>>>
>>> I am trying to solve a Poisson equation on this grid:
>>>
>>> Nx = 2501
>>> Ny = 3401
>>> Nz = 1601
>>>
>>> I received this error:
>>>
>>> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
>>> [0]PETSC ERROR: Overflow in integer operation: http://www.mcs.anl.gov/petsc/documentation/faq.html#64-bit-indices
>>> [0]PETSC ERROR: Mesh of 2501 by 3401 by 1 (dof) is too large for 32 bit indices
>>> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
>>> [0]PETSC ERROR: Petsc Release Version 3.8.3, Dec, 09, 2017
>>> [0]PETSC ERROR: /marconi_scratch/userexternal/pminelli/PIC3D/2500_3400_1600/./PIC_3D on a arch-linux2-c-opt named r129c09s02 by pminelli Tue May 26 20:16:34 2020
>>> [0]PETSC ERROR: Configure options --prefix=/cineca/prod/opt/libraries/petsc/3.8.3/intelmpi--2018--binary CC=mpiicc FC=mpiifort CXX=mpiicpc F77=mpiifort F90=mpiifort --with-debugging=0 --with-blaslapack-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/mkl --with-fortran=1 --with-fortran-interfaces=1 --with-cmake-dir=/cineca/prod/opt/tools/cmake/3.5.2/none --with-mpi-dir=/cineca/prod/opt/compilers/intel/pe-xe-2018/binary/impi/2018.4.274 --download-scalapack --download-mumps=yes --download-hypre --download-superlu_dist --download-parmetis --download-metis
>>> [0]PETSC ERROR: #1 DMSetUp_DA_3D() line 218 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/da3.c
>>> [0]PETSC ERROR: #2 DMSetUp_DA() line 25 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/impls/da/dareg.c
>>> [0]PETSC ERROR: #3 DMSetUp() line 720 in /marconi/prod/build/libraries/petsc/3.8.3/intelmpi--2018--binary/BA_WORK/petsc-3.8.3/src/dm/interface/dm.c
>>> forrtl: error (76): Abort trap signal
>>>
>>> I am on an HPC facility, and after I loaded the PETSc module I saw that it
>>> is configured with INTEGER size = 32.
>>>
>>> I solve my problem with these options, and it works perfectly with smaller grids:
>>>
>>> -dm_mat_type hypre -pc_type hypre -pc_hypre_type boomeramg
>>> -pc_hypre_boomeramg_relax_type_all SOR/Jacobi
>>> -pc_hypre_boomeramg_coarsen_type PMIS -pc_hypre_boomeramg_interp_type FF1
>>> -ksp_type richardson
>>>
>>> Is it possible to overcome this if I ask them to install a version with
>>> INTEGER SIZE = 64?
>>> Alternatively, is it possible to overcome this using Intel compiler options?
>>>
>>> Thanks in advance
>>>
>>> Pierpaolo Minelli
>>
>> --
>> Stefano
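A quick way to confirm that a given PETSc installation was really built with
64-bit indices is to look for PETSC_USE_64BIT_INDICES in its petscconf.h; a
minimal sketch, assuming the install prefix quoted earlier in the thread
(adjust the path to the module actually loaded):

    # prints the PetscInt size configuration of the installed PETSc
    grep PETSC_USE_64BIT_INDICES \
        /cineca/prod/opt/libraries/petsc/3.13.3_int64/intelmpi--2018--binary/include/petscconf.h
    # a 64-bit-index build shows:
    #   #define PETSC_USE_64BIT_INDICES 1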