Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-07 Thread Victoria Rolandi
Zhang, Hong via petsc-users <petsc-users@mcs.anl.gov> wrote:
Victoria,
"** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
Ordering based on METIS" …
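
For context, the ICNTL(6) line is only a notice: maximum-transversal preprocessing is skipped whenever the matrix is distributed. The ordering itself is chosen by ICNTL(7) for sequential analysis (0 = AMD, 2 = AMF, 5 = METIS, 7 = automatic), while ICNTL(28) = 2 switches to parallel analysis and ICNTL(29) then selects PT-SCOTCH (1) or ParMETIS (2). A minimal command-line sketch of the two routes through PETSc's MUMPS interface (./app is a stand-in executable, not the poster's code):

  $ mpirun -n 2 ./app -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_7 5
  $ mpirun -n 2 ./app -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2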

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-07 Thread Pierre Jolivet
… On Nov 1, 2023, at 12:17 PM, Pierre Jolivet <mailto:pie...@joliv.et> wrote: …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-07 Thread Victoria Rolandi
$ ../../../../arch-darwin-c-debug-real/bin/mpirun -n 2 ./ex2 -pc_type lu -mat_mumps_icntl_4 2
Entering DMUMPS 5.6.2 from C interface with JOB, N = 1 56
executing #MPI = 2, without OMP
…
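
In this run -mat_mumps_icntl_4 2 only raises MUMPS's print level so the analysis diagnostics are emitted. Reproducing the reported METIS failure would presumably also need the ordering option, along the lines of (a sketch, assuming the failing case is ICNTL(7) = 5, METIS in MUMPS's numbering):

  $ mpirun -n 2 ./ex2 -pc_type lu -pc_factor_mat_solver_type mumps -mat_mumps_icntl_4 2 -mat_mumps_icntl_7 5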

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-03 Thread Pierre Jolivet
(I’m not saying switching to ParMETIS will not make the issue go away)
Thanks,
Pierre
$ ../../../../arch-darwin-c-debug-real/bin/mpirun -n 2 ./ex2 -pc_type lu …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-03 Thread Victoria Rolandi
… parallelism: Working host
** ANALYSIS STEP **
** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
Processing a graph of size: 56 with 194 edges
Ordering based on AMF
WARNING: Largest root node of size 26 not selected for parallel execution …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-02 Thread Victoria Rolandi
… 194 edges
Ordering based on AMF
WARNING: Largest root node of size 26 not selected for parallel execution
Leaving analysis phase with ...
INFOG(1) = 0
INFOG(2) = 0
[…]
Try parmetis …
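
MUMPS can only hand the analysis graph to ParMETIS if it was built against it, which for a PETSc-managed build means requesting it at configure time. A typical configure sketch (flags depend on the installation; this is not the poster's actual configure line):

  $ ./configure --download-mumps --download-scalapack --download-metis --download-parmetis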

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-02 Thread Pierre Jolivet
… MUMPS compiled with option -Dptscotch
MUMPS compiled with option -Dscotch
L U Solver for unsymmetric matrices
Type of parallelism: Working host
** ANALYSIS STEP …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Pierre Jolivet
… ** ANALYSIS STEP **
** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
Processing a graph of size: 56 with 194 edges
Ordering based on AMF …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Barry Smith
… edges
Ordering based on AMF
WARNING: Largest root node of size 26 not selected for parallel execution
Leaving analysis phase with ...
INFOG(1) = 0 …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Pierre Jolivet
… INFOG(1) = 0
INFOG(2) = 0
[…]
Try parmetis.
Hong
From: petsc-users on behalf of Victoria Rolandi …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Barry Smith
… INFOG(2) = 0
[…]
Try parmetis.
Hong
From: petsc-users on behalf of Victoria Rolandi
Sent: Tuesday, October 31, 2023 10:30 PM
To: petsc-users@mcs.anl.gov
Subject: [petsc-users] Error using Metis with PETSc installed with MUMPS …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Pierre Jolivet
… on behalf of Victoria Rolandi
Sent: Tuesday, October 31, 2023 10:30 PM
To: petsc-users@mcs.anl.gov
Subject: [petsc-users] Error using Metis with PETSc installed with MUMPS
Hi,
I'm solving a large sparse linear system in parallel and I am using PETSc with MUMPS …

Re: [petsc-users] Error using Metis with PETSc installed with MUMPS

2023-11-01 Thread Zhang, Hong via petsc-users
… Subject: [petsc-users] Error using Metis with PETSc installed with MUMPS
Hi, I'm solving a large sparse linear system in parallel and I am using PETSc with MUMPS. I am trying to test different options, like the ordering of the matrix. Everything works if I use the -mat_mumps_icntl_7 2 or -mat_mumps_icntl_7 0 options …

[petsc-users] Error using Metis with PETSc installed with MUMPS

2023-10-31 Thread Victoria Rolandi
Hi, I'm solving a large sparse linear system in parallel and I am using PETSc with MUMPS. I am trying to test different options, like the ordering of the matrix. Everything works if I use the *-mat_mumps_icntl_7 2* or *-mat_mumps_icntl_7 0* options (with the first one, AMF, performing better than …
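
The same ICNTL settings can also be fixed in code rather than on the command line. A minimal sketch using PETSc's documented MUMPS interface (the helper name, and the assumption that a KSP already carries its operators, are hypothetical):

  #include <petscksp.h>

  /* Select MUMPS and its ordering programmatically; ksp must already
     have its operators set before this helper is called. */
  static PetscErrorCode UseMumpsMetisOrdering(KSP ksp)
  {
    PC  pc;
    Mat F;

    PetscFunctionBeginUser;
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCLU));
    PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
    PetscCall(PCFactorSetUpMatSolverType(pc)); /* create the MUMPS factor matrix */
    PetscCall(PCFactorGetMatrix(pc, &F));
    PetscCall(MatMumpsSetIcntl(F, 7, 5)); /* ICNTL(7) = 5: METIS sequential ordering */
    /* or request parallel analysis with ParMETIS instead:
       MatMumpsSetIcntl(F, 28, 2); MatMumpsSetIcntl(F, 29, 2); */
    PetscFunctionReturn(PETSC_SUCCESS);
  }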