Zhang, Hong via petsc-users <petsc-users@mcs.anl.gov> wrote:

Victoria,
"** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
Ordering based on METIS"
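[A minimal sketch, not part of the original exchange: the "Maximum transversal" line also appears in Pierre's successful run below, so on its own it is informational (with a distributed matrix MUMPS simply skips the ICNTL(6) preprocessing). Whether the analysis/factorization actually failed can be checked by querying INFOG(1)/INFOG(2) through PETSc's MUMPS interface; the helper name ReportMumpsStatus below is hypothetical.]

  #include <petscksp.h>

  /* Query MUMPS' INFOG(1)/INFOG(2) after the factorization; pc is an LU
     preconditioner configured with -pc_factor_mat_solver_type mumps. */
  static PetscErrorCode ReportMumpsStatus(PC pc)
  {
    Mat      F;
    PetscInt infog1, infog2;

    PetscFunctionBeginUser;
    PetscCall(PCFactorGetMatrix(pc, &F));       /* factor matrix held by MUMPS */
    PetscCall(MatMumpsGetInfog(F, 1, &infog1)); /* 0 on success, < 0 on error */
    PetscCall(MatMumpsGetInfog(F, 2, &infog2)); /* extra error/warning detail */
    PetscCall(PetscPrintf(PETSC_COMM_WORLD, "MUMPS INFOG(1)=%" PetscInt_FMT ", INFOG(2)=%" PetscInt_FMT "\n", infog1, infog2));
    PetscFunctionReturn(PETSC_SUCCESS);
  }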
On Nov 1, 2023, at 12:17 PM, Pierre Jolivet <pie...@joliv.et> wrote:
(I'm not saying switching to ParMETIS will not make the issue go away)

Thanks,
Pierre

$ ../../../../arch-darwin-c-debug-real/bin/mpirun -n 2 ./ex2 -pc_type lu -mat_mumps_icntl_4 2
Entering DMUMPS 5.6.2 from C interface with JOB, N = 1 56
      executing #MPI = 2, without OMP
=================================================
MUMPS compiled with option -Dptscotch
MUMPS compiled with option -Dscotch
=================================================
L U Solver for unsymmetric matrices
Type of parallelism: Working host

****** ANALYSIS STEP ******

** Maximum transversal (ICNTL(6)) not allowed because matrix is distributed
Processing a graph of size: 56 with 194 edges
Ordering based on AMF
WARNING: Largest root node of size 26 not selected for parallel execution

Leaving analysis phase with ...
INFOG(1) = 0
INFOG(2) = 0
[…]
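[A minimal sketch, not part of the original exchange, of how switching to ParMETIS could be requested from the PETSc side, assuming the MUMPS build includes ParMETIS support (Pierre's banner above only shows -Dptscotch/-Dscotch). The helper name UseParMetisOrdering is hypothetical; the equivalent runtime options should be -mat_mumps_icntl_28 2 -mat_mumps_icntl_29 2.]

  #include <petscksp.h>

  /* Ask MUMPS for parallel analysis with ParMETIS through PETSc's MUMPS
     interface; pc is an LU preconditioner using MATSOLVERMUMPS. */
  static PetscErrorCode UseParMetisOrdering(PC pc)
  {
    Mat F;

    PetscFunctionBeginUser;
    PetscCall(PCFactorSetUpMatSolverType(pc)); /* make sure the MUMPS factor matrix exists */
    PetscCall(PCFactorGetMatrix(pc, &F));
    PetscCall(MatMumpsSetIcntl(F, 28, 2));     /* ICNTL(28)=2: parallel analysis */
    PetscCall(MatMumpsSetIcntl(F, 29, 2));     /* ICNTL(29)=2: ParMETIS for the parallel ordering */
    PetscFunctionReturn(PETSC_SUCCESS);
  }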
Try parmetis.
Hong

From: petsc-users on behalf of Victoria Rolandi
Sent: Tuesday, October 31, 2023 10:30 PM
To: petsc-users@mcs.anl.gov
Subject: [petsc-users] Error using Metis with PETSc installed with MUMPS
Hi,

I'm solving a large sparse linear system in parallel and I am using PETSc with MUMPS. I am trying to test different options, like the ordering of the matrix. Everything works if I use the -mat_mumps_icntl_7 2 or -mat_mumps_icntl_7 0 options (with the first one, AMF, performing better than […]
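[A minimal, self-contained sketch, not part of the original exchange, of the kind of setup described above: MUMPS as the LU solver through PETSc, with the ordering either set programmatically via ICNTL(7) or overridden on the command line with -mat_mumps_icntl_7. It follows the pattern of PETSc's KSP tutorials that use MUMPS; the 1-D Laplacian and all names are placeholders for the real problem, and a PETSc build configured with MUMPS is assumed.]

  #include <petscksp.h>

  int main(int argc, char **argv)
  {
    Mat      A, F;
    Vec      x, b;
    KSP      ksp;
    PC       pc;
    PetscInt i, Istart, Iend, n = 56;

    PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));

    /* Small 1-D Laplacian as a stand-in for the real distributed matrix */
    PetscCall(MatCreate(PETSC_COMM_WORLD, &A));
    PetscCall(MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n));
    PetscCall(MatSetFromOptions(A));
    PetscCall(MatSetUp(A));
    PetscCall(MatGetOwnershipRange(A, &Istart, &Iend));
    for (i = Istart; i < Iend; i++) {
      if (i > 0) PetscCall(MatSetValue(A, i, i - 1, -1.0, INSERT_VALUES));
      if (i < n - 1) PetscCall(MatSetValue(A, i, i + 1, -1.0, INSERT_VALUES));
      PetscCall(MatSetValue(A, i, i, 2.0, INSERT_VALUES));
    }
    PetscCall(MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY));

    /* Direct solve with MUMPS */
    PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
    PetscCall(KSPSetOperators(ksp, A, A));
    PetscCall(KSPSetType(ksp, KSPPREONLY));
    PetscCall(KSPGetPC(ksp, &pc));
    PetscCall(PCSetType(pc, PCLU));
    PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
    PetscCall(PCFactorSetUpMatSolverType(pc)); /* create the MUMPS factor matrix */
    PetscCall(PCFactorGetMatrix(pc, &F));
    PetscCall(MatMumpsSetIcntl(F, 7, 2));      /* ICNTL(7)=2: AMF, same as -mat_mumps_icntl_7 2 */

    PetscCall(MatCreateVecs(A, &x, &b));
    PetscCall(VecSet(b, 1.0));
    PetscCall(KSPSetFromOptions(ksp));         /* command-line -mat_mumps_icntl_* options still apply */
    PetscCall(KSPSolve(ksp, b, x));

    PetscCall(VecDestroy(&x));
    PetscCall(VecDestroy(&b));
    PetscCall(MatDestroy(&A));
    PetscCall(KSPDestroy(&ksp));
    PetscCall(PetscFinalize());
    return 0;
  }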