Hi,
I'm solving a large sparse linear system in parallel and I am using PETSc
with MUMPS. I am trying to test different options, like the ordering of the
matrix. Everything works if I use the -mat_mumps_icntl_7 2 or
-mat_mumps_icntl_7 0 options (with the first one, AMF, performing better than
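For context, here is a minimal sketch of selecting MUMPS and its ordering (ICNTL(7)) from code rather than via the command line; the SolveWithMumps helper and its arguments are assumptions for illustration, not part of the original setup.

#include <petscksp.h>

/* Illustrative sketch: select MUMPS as the LU solver and set the
 * ordering through ICNTL(7), the programmatic equivalent of the
 * -mat_mumps_icntl_7 option.  A, b, and x are assumed to exist. */
static PetscErrorCode SolveWithMumps(Mat A, Vec b, Vec x)
{
  KSP ksp;
  PC  pc;
  Mat F;

  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, &ksp));
  PetscCall(KSPSetOperators(ksp, A, A));
  PetscCall(KSPSetType(ksp, KSPPREONLY));            /* direct solve only */
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCLU));
  PetscCall(PCFactorSetMatSolverType(pc, MATSOLVERMUMPS));
  PetscCall(PCFactorSetUpMatSolverType(pc));         /* create the factor matrix */
  PetscCall(PCFactorGetMatrix(pc, &F));
  PetscCall(MatMumpsSetIcntl(F, 7, 2));              /* ICNTL(7) = 2: AMF ordering */
  PetscCall(KSPSetFromOptions(ksp));
  PetscCall(KSPSolve(ksp, b, x));
  PetscCall(KSPDestroy(&ksp));
  PetscFunctionReturn(PETSC_SUCCESS);
}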
Reading the error message, I see that I did not clone A to get P, so P
was the wrong type when running on a device.
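A minimal sketch of what cloning A's type when creating P could look like; the CreatePLikeA helper and its arguments are hypothetical, not the actual change in GAMG:

#include <petscmat.h>

/* Illustrative sketch: create P with the same matrix type as A (e.g.
 * MATAIJCUSPARSE on a device) instead of a hard-coded MATAIJ.  The
 * CreatePLikeA name and its arguments are assumptions. */
static PetscErrorCode CreatePLikeA(Mat A, PetscInt n_coarse, Mat *P)
{
  MatType  type;
  PetscInt n;

  PetscFunctionBeginUser;
  PetscCall(MatGetLocalSize(A, NULL, &n));   /* P's rows must match A's columns */
  PetscCall(MatGetType(A, &type));
  PetscCall(MatCreate(PetscObjectComm((PetscObject)A), P));
  PetscCall(MatSetSizes(*P, n, PETSC_DECIDE, PETSC_DETERMINE, n_coarse));
  PetscCall(MatSetType(*P, type));           /* clone A's type */
  PetscCall(MatSetUp(*P));
  PetscFunctionReturn(PETSC_SUCCESS);
}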
Thanks,
Mark
Correction, I get the same message with -mat_type aijcusparse.
Thanks,
Mark
I am getting this error.
This is in GAMG/HEM setup. PtAP for the coarse grid construction works, but
I call this in a graph routine
(/global/u2/m/madams/petsc/src/mat/coarsen/impls/hem/hem.c:1043).
Also, this PtAP does not need to be on the GPU anyway because P is
extremely sparse ... can I pin, s
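A minimal sketch of one way to keep such a product on the host: convert copies of A and P to plain MATAIJ before calling MatPtAP. The PtAPOnHost helper is hypothetical and not the code in hem.c.

#include <petscmat.h>

/* Illustrative sketch: force a PtAP onto the host by converting copies
 * of A and P to plain MATAIJ first.  PtAPOnHost is hypothetical. */
static PetscErrorCode PtAPOnHost(Mat A, Mat P, Mat *C)
{
  Mat Ah, Ph;

  PetscFunctionBeginUser;
  PetscCall(MatConvert(A, MATAIJ, MAT_INITIAL_MATRIX, &Ah)); /* host copy of A */
  PetscCall(MatConvert(P, MATAIJ, MAT_INITIAL_MATRIX, &Ph)); /* host copy of P */
  PetscCall(MatPtAP(Ah, Ph, MAT_INITIAL_MATRIX, PETSC_DEFAULT, C));
  PetscCall(MatDestroy(&Ah));
  PetscCall(MatDestroy(&Ph));
  PetscFunctionReturn(PETSC_SUCCESS);
}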
On Mon, Sep 25, 2023 at 8:58 AM Azeddine Messikh via petsc-users <
petsc-users@mcs.anl.gov> wrote:
> Dear developers
>
> I tried to run ex24.c
> https://petsc.org/release/src/snes/tutorials/ex24.c.html using the
> following command line
>
> ./ex24 -sol_type quadratic -dm_plex_simplex 0 -field_pet
Dear Matt and Jed,
Thank you so much for your insights.
Jed, as far as I know, the format is a custom internal structure. I will
double-check this. If it is used elsewhere, I'm more than willing to contribute
the reader.
Best,
Onur