Re: [petsc-users] MPIAIJ MatMult and non conforming object sizes

2024-09-05 Thread Karthikeyan Chockalingam - STFC UKRI via petsc-users
Thank you both for your response.

From: Barry Smith
Date: Thursday, 5 September 2024 at 23:19
To: Chockalingam, Karthikeyan (STFC,DL,HC)
Cc: petsc-users@mcs.anl.gov
Subject: Re: [petsc-users] MPIAIJ MatMult and non conforming object sizes

You can use MatCreateVecs() to create vectors sized properly …

Re: [petsc-users] MPIAIJ MatMult and non conforming object sizes

2024-09-05 Thread Barry Smith
You can use MatCreateVecs() to create vectors sized properly for both x and y in y = Ax when A is not square.

> On Sep 5, 2024, at 5:20 PM, Karthikeyan Chockalingam - STFC UKRI via petsc-users wrote:
>
> Hello,
>
> I am unsure why the program crashes even while running the code serially …
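A minimal sketch of that suggestion, with illustrative names (A is the rectangular matrix; x and y are not from the original code):

    Vec x, y;
    /* Let the (possibly rectangular) matrix decide the layouts: x follows the
       column layout of A, y the row layout, so y = A x has conforming sizes. */
    PetscCall(MatCreateVecs(A, &x, &y));
    PetscCall(MatMult(A, x, y));
    PetscCall(VecDestroy(&x));
    PetscCall(VecDestroy(&y));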

Re: [petsc-users] MPIAIJ MatMult and non conforming object sizes

2024-09-05 Thread Junchao Zhang
It is triggered by

    PetscCheck(mat->rmap->N == y->map->N, PetscObjectComm((PetscObject)mat), PETSC_ERR_ARG_SIZ,
               "Mat mat,Vec y: global dim %" PetscInt_FMT " %" PetscInt_FMT, mat->rmap->N, y->map->N);

The error says that in your y = Ax, A has 40 rows but y has 25. Generally, VecDuplicate(par_xcoord, &par…
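A hedged sketch of checking the dimensions that PetscCheck compares, using par_G from the original post and a hypothetical output vector par_y:

    PetscInt M, N, ny;
    PetscCall(MatGetSize(par_G, &M, &N));   /* global rows and columns of A */
    PetscCall(VecGetSize(par_y, &ny));      /* global length of y (hypothetical name) */
    PetscCall(PetscPrintf(PETSC_COMM_WORLD,
              "A is %" PetscInt_FMT " x %" PetscInt_FMT ", y has %" PetscInt_FMT " entries\n", M, N, ny));
    /* For y = A x the check requires ny == M. Duplicating x to create y only
       works when A is square; MatCreateVecs() handles the rectangular case. */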

[petsc-users] MPIAIJ MatMult and non conforming object sizes

2024-09-05 Thread Karthikeyan Chockalingam - STFC UKRI via petsc-users
Hello,

I am unsure why the program crashes even while running the code serially.

    petscErr = MatCreate(mesh.comm().get(), &par_G);
    petscErr = MatSetType(par_G, MATMPIAIJ);
    petscErr = MatSetSizes(par_G, local_num_rows, local_num_cols, total_num_rows, total_num_cols);
    PetscInt d_nz = 2;
    …
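A self-contained sketch along those lines, assuming a 40 x 25 global size as in the reported error and a uniform preallocation of 2 nonzeros per row in both the diagonal and off-diagonal blocks (the original assembly code is truncated in the archive):

    Mat G;
    Vec x, y;
    PetscCall(MatCreate(PETSC_COMM_WORLD, &G));
    PetscCall(MatSetType(G, MATMPIAIJ));
    PetscCall(MatSetSizes(G, PETSC_DECIDE, PETSC_DECIDE, 40, 25));   /* 40 rows, 25 columns */
    PetscCall(MatMPIAIJSetPreallocation(G, 2, NULL, 2, NULL));       /* d_nz = 2, o_nz = 2 (assumed) */
    /* ... MatSetValues() calls for the actual entries would go here ... */
    PetscCall(MatAssemblyBegin(G, MAT_FINAL_ASSEMBLY));
    PetscCall(MatAssemblyEnd(G, MAT_FINAL_ASSEMBLY));
    PetscCall(MatCreateVecs(G, &x, &y));   /* x has 25 entries, y has 40 */
    PetscCall(MatMult(G, x, y));
    PetscCall(VecDestroy(&x));
    PetscCall(VecDestroy(&y));
    PetscCall(MatDestroy(&G));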

Re: [petsc-users] KSPSolve + MUMPS memory growth issues

2024-09-05 Thread Matthew Knepley
On Thu, Sep 5, 2024 at 2:46 PM Corbijn van Willenswaard, Lars (UT) <l.j.corbijnvanwillenswa...@utwente.nl> wrote:

> Thank you, that makes testing so much easier. So far, I’ve been able to shrink the matrix (now only 64x64) and see that it still has growing memory usage over time. Unfortunately …

Re: [petsc-users] KSPSolve + MUMPS memory growth issues

2024-09-05 Thread Corbijn van Willenswaard, Lars (UT) via petsc-users
Thank you, that makes testing so much easier. So far, I’ve been able to shrink the matrix (now only 64x64) and see that it still has growing memory usage over time. Unfortunately, I’ve no access to a linux machine right now, so running through valgrind like Barry suggested has to wait.

Lars

From: …
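A PETSc-only fallback while Valgrind is out of reach, as a hedged sketch: with the -malloc_debug option, PETSc can list its own allocations that are still live at a given point. Note this only covers PETSc-side memory and cannot see allocations made inside MUMPS itself.

    /* Call near the end of the run, after the solves, with -malloc_debug enabled.
       Prints PETSc allocations that have not been freed, with file and line. */
    PetscCall(PetscMallocDump(PETSC_STDOUT));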

Re: [petsc-users] KSPSolve + MUMPS memory growth issues

2024-09-05 Thread Matthew Knepley
On Thu, Sep 5, 2024 at 1:40 PM Corbijn van Willenswaard, Lars (UT) via petsc-users wrote:

> Dear PETSc,
>
> For the last months I’ve struggled with a solver that I wrote for a FEM eigenvalue problem running out of memory. I’ve traced it to KSPSolve + MUMPS being the issue, but I'm getting stuck …

Re: [petsc-users] KSPSolve + MUMPS memory growth issues

2024-09-05 Thread Barry Smith
Use Valgrind. It will show the exact lines where memory is allocated that does not get freed later. I am guessing some memory allocation within MUMPS is not properly freed.

Barry

> On Sep 5, 2024, at 1:33 PM, Corbijn van Willenswaard, Lars (UT) via petsc-users wrote:
>
> Dear PETSc, …

[petsc-users] KSPSolve + MUMPS memory growth issues

2024-09-05 Thread Corbijn van Willenswaard, Lars (UT) via petsc-users
Dear PETSc,

For the last months I’ve struggled with a solver that I wrote for a FEM eigenvalue problem running out of memory. I’ve traced it to KSPSolve + MUMPS being the issue, but I'm getting stuck on digging deeper. The reason I suspect the KSPSolve/MUMPS is that when commenting out the KSP…
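One hedged way to isolate that kind of growth, assuming a KSP named ksp and vectors b and x (names illustrative, not from the original solver): repeat the solve on the same small system and log resident memory each iteration, so a steady climb points at KSPSolve/MUMPS rather than the surrounding code.

    PetscLogDouble rss;
    for (PetscInt i = 0; i < 1000; ++i) {
      PetscCall(KSPSolve(ksp, b, x));
      PetscCall(PetscMemoryGetCurrentUsage(&rss));   /* resident set size in bytes */
      PetscCall(PetscPrintf(PETSC_COMM_WORLD, "solve %" PetscInt_FMT ": rss = %.1f MB\n",
                i, (double)(rss / 1048576.0)));
    }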

Re: [petsc-users] Bug or mis-use for 64 indices PETSc mpi linear solver server with more than 8 cores

2024-09-05 Thread Satish Balay
I fixed the "garbled" text at https://gitlab.com/petsc/petsc/-/issues/1643 - and best if additional followup is on the issue tracker [so that the replies to t…

Re: [petsc-users] Bug or mis-use for 64 indices PETSc mpi linear solver server with more than 8 cores

2024-09-05 Thread Mark Adams
Barry's suggestion for testing got garbled in the gitlab issue posting. Here it is, I think:

    07:53 main *= ~/Codes/petsc$ make test s=ksp_ksp_tutorials-ex1_mpi_linear_solver_server_1
    /usr/local/bin/gmake --no-print-directory -f /Users/markadams/Codes/petsc/gmakefile.test PETSC_ARCH=arch-macosx-gnu…