On Wed, Jul 24, 2024 at 9:32 PM Matthew Thomas wrote:
> Hi Matt,
>
> I have attached the configuration file below.
>
From the log:

  MPI:
    Version: 3
    mpiexec: /apps/intel-tools/intel-mpi/2021.11.0/bin/mpiexec
    Implementation: mpich3
    I_MPI_NUMVERSION: 20211100300  MPICH_NUMVERSION: 30400
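One way to cross-check this at runtime (a minimal sketch, not code from the thread) is to print the MPI library version string the executable is actually linked against and compare it with the mpiexec recorded in the configure log above:

#include <petscsys.h>

/* Sketch: report the MPI implementation the binary is really using, so it can be
   compared with the mpiexec/I_MPI_NUMVERSION recorded at configure time. */
int main(int argc,char **argv)
{
  char version[MPI_MAX_LIBRARY_VERSION_STRING];
  int  len,major,minor;

  PetscFunctionBeginUser;
  PetscCall(PetscInitialize(&argc,&argv,NULL,NULL));
  PetscCallMPI(MPI_Get_version(&major,&minor));
  PetscCallMPI(MPI_Get_library_version(version,&len));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD,"MPI standard %d.%d\n%s\n",major,minor,version));
  PetscCall(PetscFinalize());
  return 0;
}

If the printed string does not correspond to the Intel MPI 2021.11 build shown in the log, the job is being launched with a different MPI than the one PETSc was built against.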
On Wed, Jul 24, 2024 at 8:37 PM Matthew Thomas wrote:
> Hello Matt,
>
> Thanks for the help. I believe the problem is coming from an incorrect
> linking with MPI and PETSc.
>
> I tried running with petscmpiexec from
> $PETSC_DIR/lib/petsc/bin/petscmpiexec. This gave me the error
>
> Error build location not found! Please set PETSC_DIR and PETSC_ARCH correctly for
Hello Matt,
Thanks for the help. I believe the problem is coming from an incorrect linking
with MPI and PETSc.
I tried running with petscmpiexec from $PETSC_DIR/lib/petsc/bin/petscmpiexec.
This gave me the error
Error build location not found! Please set PETSC_DIR and PETSC_ARCH correctly for
On Tue, Jul 23, 2024 at 8:02 PM Matthew Thomas wrote:
> Hello Matt,
>
> I have attached the output with mat_view for 8 and 40 processors.
>
> I am unsure what is meant by the matrix communicator and the partitioning.
> I am using the default behaviour in every case. How can I find this
> information?
Hello Matt,
I have attached the output with mat_view for 8 and 40 processors.
I am unsure what is meant by the matrix communicator and the partitioning. I am
using the default behaviour in every case. How can I find this information?
I have attached the log view as well if that helps.
Thanks,
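For the communicator and partitioning question, a minimal sketch (not part of the thread; the helper name is made up and the matrix is assumed to be an assembled Mat called A) of how both can be queried directly:

#include <petscmat.h>

/* Sketch: report which communicator a matrix lives on and how its rows are
   partitioned across the ranks. */
static PetscErrorCode ReportPartitioning(Mat A)
{
  MPI_Comm    comm;
  PetscMPIInt size,rank;
  PetscInt    Istart,Iend;

  PetscFunctionBeginUser;
  PetscCall(PetscObjectGetComm((PetscObject)A,&comm));
  PetscCallMPI(MPI_Comm_size(comm,&size));
  PetscCallMPI(MPI_Comm_rank(comm,&rank));
  PetscCall(MatGetOwnershipRange(A,&Istart,&Iend));
  PetscCall(PetscSynchronizedPrintf(comm,"[%d/%d] owns rows %" PetscInt_FMT " .. %" PetscInt_FMT "\n",rank,size,Istart,Iend));
  PetscCall(PetscSynchronizedFlush(comm,PETSC_STDOUT));
  PetscFunctionReturn(PETSC_SUCCESS);
}

With the default layout, a matrix created on PETSC_COMM_WORLD gives each rank a contiguous block of roughly equal row count, which is also what -mat_view ::ascii_info_detail reports per process.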
Hi Barry,
The minimal example is shown below.
#include <slepceps.h>

int main(int argc,char **argv)
{
  Mat      A;              /* problem matrix */
  PetscInt n=10,i,Istart,Iend;

  PetscFunctionBeginUser;
  PetscCall(SlepcInitialize(&argc,&argv,(char*)0,help));
  PetscCall(PetscOpti
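The message is cut off here by the digest. For completeness, a sketch of how such a minimal example typically continues, modeled on ex1 of the SLEPc hands-on exercises that the original report mentions; the tridiagonal matrix values and the EPS settings below are assumptions, not the poster's actual code:

#include <slepceps.h>

static char help[] = "Minimal standard eigenproblem with a tridiagonal matrix.\n";

int main(int argc,char **argv)
{
  Mat      A;              /* problem matrix */
  EPS      eps;            /* eigenproblem solver context */
  PetscInt n=10,i,Istart,Iend;

  PetscFunctionBeginUser;
  PetscCall(SlepcInitialize(&argc,&argv,(char*)0,help));
  PetscCall(PetscOptionsGetInt(NULL,NULL,"-n",&n,NULL));

  /* Matrix on the default communicator with the default row-block partitioning */
  PetscCall(MatCreate(PETSC_COMM_WORLD,&A));
  PetscCall(MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n));
  PetscCall(MatSetFromOptions(A));
  PetscCall(MatSetUp(A));
  PetscCall(MatGetOwnershipRange(A,&Istart,&Iend));
  for (i=Istart;i<Iend;i++) {
    if (i>0)   PetscCall(MatSetValue(A,i,i-1,-1.0,INSERT_VALUES));
    if (i<n-1) PetscCall(MatSetValue(A,i,i+1,-1.0,INSERT_VALUES));
    PetscCall(MatSetValue(A,i,i,2.0,INSERT_VALUES));
  }
  PetscCall(MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY));
  PetscCall(MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY));

  /* Solve the standard Hermitian eigenproblem */
  PetscCall(EPSCreate(PETSC_COMM_WORLD,&eps));
  PetscCall(EPSSetOperators(eps,A,NULL));
  PetscCall(EPSSetProblemType(eps,EPS_HEP));
  PetscCall(EPSSetFromOptions(eps));
  PetscCall(EPSSolve(eps));

  PetscCall(EPSDestroy(&eps));
  PetscCall(MatDestroy(&A));
  PetscCall(SlepcFinalize());
  return 0;
}

Built against a consistent PETSc/SLEPc/MPI stack, this can be run with, e.g., mpiexec -n 8 ./ex1 -mat_view ::ascii_info_detail to see the per-rank partitioning discussed above.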
Also, you could run with

  -mat_view ::ascii_info_detail

and send the output for both cases. The storage of matrix values is not redundant, so something else is going on. First, what communicator do you use for the matrix, and what partitioning?
Thanks,
Matt
On Mon, Jul 22, 2024 at 10:2
Send the code.
> On Jul 22, 2024, at 9:18 PM, Matthew Thomas via petsc-users wrote:
>
> Hello,
>
> I am using petsc and slepc to solve an eigenvalue problem for sparse
> matrices. When I run my
Hello,

I am using petsc and slepc to solve an eigenvalue problem for sparse matrices. When I run my code with double the number of processors, the memory usage also doubles. I am able to reproduce this behaviour with ex1 of slepc's hands on
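A small helper for checking the reported behaviour (a sketch, not from the thread; the function name and its placement around the solve are assumptions) that sums the resident set size over all ranks, so runs on 8 and 40 processes can be compared directly:

#include <petscsys.h>

/* Sketch: total resident memory across the communicator, e.g. before and after EPSSolve. */
static PetscErrorCode ReportTotalMemory(MPI_Comm comm,const char *label)
{
  PetscLogDouble rss,total;

  PetscFunctionBeginUser;
  PetscCall(PetscMemoryGetCurrentUsage(&rss));
  PetscCallMPI(MPI_Allreduce(&rss,&total,1,MPI_DOUBLE,MPI_SUM,comm));
  PetscCall(PetscPrintf(comm,"%s: total resident memory %.1f MB\n",label,total/1048576.0));
  PetscFunctionReturn(PETSC_SUCCESS);
}

PETSc's -memory_view option prints similar per-process and total figures at PetscFinalize without any code changes.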