Hi Matthew,
Many thanks for the tip re: the synchronized print, I
wasn't aware of that routine.
It is great how many useful utility routines PETSc seems to have - it's a big
timesaver!
Thanks,
Dan
From: Vijay S. Mahadevan
Sent: Thursday, May 20, 2021 1:20 PM
To: Barry Smith
Cc: Bhamidipati, Vikram; petsc-users; moab-...@mcs.anl.gov
Subject: Re: [petsc-users] nghosts in 'DMMoabLoadFromFile'
Dear Vikram,
Yes, if you are running in serial or if you do not require any ghost layers,
initialize the parameter input to zero. Otherwise, it should specify the
number of ghost layers of elements you need when running in parallel.
I'm not well at the moment but will update the documentation.
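For reference, a minimal C sketch of a call with no ghost layers; the exact
argument order, the dimension 3, and the file name "input.h5m" are assumptions
here, so please check the current signature in dmmutil.cxx:

#include <petscdmmoab.h>

int main(int argc, char **argv)
{
  DM             dm;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* nghost = 0: no ghost layers (serial runs, or parallel runs that do
     not need ghosted elements). "input.h5m" and dim = 3 are placeholders. */
  ierr = DMMoabLoadFromFile(PETSC_COMM_WORLD, 3, 0, "input.h5m", "", &dm);CHKERRQ(ierr);
  ierr = DMDestroy(&dm);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}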
Probably. It is not documented, unfortunately, but it does seem to relate to
how many ghost layers are needed.
Barry
> On May 20, 2021, at 11:25 AM, Bhamidipati, Vikram
> wrote:
>
> Hello,
>
> I am in the process of updating PETSc versions and I see that the
> 'DMMoabLoadFromFile' function (in dmmutil.cxx) has a new argument 'nghosts'.
> For those of us who don't use ghost cells, should we set it to 0?
You can also have the processes with no values print an array of length zero,
like this:

if (rank3 == PROC_ROW) then ! if this MPI process owns the row then ..
..
else
! otherwise make the same collective call with a zero-length array
NO_A_ENTRIES = 0
call PetscIntView(NO_A_ENTRIES,JALOC(1:NO_A_ENTRIES), &
& PETSC_VIEWER_STDOUT_WORLD,ierr)
end if
Hello,
I am in the process of updating PETSc versions and I see that the
'DMMoabLoadFromFile' function (in dmmutil.cxx) has a new argument 'nghosts'.
For those of us who don't use ghost cells, should we set it to 0?
Thanks,
Vikram
---
Vikram Bhamidipati
On Thu, May 20, 2021 at 5:32 AM dazza simplythebest
wrote:
> Dear Jose,
> Many thanks for the prompt explanation - that would
> definitely explain what is going on;
> I will adjust my code accordingly.
>
If you want to print different things from each process in parallel, I
suggest using PetscSynchronizedPrintf() followed by PetscSynchronizedFlush().
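A minimal C sketch of that pattern (the printed text is just a placeholder):

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscMPIInt    rank;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  /* Each rank queues its own output; it is printed in rank order
     instead of being interleaved. */
  ierr = PetscSynchronizedPrintf(PETSC_COMM_WORLD, "[%d] my values ...\n", rank);CHKERRQ(ierr);
  /* The matching collective flush is required to actually emit the output. */
  ierr = PetscSynchronizedFlush(PETSC_COMM_WORLD, PETSC_STDOUT);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}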
Dear Jose,
Many thanks for the prompt explanation - that would definitely
explain what is going on;
I will adjust my code accordingly.
Thanks again,
Dan.
From: Jose E. Roman
Sent: Thursday, May 20, 2021 9:06 AM
To: dazza simplythebest
If you look at the manpage
https://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/Sys/PetscIntView.html
you will see that PetscIntView() is collective. This means that all MPI
processes must call this function, so it is forbidden to call it inside an
'if (rank == ...)' block.
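As a minimal C sketch of the safe pattern (the array contents and "rank 0
owns the data" are just placeholders):

#include <petscsys.h>
#include <petscviewer.h>

int main(int argc, char **argv)
{
  PetscMPIInt    rank;
  PetscInt       idx[3] = {10, 20, 30};
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank);CHKERRQ(ierr);
  /* Collective call: every rank participates; ranks with nothing to
     show pass a length of zero instead of skipping the call. */
  ierr = PetscIntView(rank == 0 ? 3 : 0, idx, PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}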
Jose
> On 20 May 2021,
Dear All,
As part of preparing a code to call the SLEPc eigenvalue solving
library, I am constructing a matrix in sparse CSR format row-by-row. Just for
debugging purposes I write out the column values for a given row, which are
stored in a PetscInt allocatable vector, using PetscIntView().