Hi,
take a look at slide 10 of [1]; it visually explains what the overlap
between partitions is.
[1] https://www.archer.ac.uk/training/virtual/files/2015/06-PETSc/slides.pdf
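For reference, the call pattern from ex11.c looks roughly like the
following; a minimal C sketch, assuming "dm" is an already-created DMPlex
(error-code declarations abbreviated):

  DM dmDist = NULL;

  /* Redistribute the mesh with one layer of overlapping cells; the
     PetscSF output may be NULL if the migration SF is not needed. */
  ierr = DMPlexDistribute(dm, 1, NULL, &dmDist);CHKERRQ(ierr);
  if (dmDist) { /* dmDist is NULL on a single process: nothing moved */
    ierr = DMDestroy(&dm);CHKERRQ(ierr);
    dm   = dmDist;
  }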
On Thu, Aug 11, 2016 at 8:48 AM, leejearl wrote:
> Hi, all:
> I want to use PETSc to build my FVM code. Now,
Hi, all:
I want to use PETSc to build my FVM code. Now, I have a question about
the function DMPlexDistribute(DM dm, PetscInt overlap, PetscSF *sf, DM
*dmOverlap) in the example "/petsc-3.7.2/src/ts/examples/tutorials/ex11.c".
When I set the overlap to 0 or 1, it works well. But, if
I have recently been hearing about Docker [1] and its potential as a
"platform-independent" environment for application development. At first it
sounded just like another VM to me, until I came across this IBM Research
paper [2] and this SO post [3] (to be fair, I still don't get the detailed
differences
Effectively utilizing multiple right-hand sides with the same system can
result in roughly a 2x, or at the absolute most 3x, improvement in solve
time. A great improvement, but when you have a million right-hand sides, not
a giant one.
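Concretely, the reuse pattern is to associate the matrix with the KSP once
and then loop over the right-hand sides; a minimal C sketch (the names nrhs,
b[i], and x are illustrative, and A is assumed to be an assembled Mat):

  KSP            ksp;
  PetscInt       i;
  PetscErrorCode ierr;

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr); /* same operator throughout */
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);
  for (i = 0; i < nrhs; ++i) {
    /* the preconditioner is built on the first solve and reused after */
    ierr = KSPSolve(ksp, b[i], x);CHKERRQ(ierr);
  }
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);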
The first step is to get the best (most efficient
On Wed, Aug 10, 2016 at 9:54 PM, Harshad Ranadive wrote:
> Hi All,
>
> I have currently added the PETSc library to our CFD solver.
>
> In this I need to use KSPSolve(...) multiple times for the same matrix A. I
> have read that PETSc does not support passing multiple RHS vectors in the
> form of
Hi All,
I have currently added the PETSc library to our CFD solver.
In this I need to use KSPSolve(...) multiple times for the same matrix A. I
have read that PETSc does not support passing multiple RHS vectors in the
form of a matrix, and that the only solution to this is calling KSPSolve
multiple times.