In the new log, I saw

Summary of Stages:   ----- Time ------  ----- Flop ------  --- Messages ---  -- Message Lengths --  -- Reductions --
                        Avg     %Total     Avg     %Total    Count   %Total     Avg         %Total    Count   %Total
 0:      Main Stage: 5.4095e+00   2.3%  4.37
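The rows in that "Summary of Stages" table come from PETSc logging stages: "Main Stage" always exists, and further rows appear for stages registered by the application. A minimal C sketch of that, assuming a standard PETSc build and a run with -log_view:

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  PetscLogStage  stage;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

  /* Register and enter a user-defined stage; it is reported as
     "1: Assembly" in the -log_view "Summary of Stages" table. */
  ierr = PetscLogStageRegister("Assembly", &stage); CHKERRQ(ierr);
  ierr = PetscLogStagePush(stage); CHKERRQ(ierr);
  /* ... timed work goes here ... */
  ierr = PetscLogStagePop(); CHKERRQ(ierr);

  ierr = PetscFinalize();
  return ierr;
}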
Thanks Dave for your reply.
For sure PETSc is awesome :D
Yes, in both cases PETSc was configured with --with-debugging=0, and
fortunately I do have the old and new -log_view outputs, which I attached.
Best,
Mohammad
On Tue, Mar 23, 2021 at 1:37 AM Dave May wrote:
> Nice to hear!
> The answer is
OK, I will investigate implementing it using SLATE, thanks.
Miguel
From: Matthew Knepley
Date: Tuesday, March 23, 2021 at 12:57 PM
To: "Salazar De Troya, Miguel"
Cc: Barry Smith, "Jorti, Zakariae via petsc-users"
Subject: Re: [petsc-users] Local Discontinuous Galerkin with PETSc TS
On Tue, Mar 23, 2021 at 11:54 AM Salazar De Troya, Miguel <salazardet...@llnl.gov> wrote:
> The calculation of p1 and p2 is done by solving an element-wise local
> problem using u^n. I guess I could embed this calculation inside the
> calculation for G = H(p1, p2). However, I am hoping to be
On Tue, Mar 23, 2021 at 12:39 PM Marcel Huysegoms wrote:
> Hello everyone,
>
> I have a large system of nonlinear equations for which I'm trying to find
> the optimal solution.
> In order to get familiar with the SNES framework, I created a standalone
> python script (see below), which creates a
I agree. If you are mixing C and Fortran, everything is nota bene. It
is easy to miss argument mismatches.
-sanjay
On 3/23/21 11:04 AM, Barry Smith wrote:
In a pure Fortran code using -fdefault-integer-8 is probably fine.
But MUMPS is a mixture of Fortran and C code and PETSc uses MUMPS
In a pure Fortran code, using -fdefault-integer-8 is probably fine. But MUMPS
is a mixture of Fortran and C code, and PETSc uses the MUMPS C interface. The
-fdefault-integer-8 flag doesn't magically fix anything in the C parts of MUMPS. I
also don't know about the MPI calls and whether they would need editing
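To make the integer-width point concrete (a hedged sketch, not from the thread): on the C side each library's integer size is fixed at build time. PETSc's PetscInt follows the --with-64-bit-indices configure option, while PetscMPIInt stays a plain C int as the MPI C interface requires, and a Fortran flag like -fdefault-integer-8 changes neither. A minimal check, assuming a standard PETSc build:

#include <petscsys.h>

int main(int argc, char **argv)
{
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  /* PetscInt is 8 bytes only if PETSc was configured --with-64-bit-indices;
     PetscMPIInt is always a C int. Neither is affected by Fortran compiler flags. */
  ierr = PetscPrintf(PETSC_COMM_WORLD, "sizeof(PetscInt)=%d  sizeof(PetscMPIInt)=%d\n",
                     (int)sizeof(PetscInt), (int)sizeof(PetscMPIInt)); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}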
Hi Zakariae - sorry about the delay - responses inline below.
I'd be curious to see your code (which you can send directly to me if you don't
want to post it publicly), so I can give you more comments, as DMStag is a new
component.
> On 23.03.2021 at 00:54, Jorti, Zakariae wrote:
>
> Hi,
Hello everyone,
I have a large system of nonlinear equations for which I'm trying to find the
optimal solution.
In order to get familiar with the SNES framework, I created a standalone Python script
(see below), which creates a set of 2D points and transforms them using an affine
transformation
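For reference (a sketch only, not the script from this message): petsc4py mirrors PETSc's C callback pattern for SNES, where a residual function is attached with SNESSetFunction and the solver drives it. Below is a minimal self-contained C version with a toy two-unknown residual; the affine-transformation residual from the script would replace FormFunction. No analytic Jacobian is set, so run it with -snes_fd (or -snes_mf) and -snes_monitor.

#include <petscsnes.h>

/* Toy residual: F(x) = [x0^2 + x1 - 2, x0 + x1^2 - 2], root at (1,1).
   Only the callback shape matters here; the real residual goes in its place. */
static PetscErrorCode FormFunction(SNES snes, Vec X, Vec F, void *ctx)
{
  PetscErrorCode     ierr;
  const PetscScalar *x;
  PetscScalar       *f;

  PetscFunctionBeginUser;
  ierr = VecGetArrayRead(X, &x); CHKERRQ(ierr);
  ierr = VecGetArray(F, &f); CHKERRQ(ierr);
  f[0] = x[0]*x[0] + x[1] - 2.0;
  f[1] = x[0] + x[1]*x[1] - 2.0;
  ierr = VecRestoreArrayRead(X, &x); CHKERRQ(ierr);
  ierr = VecRestoreArray(F, &f); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  SNES           snes;
  Vec            x, r;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = VecCreateSeq(PETSC_COMM_SELF, 2, &x); CHKERRQ(ierr);
  ierr = VecDuplicate(x, &r); CHKERRQ(ierr);
  ierr = VecSet(x, 0.5); CHKERRQ(ierr);                       /* initial guess */
  ierr = SNESCreate(PETSC_COMM_SELF, &snes); CHKERRQ(ierr);
  ierr = SNESSetFunction(snes, r, FormFunction, NULL); CHKERRQ(ierr);
  ierr = SNESSetFromOptions(snes); CHKERRQ(ierr);             /* use -snes_fd or -snes_mf */
  ierr = SNESSolve(snes, NULL, x); CHKERRQ(ierr);
  ierr = VecView(x, PETSC_VIEWER_STDOUT_SELF); CHKERRQ(ierr);
  ierr = SNESDestroy(&snes); CHKERRQ(ierr);
  ierr = VecDestroy(&x); CHKERRQ(ierr);
  ierr = VecDestroy(&r); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}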
The calculation of p1 and p2 is done by solving an element-wise local problem
using u^n. I guess I could embed this calculation inside the calculation for
G = H(p1, p2). However, I am hoping to be able to solve the problem using
firedrake-ts so that the formulation is all clearly in one place and
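Purely as a shape illustration (not the firedrake-ts formulation from this thread): in PETSc's C TS interface the same structure is an RHS callback that first computes the intermediate quantities p1 and p2 from the current state and then evaluates G = H(p1, p2). Everything below, including the toy p1, p2 and the choice of integrator, is a placeholder.

#include <petscts.h>

/* Toy stand-in for the element-wise local solves: here p1 = u^2 and p2 = -u
   are computed from the current state, and G = H(p1, p2) = p1 + p2. */
static PetscErrorCode RHSFunction(TS ts, PetscReal t, Vec U, Vec G, void *ctx)
{
  PetscErrorCode     ierr;
  const PetscScalar *u;
  PetscScalar       *g;
  PetscInt           i, n;

  PetscFunctionBeginUser;
  ierr = VecGetLocalSize(U, &n); CHKERRQ(ierr);
  ierr = VecGetArrayRead(U, &u); CHKERRQ(ierr);
  ierr = VecGetArray(G, &g); CHKERRQ(ierr);
  for (i = 0; i < n; i++) {
    PetscScalar p1 = u[i]*u[i];   /* "local solve" for p1 */
    PetscScalar p2 = -u[i];       /* "local solve" for p2 */
    g[i] = p1 + p2;               /* G = H(p1, p2) */
  }
  ierr = VecRestoreArrayRead(U, &u); CHKERRQ(ierr);
  ierr = VecRestoreArray(G, &g); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}

int main(int argc, char **argv)
{
  PetscErrorCode ierr;
  TS             ts;
  Vec            u;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;
  ierr = VecCreateSeq(PETSC_COMM_SELF, 4, &u); CHKERRQ(ierr);
  ierr = VecSet(u, 0.5); CHKERRQ(ierr);
  ierr = TSCreate(PETSC_COMM_SELF, &ts); CHKERRQ(ierr);
  ierr = TSSetProblemType(ts, TS_NONLINEAR); CHKERRQ(ierr);
  ierr = TSSetRHSFunction(ts, NULL, RHSFunction, NULL); CHKERRQ(ierr);
  ierr = TSSetType(ts, TSSSP); CHKERRQ(ierr);   /* one choice of explicit integrator */
  ierr = TSSetTimeStep(ts, 0.01); CHKERRQ(ierr);
  ierr = TSSetMaxTime(ts, 1.0); CHKERRQ(ierr);
  ierr = TSSetExactFinalTime(ts, TS_EXACTFINALTIME_STEPOVER); CHKERRQ(ierr);
  ierr = TSSetFromOptions(ts); CHKERRQ(ierr);
  ierr = TSSolve(ts, u); CHKERRQ(ierr);
  ierr = TSDestroy(&ts); CHKERRQ(ierr);
  ierr = VecDestroy(&u); CHKERRQ(ierr);
  ierr = PetscFinalize();
  return ierr;
}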
Your mpicc is somehow not linking the MPI/IO stuff?
configure:5932: /usr/local/Cellar/mpich/3.4.1_1/bin/mpicc -o conftest
-fstack-protector -fno-stack-check -Qunused-arguments -g -O0
-Wno-implicit-function-declaration
-I/usr/local/Cellar/mpich/3.4.1_1/include conftest.c -lz -llapack -lblas
-L/usr
Nice to hear!
The answer is simple, PETSc is awesome :)
Jokes aside, assuming both PETSc builds were configured with
--with-debugging=0, I don't think there is a definitive answer to your
question with the information you provided.
It could be as simple as one specific implementation you use was i