Hi Chris,
First, I had to fix an error in your test by adding "PetscCallA(MatSetFromOptions(AA,ierr))" at line 254.
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Object is in wrong state
[0]PETSC ERROR: Mat object's type is not set: Argument # 1
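For reference, the Fortran call you added has a direct petsc4py analogue; a minimal sketch (not taken from your test):

from petsc4py import PETSc

A = PETSc.Mat().create()
A.setSizes((4, 4))
A.setFromOptions()   # sets the matrix type (e.g. from -mat_type); skipping
                     # this and setUp() leaves the type unset, and the first
                     # setValue() then fails with "Object is in wrong state"
A.setUp()            # preallocate so entries can be inserted
A.setValue(0, 0, 1.0)
A.assemble()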
Hi Alberto,
1. To check the array pointer on the PETSc side, you can do
print(hex(y_petsc.array.ctypes.data)). You will then see a pointer mismatch
caused by the line y = jnp.from_dlpack(y_petsc, copy=False). This is because
you configured PETSc in double precision, but JAX uses single precision by
default.
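A minimal sketch of the check and the usual fix (enabling 64-bit floats in JAX so the zero-copy DLPack import matches PETSc's double precision); the small vector here is purely illustrative:

# sketch: print the PETSc-side buffer address and make JAX use float64 so the
# DLPack import can share memory; assumes petsc4py built with double precision
import jax
jax.config.update("jax_enable_x64", True)   # match PETSc's double precision
import jax.numpy as jnp
from petsc4py import PETSc

y_petsc = PETSc.Vec().createSeq(4)
y_petsc.set(1.0)
print(hex(y_petsc.array.ctypes.data))       # address of the PETSc buffer
y = jnp.from_dlpack(y_petsc, copy=False)    # needs a recent JAX for copy=
print(y.dtype)                              # float64 once x64 is enabled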
For a Python example, please take a look at
src/binding/petsc4py/demo/legacy/ode/vanderpol.py and you will see how jvp is
done in the class IJacShell.
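The pattern looks roughly like the sketch below (a hand-rolled stand-in, not the actual vanderpol.py code; the diagonal Jacobian is purely illustrative):

# sketch of a petsc4py shell matrix whose mult() applies a
# Jacobian-vector product (jvp)
from petsc4py import PETSc

class JacShell:
    def __init__(self, diag):
        self.diag = diag           # data needed to apply J
    def mult(self, mat, x, y):
        # y = J @ x, here with J = diag(self.diag)
        y.pointwiseMult(self.diag, x)

n = 8
diag = PETSc.Vec().createSeq(n)
diag.set(2.0)
J = PETSc.Mat().createPython((n, n), context=JacShell(diag))
J.setUp()

x = PETSc.Vec().createSeq(n); x.set(1.0)
y = x.duplicate()
J.mult(x, y)                       # y[i] == 2.0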
Hong
From: petsc-users on behalf of "Zhang, Hong via petsc-users"
Reply-To: "Zhang, Hong"
Date: Tuesday, July 8, 2025 at 12:21 PM
To: Art
Dear PETSc,
Big fan here, you are amazing.
I want to test a finite difference method with parallel multigrid. I have done
it using DMDA, and now I need to do it with local (non-conforming) mesh
refinement. I noticed that the unreleased PetscFD is all I need. I did find
Dr. Abhishek's thesis, which ...
Hi Art,
Here is a TS example that uses MatShell for implicit time integration and
adjoint sensitivity calculation:
src/ts/tutorials/advection-diffusion-reaction/ex5adj_mf.c
You will need to provide a Jacobian-vector product (jvp) routine like
MyIMatMult() in this example. Adjoints require vjp (vector-Jacobian products).
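In petsc4py terms this means also filling in multTranspose on the shell context; a sketch reusing the illustrative diagonal Jacobian from above (for which the transpose action happens to equal the forward one):

# sketch: a shell context providing both jvp (mult) and vjp (multTranspose)
from petsc4py import PETSc

class AdjointJacShell:
    def __init__(self, diag):
        self.diag = diag                   # illustrative diagonal Jacobian
    def mult(self, mat, x, y):             # jvp: y = J x
        y.pointwiseMult(self.diag, x)
    def multTranspose(self, mat, x, y):    # vjp: y = J^T x (same for diagonal J)
        y.pointwiseMult(self.diag, x)

n = 8
diag = PETSc.Vec().createSeq(n); diag.set(2.0)
J = PETSc.Mat().createPython((n, n), context=AdjointJacShell(diag))
J.setUp()
lam = PETSc.Vec().createSeq(n); lam.set(1.0)
out = lam.duplicate()
J.multTranspose(lam, out)                  # applies the vjp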
Also note that MatShell is _exactly_ the same as the CVODE interface. It is
just a wrapper for that function pointer so that we do not need to change
the top-level interface.
Thanks,
Matt
On Tue, Jul 8, 2025 at 2:10 AM Jed Brown wrote:
> Using MatShell is the standard method. Note that ...