[Numpy-discussion] ANN: MyGrad 2.0 - Drop-in autodiff for NumPy

2021-04-18 Thread Ryan Soklaski
All,

I am excited to announce the release of MyGrad 2.0.

MyGrad's primary goal is to make automatic differentiation accessible and
easy to use across the NumPy ecosystem (see [1] for more detailed comments).

Source: https://github.com/rsokl/MyGrad
Docs: https://mygrad.readthedocs.io/en/latest/

MyGrad's only dependency is NumPy, and (as of version 2.0) it makes keen use
of NumPy's excellent protocols for overriding functions and ufuncs. Thus you
can "drop" a MyGrad tensor into your pure NumPy code and compute
derivatives through it.
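
For instance, a minimal sketch of what this looks like (the function below
is illustrative; any NumPy-only code built from operations that MyGrad
implements will do):

import numpy as np
import mygrad as mg

def pure_numpy_func(x):
    return np.sum(np.sin(x) ** 2)  # written against vanilla NumPy

x = mg.tensor([0.0, 0.5, 1.0])
out = pure_numpy_func(x)  # NumPy's protocols dispatch to MyGrad: a Tensor comes back
out.backward()            # backprop through the pure-NumPy code
x.grad                    # holds d(out)/dx as a numpy array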

Ultimately, MyGrad could be extended to bring autodiff to other array-based
libraries like CuPy, Sparse, and Dask.

For full release notes see [2]. Feedback, critiques, and ideas are welcome!

Cheers,
Ryan Soklaski

[1] MyGrad is not meant to "compete" with the likes of PyTorch and JAX,
which are fantastically fast and powerful autodiff libraries. Rather, its
emphasis is on being lightweight and seamless to use in NumPy-centric
workflows.
[2] https://mygrad.readthedocs.io/en/latest/changes.html#v2-0-0


Re: [Numpy-discussion] ANN: MyGrad 2.0 - Drop-in autodiff for NumPy

2021-04-20 Thread Ryan Soklaski
Hi Stephan,

You are correct that MyGrad takes an object-oriented design rather than a
functional one. This enables a more imperative style of workflow [1], which
is how many people approach data science in notebooks and REPLs. MyGrad
feels similar to NumPy and PyTorch in this way.

Ultimately, swapping one (or more) `ndarray`s out for `Tensor`s is all you
need to do to differentiate your NumPy-based code with respect to those
variables:

import mygrad

from stephans_library import func_using_numpy

x = mygrad.tensor(1.)
y = mygrad.tensor(2.)
z = func_using_numpy(x, y)  # coerced into returning a Tensor
z.backward()  # computes dz/dx and dz/dy
x.grad  # stores dz/dx
y.grad  # stores dz/dy

Thus with MyGrad, you can truly drop a tensor into code that is written in
vanilla NumPy (assuming said code only involves NumPy functions currently
implemented in MyGrad), no matter what style of code that is, functional or
otherwise.

Regarding autograd, I would describe it as "swap out" autodiff for NumPy
rather than "drop-in": you need to use the functions supplied by
`autograd.numpy` in order to leverage its functionality, in addition to
adopting a functional code style. This means that
`stephans_library.func_using_numpy` can't be differentiated either, unless
it was written against autograd.numpy.
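
For concreteness, the autograd pattern looks roughly like this (a minimal
sketch of its functional API):

import autograd.numpy as anp  # must use autograd's wrapped NumPy
from autograd import grad

def f(x):
    return anp.sum(anp.sin(x) ** 2)

df = grad(f)  # derivatives come from transforming whole functions
df(1.0)       # evaluates df/dx at x = 1.0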

Furthermore, autograd does not really aim to be "NumPy with autodiff" to the
same fidelity. For example, and in contrast with MyGrad, it does not
support:
  - in-place operations
  - specifying dtype, where, or out in ufuncs
  - common use-cases of einsum like traces and broadcast-reduction [2]
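
As a concrete instance of that last point, here is a sketch of the trace
case in MyGrad (assuming mygrad.einsum, which mirrors numpy.einsum):

import numpy as np
import mygrad as mg

w = mg.tensor(np.arange(9.0).reshape(3, 3))
tr = mg.einsum("ii", w)  # the trace of w
tr.backward()
w.grad                   # d(trace)/dw is the 3x3 identity matrix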
 
And, unfortunately, autograd has some long-standing bugs, including cases
where it simply gives you the wrong derivatives for relatively simple
functions [3]. In general, it doesn't seem like there is much activity
towards dealing with bug reports in the library.

All that being said, here are some pros and cons of MyGrad, by my own
estimation:

Some cons of MyGrad:
  - autograd provides rich support for computing Jacobians and higher-order
derivatives. MyGrad doesn't.
  - Still plenty of NumPy functions that need implementing
  - Supporting a flexible imperative style along with in-place operations
and views comes at a (mitigable) performance cost [4]
  - Currently maintained just by me in my personal time (hard to match
Harvard/Google/Facebook!), which doesn't scale
  - Nowhere close to the level of adoption of autograd

Some pros of MyGrad:
  - Easy for NumPy users to just pick up and use (NumPy +
`Tensor.backward()`)
  - Big emphasis on correctness and completeness in terms of parity with
NumPy
  - Object-oriented approach has lots of perks:
    - Easy for users to implement their own differentiable functions [5]
(sketched below)
    - Tensor can be wrapped by, say, an xarray DataArray for backprop
through xarray [6]
    - Tensor could wrap a CuPy/Dask/sparse array instead of an ndarray to
bring autodiff to them
  - Polished docs, type hints, UX
  - High-quality test suite (leverages Hypothesis [7] extensively)
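
To give a flavor of [5], here is a rough sketch of the custom-operation
pattern (the exact entry points here are from memory, so treat them as
approximate and defer to the docs):

import numpy as np

from mygrad import execute_op  # assumed entry point; see [5]
from mygrad.operation_base import Operation

class Square(Operation):
    def __call__(self, x):
        # `x` is a Tensor; store the inputs so backprop can find them
        self.variables = (x,)
        return np.square(x.data)  # forward pass on the underlying ndarray

    def backward_var(self, grad, index, **kwargs):
        # gradient w.r.t. the input at `index`: d(x**2)/dx = 2x
        (x,) = self.variables
        return grad * 2 * x.data

def square(x):
    return execute_op(Square, x)  # wires the op into the computational graph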


[1]
https://gist.github.com/rsokl/7c2812264ae622bbecc990fad4af3fd2#getting-a-derivative
[2]
https://gist.github.com/rsokl/7c2812264ae622bbecc990fad4af3fd2#some-derivatives-not-supported-by-autograd
[3]
https://gist.github.com/rsokl/7c2812264ae622bbecc990fad4af3fd2#some-places-where-autograd-returns-incorrect-derivatives
[4]
https://mygrad.readthedocs.io/en/latest/performance_tips.html#controlling-memory-guarding-behavior
[5] https://mygrad.readthedocs.io/en/latest/operation.html
[6] Still lots of sharp edges here:
https://gist.github.com/rsokl/7c2812264ae622bbecc990fad4af3fd2#a-crude-prototype-of-using-mygrad-w-xarray
[7] https://hypothesis.readthedocs.io/en/latest/





[Numpy-discussion] Re: dtype=(bool) vs dtype=bool

2021-10-19 Thread Ryan Soklaski
As he said: this is not an appropriate use of this mailing list.

On Tue, Oct 19, 2021, 10:05 AM  wrote:

> > You could use `dis.dis` to compare the two expressions and see that they
> compile to the same bytecode.
>
> Do you mean the following:
>
> In [1]: import numpy as np
> In [2]: from dis import dis
> In [7]: dis('bool')
>   1   0 LOAD_NAME0 (bool)
>   2 RETURN_VALUE
>
> In [8]: dis('(bool)')
>   1   0 LOAD_NAME0 (bool)
>   2 RETURN_VALUE