Re: [Numpy-discussion] ANN: MyGrad 2.0 - Drop-in autodiff for NumPy

2021-04-20 Thread Ryan Soklaski
Hi Stephan, You are correct that MyGrad takes an object-oriented design rather than a functional one. This enables a more imperative style of workflow [1], which is how many people approach data science in notebooks and REPLs. MyGrad feels similar to NumPy and PyTorch in this way. Ultimate…
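To make the "imperative, object-oriented" distinction concrete, here is a toy reverse-mode autodiff sketch in that style. This is an illustrative example only, not MyGrad's actual implementation or API; the `Scalar` class and its methods are invented for this sketch. The point is that the computation is built line by line, as in a notebook, and gradients are then requested from the result object.

```python
# Toy reverse-mode autodiff in an imperative, object-oriented style.
# Illustrative sketch only -- NOT MyGrad's implementation. The naive
# recursive backward pass re-traverses shared subgraphs, which is fine
# for a toy but not for real workloads.

class Scalar:
    """A scalar value that records the operations applied to it."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # pairs of (parent, local_gradient)

    def __mul__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        # Product rule: d(a*b)/da = b, d(a*b)/db = a
        return Scalar(self.value * other.value,
                      parents=((self, other.value), (other, self.value)))

    def __add__(self, other):
        other = other if isinstance(other, Scalar) else Scalar(other)
        return Scalar(self.value + other.value,
                      parents=((self, 1.0), (other, 1.0)))

    def backward(self, seed=1.0):
        # Accumulate the incoming gradient, then push it to each parent,
        # scaled by the local gradient recorded at construction time.
        self.grad += seed
        for parent, local_grad in self._parents:
            parent.backward(seed * local_grad)


# Imperative workflow: build the computation step by step, then backprop.
x = Scalar(2.0)
y = x * x + x * 3.0   # y = x**2 + 3x
y.backward()
print(x.grad)          # dy/dx = 2x + 3 = 7.0 at x = 2
```

A functional-style library would instead have you wrap the whole computation in a function and transform it (as JAX's `grad` does); the object-oriented style lets each intermediate result be inspected as it is created.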

Re: [Numpy-discussion] ANN: MyGrad 2.0 - Drop-in autodiff for NumPy

2021-04-18 Thread Stephan Hoyer
On Sun, Apr 18, 2021 at 9:11 AM Ryan Soklaski wrote:
> MyGrad is not meant to "compete" with the likes of PyTorch and JAX, which
> are fantastically-fast and powerful autodiff libraries. Rather, its
> emphasis is on being lightweight and seamless to use in NumPy-centric
> workflows.

Thanks for…

[Numpy-discussion] ANN: MyGrad 2.0 - Drop-in autodiff for NumPy

2021-04-18 Thread Ryan Soklaski
All, I am excited to announce the release of MyGrad 2.0. MyGrad's primary goal is to make automatic differentiation accessible and easy to use across the NumPy ecosystem (see [1] for more detailed comments).

Source: https://github.com/rsokl/MyGrad
Docs: https://mygrad.readthedocs.io/en/latest/
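For context on what an autodiff library automates in a NumPy-centric workflow: without one, gradients are typically approximated by central finite differences, which is slow and numerically delicate. The sketch below shows that manual approach for the gradient of sum(x**2); the function name `numerical_grad` and the example function are illustrative, not part of MyGrad.

```python
import numpy as np

def numerical_grad(f, x, eps=1e-6):
    """Central-difference approximation of the gradient of a
    scalar-valued function f at the point x."""
    grad = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step.flat[i] = eps
        # Perturb one coordinate at a time in both directions.
        grad.flat[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

f = lambda x: (x ** 2).sum()      # gradient is exactly 2x
x = np.array([1.0, 2.0, 3.0])
print(numerical_grad(f, x))       # approximately [2. 4. 6.]
```

An autodiff library computes the same gradient exactly, in one backward pass, without choosing a step size or paying one function evaluation per coordinate.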