2012/6/13 James Bergstra <[email protected]>:
> Further to the recent discussion on lazy evaluation & numba, I moved
> what I was doing into a new project:
>
> PyAutoDiff:
> https://github.com/jaberg/pyautodiff
>
> It currently works by executing CPython bytecode with a numpy-aware
> engine that builds a symbolic expression graph with Theano... so you
> can do, for example:
>
> >>> import autodiff, numpy as np
> >>> autodiff.fmin_l_bfgs_b(lambda x: (x + 1) ** 2, [np.zeros(())])
>
> ... and you'll see `[array(-1.0)]` printed out.
>
> In the future, I think it should be able to export the
> gradient-computing function as bytecode, which could then be optimized
> by e.g. numba or a Theano bytecode front-end. For now it just compiles
> and runs the Theano graph that it built.
>
> It's still pretty rough (you'll see if you look at the code!) but I'm
> excited about it.
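[Editor's note: the tracing idea James describes can be illustrated with a minimal, self-contained sketch. This is hypothetical illustration code, not PyAutoDiff's actual implementation: instead of interpreting CPython bytecode, it overloads arithmetic on tracer objects so that running the very same lambda records an expression graph, from which derivatives can be read off and used for gradient descent.]

```python
# Minimal sketch of trace-based autodiff (illustrative, not pyautodiff's code).
# Tracer objects overload + and ** to record an expression graph.

class Node:
    def __init__(self, op, args):
        self.op, self.args = op, args

    def __add__(self, other):
        return Node('add', [self, wrap(other)])

    def __radd__(self, other):
        return Node('add', [wrap(other), self])

    def __pow__(self, k):
        return Node('pow', [self, k])  # k is a plain Python number


class Var(Node):          # the free variable we differentiate with respect to
    def __init__(self):
        pass


class Const(Node):        # a recorded constant, e.g. the literal 1
    def __init__(self, v):
        self.v = v


def wrap(x):
    return x if isinstance(x, Node) else Const(x)


def evaluate(node, x):
    """Evaluate the recorded graph at the point x."""
    if isinstance(node, Var):
        return x
    if isinstance(node, Const):
        return node.v
    if node.op == 'add':
        return evaluate(node.args[0], x) + evaluate(node.args[1], x)
    if node.op == 'pow':
        return evaluate(node.args[0], x) ** node.args[1]


def grad(node, x):
    """d(node)/dx at the point x, by recursive symbolic rules."""
    if isinstance(node, Var):
        return 1.0
    if isinstance(node, Const):
        return 0.0
    if node.op == 'add':
        return grad(node.args[0], x) + grad(node.args[1], x)
    if node.op == 'pow':              # power rule + chain rule
        base, k = node.args
        return k * evaluate(base, x) ** (k - 1) * grad(base, x)


# Trace the same lambda as in the post by calling it on a tracer object.
x = Var()
graph = (lambda x: (x + 1) ** 2)(x)

# Plain gradient descent on the traced graph; the minimum is at x = -1.
val = 0.0
for _ in range(100):
    val -= 0.1 * grad(graph, val)
print(round(val, 6))  # converges to -1.0
```

The point of the sketch is only the division of labor: the user's function is ordinary Python, and the differentiation machinery sees a graph rather than source code. PyAutoDiff gets the same effect at the bytecode level, which also covers control flow that operator overloading alone cannot see.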
Very interesting. Would it be possible to use bytecode introspection to print out the computation and display a symbolic representation of an arbitrary Python + numpy expression? E.g. something along the lines of:

>>> g = autodiff.gradient(lambda x: (x + 1) ** 2, [np.zeros(())])
>>> print g
f(x) = 2 * x + 2
>>> g(np.arange(3))
array([2, 4, 6])

--
Olivier
http://twitter.com/ogrisel - http://github.com/ogrisel
_______________________________________________
NumPy-Discussion mailing list
[email protected]
http://mail.scipy.org/mailman/listinfo/numpy-discussion
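[Editor's note: something close to the printout Olivier sketches is already possible when the expression is available symbolically, for instance via sympy. The snippet below is an illustration of the desired behavior, not pyautodiff's API: sympy differentiates the expression, prints it, and `lambdify` turns it back into a numpy-callable function.]

```python
# Illustration of the requested "print the gradient" behavior using sympy
# (pyautodiff traces bytecode instead; this only shows the desired output).
import numpy as np
import sympy

x = sympy.Symbol('x')
expr = (x + 1) ** 2
g = sympy.expand(sympy.diff(expr, x))   # symbolic derivative: 2*x + 2
print(g)

gf = sympy.lambdify(x, g, 'numpy')      # compile back to a numpy function
print(gf(np.arange(3)))
```

The hard part of Olivier's request is the first step, recovering a symbolic expression from arbitrary bytecode; once a Theano (or sympy) graph exists, pretty-printing and re-compilation are straightforward.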
