Just to make this thread more useful for someone interested in these
topics, this seems to be "the book" on automatic differentiation (it's
one of the references on the autodiff website):
"Evaluating Derivatives: Principles and Techniques of Algorithmic
Differentiation" (2nd ed.), by Andreas Griewank and Andrea Walther.
On Tue, May 4, 2010 at 9:23 PM, wrote:
> In [2] I didn't see anything about higher derivatives, so to get the
> Hessian I still had to do a finite difference (Jacobian) on the
> complex_step_grad. Even then the results look pretty good.
Yes, the traditional complex step method does not solve the second
derivative problem; it only gives you first derivatives directly.
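For anyone reading this in the archive, here is a minimal sketch of that
combination (the names complex_step_grad and fd_hessian are illustrative,
not the code from the earlier posts; it assumes f is a real-valued scalar
function built from numpy operations that accept complex input):

import numpy as np

def complex_step_grad(f, x, h=1e-20):
    # df/dx_k ~ Im(f(x + i*h*e_k)) / h; there is no subtractive
    # cancellation, so h can be taken extremely small
    x = np.asarray(x, dtype=float)
    g = np.empty(x.size)
    for k in range(x.size):
        xc = x.astype(complex)
        xc[k] += 1j * h
        g[k] = f(xc).imag / h
    return g

def fd_hessian(f, x, h=1e-5):
    # central finite differences of the complex-step gradient, i.e. the
    # workaround for second derivatives described above
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.empty((n, n))
    for k in range(n):
        e = np.zeros(n)
        e[k] = h
        H[:, k] = (complex_step_grad(f, x + e)
                   - complex_step_grad(f, x - e)) / (2.0 * h)
    return 0.5 * (H + H.T)  # symmetrize to average out the noise

For example, with f = lambda z: np.exp(z[0]) * np.sin(z[1]), the result of
fd_hessian(f, np.array([1.0, 2.0])) should match the analytic Hessian to
roughly the accuracy of the central difference.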
On Tue, May 4, 2010 at 8:23 PM, Guilherme P. de Freitas
wrote:
> On Tue, May 4, 2010 at 2:57 PM, Sebastian Walter
> wrote:
>> playing devil's advocate I'd say use Algorithmic Differentiation
>> instead of finite differences ;)
>> that would probably speed things up quite a lot.
>
> I would suggest that too, but aside from FuncDesigner[0] (reference at
> the end), I could [...]
I forgot to mention one thing: if you are doing optimization, a good
solution is a modeling package like AMPL (or GAMS or AIMMS, but I only
know AMPL, so I will restrict my attention to it). AMPL has a natural
modeling language and provides you with automatic differentiation.
It's not free, but the [...]
On Tue, May 4, 2010 at 2:57 PM, Sebastian Walter
wrote:
> playing devil's advocate I'd say use Algorithmic Differentiation
> instead of finite differences ;)
> that would probably speed things up quite a lot.
I would suggest that too, but aside from FuncDesigner[0] (reference at
the end), I could [...]
playing devil's advocate I'd say use Algorithmic Differentiation
instead of finite differences ;)
that would probably speed things up quite a lot.
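To make the suggestion above concrete, here is a toy sketch of forward-mode
algorithmic differentiation with dual numbers (an illustration of the idea
only, not any particular AD package):

import math

class Dual:
    # a value together with its derivative, propagated exactly by the
    # chain rule instead of being approximated by a difference quotient
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)
    return math.sin(x)

def derivative(f, x0):
    # seed the derivative with 1.0 and read off f'(x0), exact up to rounding
    return f(Dual(x0, 1.0)).der

# e.g. derivative(lambda x: x * sin(x) + 3.0, 2.0)
# equals math.sin(2.0) + 2.0 * math.cos(2.0)

Real AD packages build on this same chain rule idea (usually with reverse
mode as well), which is why they avoid the truncation error of finite
differences.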
On Tue, May 4, 2010 at 11:36 PM, Davide Lasagna wrote:
If your x data are equispaced, I would do something like this:

import numpy as np

def derive(func, x):
    """Approximate the first derivative of function func at the equispaced points x."""
    # compute the values of y = func(x)
    y = func(x)
    # compute the step
    dx = x[1] - x[0]
    # kernel array for second order accurate centered differences
    kernel = np.array([1.0, 0.0, -1.0]) / (2.0 * dx)
    # interior points: (y[i+1] - y[i-1]) / (2*dx), computed by convolution
    dy = np.convolve(y, kernel, mode='same')
    # first order one-sided differences at the two endpoints
    dy[0] = (y[1] - y[0]) / dx
    dy[-1] = (y[-1] - y[-2]) / dx
    return dy
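A quick, illustrative check of the sketch above:

x = np.linspace(0.0, 2.0 * np.pi, 201)
# away from the two endpoints, this should agree with np.cos(x)
# to roughly dx**2 (about 1e-3 here)
max_err = np.abs(derive(np.sin, x)[1:-1] - np.cos(x)[1:-1]).max()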
On Tue, May 4, 2010 at 4:06 PM, gerardob wrote:
Hello, I have written a very simple code that computes the gradient by finite
differences of any general function. Keeping the same idea, I would like to
modify the code using numpy to make it faster.
Any ideas?
Thanks.
def grad_finite_dif(self,x,user_data = None):
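For what it's worth, a sketch of one way to lean on numpy here (the free
function signature, the forward differences, and the assumption that f
takes a 1-D array and returns a scalar are mine, not from the code above):

import numpy as np

def grad_finite_diff(f, x, delta=1e-6):
    # forward-difference gradient of a scalar function f at the point x
    x = np.asarray(x, dtype=float)
    fx = f(x)
    # row k of X is x with delta added to its k-th coordinate
    X = x + delta * np.eye(x.size)
    fX = np.array([f(row) for row in X])
    return (fX - fx) / delta

This mostly tidies the bookkeeping; the cost is still one call to f per
coordinate, which is why the replies above suggest the complex step (for
accuracy) and algorithmic differentiation (for accuracy and speed).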