Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-18 Thread Konstantin Berlin
The general problem is not computationally tractable. You can try stochastic algorithms, like simulated annealing or genetic programming, but results depend on the problem. There is no point in computing derivatives in that case either.
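As a toy illustration of what such a stochastic search looks like (plain Java, no library; the objective, cooling schedule, and seed are invented for the example, and the result may be only a local minimum):

import java.util.Random;

public class AnnealingSketch {
    // A non-convex objective with several local minima.
    static double f(double x) {
        return x * x + 10 * Math.sin(3 * x);
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        double x = 5.0, fx = f(x);
        for (double temp = 10.0; temp > 1e-4; temp *= 0.99) {
            double candidate = x + temp * rng.nextGaussian(); // random step
            double fc = f(candidate);
            // Always accept downhill moves; accept uphill moves with
            // probability exp(-(fc - fx) / temp) (Metropolis criterion).
            if (fc < fx || rng.nextDouble() < Math.exp((fx - fc) / temp)) {
                x = candidate;
                fx = fc;
            }
        }
        System.out.printf("x = %f, f(x) = %f%n", x, fx);
    }
}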

Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-18 Thread Ajo Fod
Looks like JOptimizer is restricted to solving convex problems. My application is to minimize a generic non-linear function with linear constraints. Do you know of anything that does that? -Ajo
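One generic fallback for that setting (a sketch only, assuming a local solution is acceptable) is a quadratic-penalty method: fold each linear constraint into the objective as a penalty term and minimize without constraints, increasing the penalty weight. The objective and constraint below are invented for illustration:

public class PenaltySketch {
    // Nonlinear objective f(x) = cos(x) + 0.1 x^2 (made up for the example)
    static double f(double x)      { return Math.cos(x) + 0.1 * x * x; }
    static double fPrime(double x) { return -Math.sin(x) + 0.2 * x; }
    // Linear constraint g(x) = x - 1 <= 0, i.e. x <= 1
    static double g(double x)      { return x - 1.0; }

    public static void main(String[] args) {
        double x = 3.0; // infeasible starting point
        for (double mu = 1.0; mu <= 1e6; mu *= 10) {
            // Minimize f(x) + mu * max(0, g(x))^2 by gradient descent;
            // the step size is matched to the penalty curvature (~2*mu)
            // so the inner loop stays stable as mu grows.
            double step = 0.5 / (1.0 + 2.0 * mu);
            for (int i = 0; i < 10_000; i++) {
                double grad = fPrime(x) + 2.0 * mu * Math.max(0.0, g(x));
                x -= step * grad;
            }
        }
        // Ends near the boundary x = 1, a local constrained minimum.
        System.out.printf("x = %.6f, f(x) = %.6f, g(x) = %.2e%n", x, f(x), g(x));
    }
}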

Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-16 Thread Ajo Fod
If you're talking about the analysis.differentiation package, I'm looking forward to using it. Though my main point in this thread was that if one can compute the Hessian, it should be possible in certain cases to speed up convergence rather than depending on an update rule to estimate an *approximate* Hessian.
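Concretely, with an exact Hessian one can take full Newton steps, solving H s = -g at each iterate, instead of accumulating a quasi-Newton approximation over iterations. A bare-bones sketch using commons-math3 linear algebra (the objective, gradient, and Hessian are hand-coded for illustration; a convex quadratic converges in one step):

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.ArrayRealVector;
import org.apache.commons.math3.linear.LUDecomposition;
import org.apache.commons.math3.linear.RealVector;

public class NewtonSketch {
    public static void main(String[] args) {
        // f(x, y) = (x - 1)^2 + 2 (y + 0.5)^2
        double x = 4.0, y = 3.0;
        for (int k = 0; k < 5; k++) {
            // Analytic gradient and Hessian.
            ArrayRealVector grad = new ArrayRealVector(
                    new double[] { 2 * (x - 1), 4 * (y + 0.5) });
            Array2DRowRealMatrix hess = new Array2DRowRealMatrix(
                    new double[][] { { 2, 0 }, { 0, 4 } });
            // Newton step: solve H s = -g.
            RealVector step = new LUDecomposition(hess).getSolver()
                    .solve(grad.mapMultiply(-1));
            x += step.getEntry(0);
            y += step.getEntry(1);
            System.out.printf("iter %d: x=%.6f y=%.6f%n", k, x, y);
        }
    }
}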

Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-16 Thread Luc Maisonobe
On 16/08/2013 18:55, Ajo Fod wrote: > The algorithm computes the Hessian using an update rule. My question was > what if you can compute the Hessian analytically? > > Hessian: http://en.wikipedia.org/wiki/Hessian_matrix > Gradient: http://en.wikipedia.org/wiki/Gradient We do have support for this in the analysis.differentiation package.
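For illustration, a minimal example of that API in commons-math3, using DerivativeStructure to get exact first and second partial derivatives (the function and evaluation point are made up):

import org.apache.commons.math3.analysis.differentiation.DerivativeStructure;

public class HessianDemo {
    public static void main(String[] args) {
        // Two free parameters, derivatives up to order 2;
        // x is variable 0, y is variable 1.
        DerivativeStructure x = new DerivativeStructure(2, 2, 0, 1.5);
        DerivativeStructure y = new DerivativeStructure(2, 2, 1, -0.5);

        // f(x, y) = x^2 * y + sin(y)
        DerivativeStructure f = x.pow(2).multiply(y).add(y.sin());

        double value   = f.getValue();                   // f(1.5, -0.5)
        double dfdx    = f.getPartialDerivative(1, 0);   // 2 x y
        double d2fdxdy = f.getPartialDerivative(1, 1);   // 2 x
        double d2fdx2  = f.getPartialDerivative(2, 0);   // 2 y
        System.out.printf("f=%f df/dx=%f d2f/dxdy=%f d2f/dx2=%f%n",
                          value, dfdx, d2fdxdy, d2fdx2);
    }
}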

Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-16 Thread Ajo Fod
The algorithm computes the Hessian using an update rule. My question was what if you can compute the Hessian analytically? Hessian: http://en.wikipedia.org/wiki/Hessian_matrix Gradient: http://en.wikipedia.org/wiki/Gradient Cheers, -Ajo

Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-16 Thread Luc Maisonobe
On 15/08/2013 22:59, Ajo Fod wrote: > Hello, > > Isn't there an advantage to being able to compute the Jacobian of the > gradient precisely at a point? > > If so, is there a class that uses the Jacobian instead of estimating the > Jacobian from the last few iterations as NonLinearConjugateGradientOptimizer does?

Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-15 Thread Ajo Fod
Yes, that works. Thanks!

Re: [math] Using the hessian in scalar unconstrained optimization

2013-08-15 Thread Konstantin Berlin
There would be an advantage, true. I don't know if Commons has one (doesn't look like it). You can also try http://www.joptimizer.com/

[math] Using the hessian in scalar unconstrained optimization

2013-08-15 Thread Ajo Fod
Hello, Isn't there an advantage to being able to compute the Jacobian of the gradient precisely at a point? If so, is there a class that uses the Jacobian instead of estimating the Jacobian from the last few iterations as NonLinearConjugateGradientOptimizer does? Thanks, -Ajo
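For what it's worth, the "Jacobian of the gradient" is exactly the Hessian; a quick central-difference check on a made-up function shows the identity numerically:

public class HessianCheck {
    // Analytic gradient of f(x, y) = x^2 y + sin(y).
    static double[] grad(double x, double y) {
        return new double[] { 2 * x * y, x * x + Math.cos(y) };
    }

    public static void main(String[] args) {
        double x = 1.5, y = -0.5, h = 1e-5;
        // The Hessian is the Jacobian of the gradient: differentiate each
        // gradient component with central differences.
        double[] gxp = grad(x + h, y), gxm = grad(x - h, y);
        double[] gyp = grad(x, y + h), gym = grad(x, y - h);
        double[][] hess = {
            { (gxp[0] - gxm[0]) / (2 * h), (gyp[0] - gym[0]) / (2 * h) },
            { (gxp[1] - gxm[1]) / (2 * h), (gyp[1] - gym[1]) / (2 * h) }
        };
        // Analytic Hessian for comparison: [[2y, 2x], [2x, -sin(y)]].
        System.out.printf("numeric:  [[%.4f, %.4f], [%.4f, %.4f]]%n",
                hess[0][0], hess[0][1], hess[1][0], hess[1][1]);
        System.out.printf("analytic: [[%.4f, %.4f], [%.4f, %.4f]]%n",
                2 * y, 2 * x, 2 * x, -Math.sin(y));
    }
}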