Hi Dimitri,

Your problem has little to do with local versus global optima.  You can 
convince yourself that the solution you got is not even a local optimum by 
checking the gradient at the solution.
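
A quick numerical check is easy, for instance with the numDeriv package 
(here `myfunc' is your objective, and `myopt' is assumed to be the result 
object from your original fit):

library(numDeriv)
# at a local minimum of a smooth function the gradient should be ~ 0;
# a clearly nonzero gradient means the solver stopped short
grad(func=myfunc, x=myopt$par)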

The main issue is that your objective function is not differentiable 
everywhere.  So, you have two options: either you use a smooth objective 
function (e.g. squared residuals) or you use an optimization algorithm that 
can handle non-smooth objective functions.
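
To illustrate the first option, here is a minimal sketch of a least-squares 
objective (the model in `pred' and the data names `DV' and `IV' are 
placeholders for whatever your actual model and data are):

# squared residuals are differentiable everywhere,
# unlike a sum of absolute residuals
myfunc_sq <- function(par) {
    pred <- par[1] * IV / (par[2] + IV)   # placeholder model
    sum((DV - pred)^2)
}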

Here I show the second option: your problem is well solved by the `nmkb' 
function (a bound-constrained version of the Nelder-Mead simplex method) from 
the "dfoptim" package.

library(dfoptim)

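> # bound-constrained Nelder-Mead; lower=0 applies to both parameters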
> myopt2 <- nmkb(fn=myfunc, par=c(0.1,max(IV)), lower=0)
> myopt2
$par
[1] 8.897590e-01 9.470163e+07

$value
[1] 334.1901

$feval
[1] 204

$restarts
[1] 0

$convergence
[1] 0

$message
[1] "Successful convergence"

Then, there is also the issue of scaling: your problem is poorly scaled.  
Look how different the two parameters are - they are about 7 orders of 
magnitude apart (roughly 0.89 versus 9.5e+07).  You are really asking for 
trouble here; most optimizers work best when all parameters are of comparable 
magnitude.
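
A simple remedy is to optimize in rescaled units so that both parameters are 
of order 1.  A sketch (the 1e7 factor is just read off the fit above):

# search the second parameter in units of 1e7
myfunc_scaled <- function(p) myfunc(c(p[1], p[2] * 1e7))
myopt3 <- nmkb(fn=myfunc_scaled, par=c(0.1, max(IV)/1e7), lower=0)
myopt3$par * c(1, 1e7)   # back-transform to the original scale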

Hope this is helpful,
Ravi.

-------------------------------------------------------
Ravi Varadhan, Ph.D.
Assistant Professor
Division of Geriatric Medicine and Gerontology
School of Medicine
Johns Hopkins University

Ph. (410) 502-2619
email: rvarad...@jhmi.edu

