At the "solution" -- which nlm seems to find OK -- you have a very nasty scaling issue. exp(z) has value > 10^300.
But you should look at the function carefully and run a number of checks to make sure you can actually evaluate it. And I would not trust the results if you cannot get the analytic gradient of your function. If you have the gradient, then you can just take a numerical Jacobian of it to get the Hessian. numDeriv has a jacobian() function that works nicely for this, and you are then doing only one level of numerical approximation (a rough sketch follows below the quoted message). However, if that language doesn't mean anything to you, you probably should not be attempting this problem yourself.

JN

On 16-04-06 02:31 PM, Alaa Sindi wrote:
> hello all,
>
> I am getting wrong estimates from this code. do you know what could be the
> problem.
>
> thanks
>
> x <- c(1.6, 1.7, 1.7, 1.7, 1.8, 1.8, 1.8, 1.8)
> y <- c( 6, 13, 18, 28, 52, 53, 61, 60)
> n <- c(59, 60, 62, 56, 63, 59, 62, 60)
>
> DF <- data.frame(x, y, n)
>
> # note: there is no need to have the choose(n, y) term in the likelihood
> fn <- function(p, DF) {
>   z <- p[1] + p[2]*DF$x
>   sum( - (DF$y*z) - DF$n*log(1+exp(z)))
>
>   #sum( - (y*(p[1]+p[2]*x) - n*log(1+exp(p[1]+p[2]*x))) )
> }
> out <- nlm(fn, p = c(1,1), DF, hessian = TRUE, print.level=2)
> print(out)
> eigen(out$hessian)
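Here is the sort of thing I mean about the gradient and Hessian -- a rough, untested sketch, not code I have run. For the negative log-likelihood sum(n*log(1 + exp(z)) - y*z) with z = p[1] + p[2]*x, the partial derivatives are sum(n*plogis(z) - y) with respect to p[1] and sum((n*plogis(z) - y)*x) with respect to p[2], and jacobian() from numDeriv applied to that gradient gives a numerical Hessian:

# Analytic gradient of the negative log-likelihood (untested sketch);
# plogis(z) = exp(z)/(1 + exp(z)) is evaluated stably by base R.
gr <- function(p, DF) {
  z <- p[1] + p[2]*DF$x
  r <- DF$n*plogis(z) - DF$y
  c(sum(r), sum(r*DF$x))
}

# Numerical Jacobian of the analytic gradient = numerical Hessian,
# so only one level of numerical approximation is involved.
# outb is the nlm() result from the stable objective sketched above.
library(numDeriv)
H <- jacobian(gr, x = outb$estimate, DF = DF)
eigen(H)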