Peter,

I see there is no mistake. The phrase about the 'number of parameters' confused 
me; it is a little ambiguous.
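For the archives, here is a quick numerical check of that identity in a linear 
model, fitted both with nlm() (minimising the residual sum of squares, as in 
the manual's example) and with lm(). It is only a sketch, and the variable 
names are my own:

```r
# Check that the manual's SE formula agrees with lm()'s standard errors.
# The leading 2 cancels the factor 2 in the Hessian of the RSS (H = 2 X'X);
# the 2 inside length(y) - 2 is the number of parameters.
set.seed(1)
x <- 1:20
y <- 2 + 3 * x + rnorm(20)

rss <- function(p) sum((y - p[1] - p[2] * x)^2)  # residual sum of squares
out <- nlm(rss, c(1, 1), hessian = TRUE)         # out$minimum is the RSS

se_nlm <- sqrt(diag(2 * out$minimum / (length(y) - 2) * solve(out$hessian)))
se_lm  <- summary(lm(y ~ x))$coefficients[, "Std. Error"]

print(rbind(se_nlm, se_lm))  # the two rows should agree closely
```

The two sets of standard errors agree to within the accuracy of nlm()'s 
numerical Hessian.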
Many thanks for taking the time to help me.

Geoff



> On 5 Mar 2014, at 11:20, "Peter Dalgaard-2 [via R]" 
> <ml-node+s789695n4686243...@n4.nabble.com> wrote:
> 
> 
> On 04 Mar 2014, at 21:21, Geoff Loveman <[hidden email]> wrote: 
> 
> > 
> > 
> > In 'An Introduction to R', section 11.7 on nonlinear least squares fitting, 
> > the following example is given for obtaining the standard errors of the 
> > estimated parameters: 
> > 
> > "To obtain the approximate standard errors (SE) of the estimates we do: 
> > sqrt(diag(2*out$minimum/(length(y) - 2) * solve(out$hessian))) 
> > The 2 in the line above represents the number of parameters." 
> > 
> > I know the inverted Hessian is multiplied by the mean square error, and that 
> > the denominator of the MSE is the degrees of freedom (number of samples minus 
> > number of parameters), but why does the numerator of the MSE (which is the 
> > RSS) get multiplied by the number of parameters? I have read through 
> > explanations of the method for obtaining the SEs, but I don't see where the 
> > MSE gets multiplied by the number of parameters, or why this is needed, as 
> > shown in the example. 
> >
> 
> 
> There are two 2's in that line, and I'd expect that only the last one has to 
> do with the number of parameters; the other one has to do with whether the 
> Hessian is the second derivative of the sum of squares or of the negative 
> log-likelihood function (half the sum of squares). 
> 
> Quick check: In a linear model, we have 
> 
> ssd = ||Y - X beta||^2 
> gradient = -2 (Y - X beta)' X 
> Hessian H = 2 X'X 
> 
> and, as we know, V(beta) = sigma^2 (X'X)^-1 = 2 sigma^2 H^-1 
> 
> -pd 
> 
> > Thanks for any help! 
> > 
> > Geoff Loveman 
> > Tech lead SMERAS 
> > QQ Maritime Life Support 
> > 
> > 
> > 
> > 
> > 
> 
> -- 
> Peter Dalgaard, Professor 
> Center for Statistics, Copenhagen Business School 
> Solbjerg Plads 3, 2000 Frederiksberg, Denmark 
> Phone: (+45)38153501 
> Email: [hidden email]  Priv: [hidden email] 
> 




--
View this message in context: 
http://r.789695.n4.nabble.com/Is-this-a-mistake-in-An-Introduction-to-R-tp4686217p4686291.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
