Joris,

Ridge regression is a type of regularized estimation. The least-squares objective function, (Y - Xb)^t (Y - Xb), is modified by adding a quadratic penalty, k b^t b. Because of this, the log-likelihood (equivalently, the sum of squared residuals) at the ridge solution for a fixed k does not carry its usual meaning and is not of much direct use. The key issue in this kind of regularized estimation is instead how to choose the regularization parameter k. You can see that lm.ridge gives two different ways to estimate k (there are other ways as well).
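For a concrete illustration, here is a minimal sketch of what lm.ridge reports for choosing k; the longley data and the grid of k values are just an example (borrowed from the lm.ridge help page):

library(MASS)

## fit ridge regression over a grid of penalty values k (argument 'lambda')
fit <- lm.ridge(GNP.deflator ~ ., data = longley,
                lambda = seq(0, 0.1, by = 0.001))

## the fitted object carries two estimates of k and the GCV values
fit$kHKB                        # Hoerl-Kennard-Baldwin estimate of k
fit$kLW                         # Lawless-Wang estimate of k
fit$lambda[which.min(fit$GCV)]  # k minimizing generalized cross-validation

## select() prints these suggestions in one go
select(fit)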
Ravi.

____________________________________________________________________
Ravi Varadhan, Ph.D.
Assistant Professor, Division of Geriatric Medicine and Gerontology
School of Medicine
Johns Hopkins University

Ph. (410) 502-2619
email: rvarad...@jhmi.edu

----- Original Message -----
From: joris meys <jorism...@gmail.com>
Date: Tuesday, March 17, 2009 7:37 pm
Subject: [R] Likelihood of a ridge regression (lm.ridge)?
To: R-help Mailing List <r-help@r-project.org>

> Dear all,
>
> I want to get the likelihood (or AIC or BIC) of a ridge regression model
> using lm.ridge from the MASS library. Yet I can't really find it. As
> lm.ridge does not return a standard fit object, it doesn't work with
> functions like BIC (nlme package). Is there a way around it? I would
> calculate it myself, but I'm not sure how to do that for a ridge regression.
>
> Thank you in advance
> Kind regards
> Joris