Greg Snow wrote:
-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]project.org] On Behalf Of Frank E Harrell Jr
Sent: Saturday, September 27, 2008 7:15 PM
To: Darin Brooks
Cc: [EMAIL PROTECTED]; [EMAIL PROTECTED];
[EMAIL PROTECTED]
Subject: Re: [R] FW: logistic regression

Darin Brooks wrote:
Glad you were amused.

I assume that "booking this as a fortune" means that this was an idiotic way to model the data?
Dieter was nominating this for the "fortunes" package in R.  (Thanks Dieter)

MARS?  Boosted Regression Trees?  Any of these a better choice to extract significant predictors (from a list of about 44) for a measured dependent variable?
Or use a data reduction method (principal components, variable clustering, etc.) or redundancy analysis (to remove individual predictors before examining associations with Y), or fit the full model using penalized maximum likelihood estimation.  The lasso and lasso-like methods are also worth pursuing.
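To make the penalized-selection idea concrete: here is a minimal, purely illustrative sketch (in Python with scikit-learn rather than R, and on synthetic data invented for this example) of how a lasso fit shrinks most of ~44 candidate coefficients to exactly zero, doing selection and estimation in one penalized step instead of stepwise hunting for "significant" predictors.

```python
# Illustrative sketch only: lasso-style penalized estimation on synthetic
# data with 44 candidate predictors, of which only 5 truly matter.
# All names and numbers here are hypothetical, not from the original thread.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 500, 44
X = rng.normal(size=(n, p))
true_beta = np.zeros(p)
true_beta[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]   # only the first 5 are real signals
y = X @ true_beta + rng.normal(scale=1.0, size=n)

# Cross-validated lasso: the L1 penalty zeroes out most coefficients,
# so "selection" falls out of a single penalized fit.
fit = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(fit.coef_)
print("nonzero coefficients at indices:", selected)
```

For a binary outcome the same idea applies with a penalized logistic model (e.g. `LogisticRegression(penalty="l1")` in scikit-learn, or glmnet in R).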

Frank (and any others who want to share an opinion):

What are your thoughts on model averaging as part of the above list?

Model averaging has good performance but no advantage over fitting a single complex model using penalized maximum likelihood estimation.
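The two routes contrasted above can be sketched side by side. This is a hypothetical toy comparison (Python, synthetic data, arbitrary penalty strength), not a demonstration of Frank's claim: it just shows what "averaging many refit models" versus "one penalized fit" each look like operationally.

```python
# Hypothetical sketch: bootstrap model averaging vs. a single penalized
# (ridge) fit on the same synthetic data. Numbers are invented for
# illustration; which approach wins is context-dependent.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(1)
n, p = 200, 44
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [1.5, -1.0, 0.8, 0.5, -0.5]
y = X @ beta + rng.normal(size=n)
X_test = rng.normal(size=(1000, p))
y_test = X_test @ beta + rng.normal(size=1000)

# Model averaging: refit an unpenalized model on bootstrap resamples
# and average the resulting predictions.
B = 50
preds = np.zeros(len(y_test))
for _ in range(B):
    idx = rng.integers(0, n, n)
    m = LinearRegression().fit(X[idx], y[idx])
    preds += m.predict(X_test) / B

# Single complex model with shrinkage: one ridge fit on the full data.
ridge = Ridge(alpha=10.0).fit(X, y)

mse_avg = np.mean((preds - y_test) ** 2)
mse_ridge = np.mean((ridge.predict(X_test) - y_test) ** 2)
print(f"bootstrap-averaged MSE: {mse_avg:.3f}  ridge MSE: {mse_ridge:.3f}")
```

Both stabilize the fit relative to a single unpenalized model; the penalized fit does so in one estimation step, which is part of Frank's point.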

Frank



--
Gregory (Greg) L. Snow Ph.D.
Statistical Data Center
Intermountain Healthcare
[EMAIL PROTECTED]
801.408.8111





--
Frank E Harrell Jr   Professor and Chair           School of Medicine
                     Department of Biostatistics   Vanderbilt University

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
