On Behalf Of Bert Gunter
Sent: Tuesday, August 03, 2010 4:52 PM
To: Michael Haenlein
Cc: r-help@r-project.org
Subject: Re: [R] Collinearity in Moderated Multiple Regression
"biased regression coefficients" is nonsense. The coefficients are
unbiased: their expectation (in the appropri
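[A minimal simulation sketch of this point, added here for illustration; it is not part of Gunter's message and the data are invented. Even though x1*x2 is correlated with x1 and x2, the OLS estimates average out to the true coefficients:]

set.seed(42)  # hypothetical simulation, not from the thread
est <- replicate(2000, {
  x1 <- runif(50); x2 <- runif(50)              # x1*x2 correlates with x1 and x2
  y  <- 1 + 2*x1 + 3*x2 + 4*x1*x2 + rnorm(50)   # true coefficients 1, 2, 3, 4
  coef(lm(y ~ x1 * x2))
})
rowMeans(est)  # each estimate averages close to its true value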
haenl...@gmail.com wrote:
> I'm sorry -- I think I chose a bad example. Let me start over again:
>
> I want to estimate a moderated regression model of the following form:
> y = a*x1 + b*x2 + c*x1*x2 + e
>
> Based on my understanding, including an interaction term (x1*x2) into the
> regression in addition to x1 and x2 leads to issues [...] in any
> case as it leads to biased regression coefficients (which
> is what I feared).
>
> Thanks,
>
> Michael
>
>
>
> -----Original Message-----
> From: Bert Gunter [mailto:gunter.ber...@gene.com]
> Sent: Tuesday, August 03, 2010 22:37
> To: Dennis Murphy
> Cc: haenl...@gmail.com; r-help@r-project.org
From: Bert Gunter [mailto:gunter.ber...@gene.com]
Sent: Tuesday, August 03, 2010 22:37
To: Dennis Murphy
Cc: haenl...@gmail.com; r-help@r-project.org
Subject: Re: [R] Collinearity in Moderated Multiple Regression
Absolutely right.
But I think it's also worth adding that when the predictors _are_
correlated, the estimates of their coefficients depend on which are
included in the model. This means that one should generally not try to
interpret the individual coefficients, e.g. as a way to assess their
relative importance.
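[An added sketch of that point, not Gunter's code; the data are invented. With x1 and x2 strongly correlated, the estimate for x1 shifts substantially once x2 enters the model:]

set.seed(1)
x1 <- rnorm(100)
x2 <- 0.8 * x1 + rnorm(100, sd = 0.3)  # x2 strongly correlated with x1
y  <- 1 + 2*x1 + 3*x2 + rnorm(100)     # true x1 coefficient is 2
coef(lm(y ~ x1))        # x1 alone: its coefficient absorbs x2's effect (~4.4)
coef(lm(y ~ x1 + x2))   # with x2 included: x1's coefficient returns to ~2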
Hi:
On Tue, Aug 3, 2010 at 6:51 AM, haenl...@gmail.com wrote:
> I'm sorry -- I think I chose a bad example. Let me start over again:
>
> I want to estimate a moderated regression model of the following form:
> y = a*x1 + b*x2 + c*x1*x2 + e
>
No intercept? What's your null model, then?
>
> Based on my understanding, including an interaction term (x1*x2) into the
> regression in addition to x1 and x2 leads to issues [...]
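[A side note on the intercept question, added for illustration and not Murphy's code; the data are invented. R's formula interface includes an intercept unless it is removed explicitly:]

set.seed(2)
x1 <- runif(30); x2 <- runif(30)
y  <- 2*x1 + 3*x2 + 4*x1*x2 + rnorm(30)
coef(lm(y ~ x1 * x2))      # fits (Intercept), x1, x2 and x1:x2
coef(lm(y ~ x1 * x2 - 1))  # the intercept-free model as literally written above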
On Behalf Of Michael Haenlein
Sent: Tuesday, August 03, 2010 10:44 AM
To: 'Nikhil Kaza'
Cc: r-help@r-project.org
Subject: Re: [R] Collinearity in Moderated Multiple Regression
Thanks very much -- it seems that Ridge Regression can do what I'm looking for!
Best,
Michael
Subject: Re: [R] Collinearity in Moderated Multiple Regression
My usual strategy of dealing with multicollinearity is to drop the offending
variable or transform one of them. I would also check the vif functions in the
car and Design packages.
I think you are looking for lm.ridge in the MASS package.
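[A minimal sketch of both suggestions, added for illustration -- this is not Kaza's code, and the data are invented. vif() is from the car package; lm.ridge() and select() are from MASS:]

library(car)    # vif()
library(MASS)   # lm.ridge(), select()

set.seed(3)
x1 <- runif(100); x2 <- runif(100)
y  <- 1 + 2*x1 + 3*x2 + 4*x1*x2 + rnorm(100)

vif(lm(y ~ x1 + x2 + I(x1 * x2)))  # variance inflation factors per term
fit <- lm.ridge(y ~ x1 + x2 + I(x1 * x2), lambda = seq(0, 10, by = 0.1))
select(fit)                        # HKB, L-W and GCV suggestions for lambda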
Nikhil Kaza
Asst. Professor,
City and Regional Planning
University of North Carolina
nikhil.l...@gmail.com
On Aug 3, 2010, at 9:51 AM, haenl...@gmail.com wrote:
> I'm sorry -- I think I chose a bad example. Let me start over again:
>
> I want to estimate a moderated regression model of the following form:
> y = a*x1 + b*x2 + c*x1*x2 + e
>
> Based on my understanding, including an interaction term (x1*x2) into the
> regression in addition to x1 and x2 leads to issues [...]
I think you are attributing to "collinearity" a problem that is due to
your small sample size. You are predicting 9 points with 3 predictor
terms, and incorrectly concluding that there is some "inconsistency"
because you get an R^2 that is above some number you deem surprising.
(I got values [...])
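[A sketch of that diagnosis, added for illustration and not the poster's code, using the 3x3 design from the original message below: even pure-noise responses fitted with three predictor terms over nine points give an R^2 of roughly 3/8 on average.]

set.seed(4)
x1 <- rep(1:3, each = 3)   # same design values as in the original post
x2 <- rep(1:3, times = 3)
r2 <- replicate(1000, summary(lm(rnorm(9) ~ x1 + x2 + I(x1 * x2)))$r.squared)
mean(r2)   # about 0.37: the null expectation is p/(n-1) = 3/8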
Thanks for your comment!
Actually, they are continuous variables with a very low correlation --
I just wanted to make the whole story easier to explain.
My general question is: Does R offer an alternative to lm for situations
where there is substantial collinearity between the independent variables?
Are x1 and x2 factors (dummy variables)? cor does not make sense
in that case.
Nikhil Kaza
Asst. Professor,
City and Regional Planning
University of North Carolina
nikhil.l...@gmail.com
On Aug 3, 2010, at 9:10 AM, Michael Haenlein wrote:
Dear all,
I have one dependent variable y and two independent variables x1 and x2
which I would like to use to explain y. x1 and x2 are design factors in an
experiment and are not correlated with each other. For example assume that:
x1 <- rbind(1,1,1,2,2,2,3,3,3)
x2 <- rbind(1,2,3,1,2,3,1,2,3)
cor(x1, x2) [...]
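[An added illustration of the underlying issue -- this continuation is mine, not the rest of the original message. With these design factors the raw product x1*x2 is substantially correlated with x1, and mean-centering before forming the product removes that correlation in this balanced design:]

x1 <- rep(1:3, each = 3)    # same values as the rbind() calls above
x2 <- rep(1:3, times = 3)
cor(x1, x2)                 # 0: the design factors are uncorrelated
cor(x1, x1 * x2)            # about 0.68: the product correlates with x1
x1c <- x1 - mean(x1)        # mean-center before forming the product
x2c <- x2 - mean(x2)
cor(x1c, x1c * x2c)         # exactly 0 in this balanced design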