Dear Gavin, Gad and all,
Thank you very much for the reply!
Indeed my problem resembles the one Gavin encountered previously, and the
suggested approach of using the brglm and profileModel packages seems to
solve the problem right on target.
Right now I get a stabilized AIC and realistic predictions; I'll check
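For concreteness, a minimal sketch of the bias-reduced fit along the lines suggested above, assuming a formula object fo and a data frame dat like the ones used elsewhere in this thread (the object names and the profiling settings are placeholders, not the exact calls Maggie ran):

library(brglm)        # bias-reduced (Firth-type) logistic regression
library(profileModel) # profile-based diagnostics and confidence intervals

# Bias reduction keeps the estimates finite even when the data are separated,
# so the fit, its AIC, and the predictions stay stable.
fit.br <- brglm(fo, family = binomial(link = "logit"), data = dat)
summary(fit.br)

# Profile each coefficient; the choice of objective here is illustrative only,
# see ?profileModel for the alternatives.
prof <- profileModel(fit.br, objective = "ordinaryDeviance",
                     quantile = qchisq(0.95, 1))
plot(prof)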
On Fri, 2009-03-20 at 12:39 +1100, Gad Abraham wrote:
> Maggie Wang wrote:
> > Hi, Dieter, Gad, and all,
> >
> > Thank you very much for your reply!
> >
> > So here is my data, you can copy it into a file named "sample.txt"
>
> Hi Maggie,
>
> With this data (allowing for more iterations) I get:
Maggie Wang wrote:
Hi, Dieter, Gad, and all,
Thank you very much for your reply!
So here is my data, you can copy it into a file named "sample.txt"
Hi Maggie,
With this data (allowing for more iterations) I get:
lr <- glm(fo, family=binomial(link=logit), data=matrix,
control=glm.contr
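The call above is cut off in the archive; a complete version along the same lines would look roughly like this (the maxit value is an assumption, since the original setting is lost in the truncation; fo and matrix are the formula and data objects named in the call above):

# Raise the IRLS iteration limit so glm() either converges or gives a clear
# non-convergence warning instead of stopping at the default of 25 iterations.
lr <- glm(fo, family = binomial(link = "logit"), data = matrix,
          control = glm.control(maxit = 100))
summary(lr)   # under separation, expect very large estimates and standard errors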
On Thu, 19 Mar 2009, Maggie Wang wrote:
Dear Thomas,
Thank you very much for the answer!
Yet why does this happen only for some models, not all? That is, why can
the stepwise selection drop some variables for other models but not for
this one?
Presumably the other models don't have perfect separation.
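A small made-up illustration of the point (toy data, nothing to do with Maggie's file): when a predictor separates the 0s and 1s completely, glm() warns that fitted probabilities numerically 0 or 1 occurred and the estimate and its standard error blow up; a predictor whose values overlap across the two classes fits normally.

# toy data: x1 separates y perfectly, x2 does not
y  <- c(0, 0, 0, 0, 1, 1, 1, 1)
x1 <- 1:8                            # all 0s below all 1s: perfect separation
x2 <- c(1, 5, 2, 6, 3, 7, 4, 8)      # the two classes overlap
coef(summary(glm(y ~ x1, family = binomial)))  # huge estimate, enormous SE
coef(summary(glm(y ~ x2, family = binomial)))  # well-behaved estimate and SE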
Hi, Dieter, Gad, and all,
Thank you very much for your reply!
So here is my data, you can copy it into a file named "sample.txt"
0 -0.074 -0.098 -0.192 0.1 -0.106
0 -0.234 -0.212 -0.074 0.267 -0.122
0 -0.015 0.176 -0.061 0.179 0.178
0 -0.319 0.097 -0.122 0.08 -0.045
0 -0.106 -0.167 -0.209 -0.02
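Only the first few rows and columns of the file survive here, but assuming it is saved as plain whitespace-separated text with the response in the first column and no header line (and that the full file contains both 0s and 1s in the response), it could be read back in along these lines; the column names and the formula are made up for illustration:

# no header row, so assign names: y = response, x1, x2, ... = predictors
dat <- read.table("sample.txt", header = FALSE)
names(dat) <- c("y", paste0("x", seq_len(ncol(dat) - 1)))
fo  <- as.formula(paste("y ~", paste(names(dat)[-1], collapse = " + ")))
fit <- glm(fo, family = binomial(link = "logit"), data = dat)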
Dear Thomas,
Thank you very much for the answer!
Yet why does this happen only for some models, not all? That is, why can
the stepwise selection drop some variables for other models but not for
this one?
Thanks!!
Best regards,
Maggie
On Wed, Mar 18, 2009 at 3:38 PM, Thomas Lumley wrote:
>
>
Maggie Wang wrote:
Dear R-users,
I use glm() to do logistic regression and use stepAIC() to do stepwise model
selection.
The AIC value commonly comes out at about 100, and a good fit is as low as around
70. But for some models, the AIC went to extreme values like 1000. When I
check the P-values, all the independent variables
With 30 variables and only 55 residual degrees of freedom you probably have
perfect separation due to not having enough data. Look at the coefficients --
they are infinite, implying perfect overfitting.
-thomas
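A quick way to see what Thomas describes, on whichever model is misbehaving (a sketch; fit, fo, and dat stand in for the actual objects, which are not shown in full here):

fit <- glm(fo, family = binomial(link = "logit"), data = dat)
# Symptoms of (quasi-)separation in the ordinary maximum-likelihood fit:
# - a "fitted probabilities numerically 0 or 1 occurred" warning from glm.fit
# - coefficient estimates of absolute size 10+ with standard errors in the
#   hundreds or thousands, so every Wald p-value looks non-significant
summary(fit)$coefficients   # estimates and standard errors
range(fitted(fit))          # fitted probabilities pushed right to 0 and 1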
Maggie Wang <... at ust.hk> writes:
> I use glm() to do logistic regression and use stepAIC() to do stepwise model
> selection.
>
> The AIC value commonly comes out at about 100, and a good fit is as low as around
> 70. But for some models, the AIC went to extreme values like 1000. When I
> check the P-values,
Dear R-users,
I use glm() to do logistic regression and use stepAIC() to do stepwise model
selection.
The AIC value commonly comes out at about 100, and a good fit is as low as around
70. But for some models, the AIC went to extreme values like 1000. When I
check the P-values, all the independent variables
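For context, the workflow described in the post is presumably along these lines (a sketch under assumptions; the actual formula, data, and stepAIC() settings are not shown in the archived message):

library(MASS)                      # stepAIC() is in MASS
fit.full <- glm(y ~ ., family = binomial(link = "logit"), data = dat)
fit.step <- stepAIC(fit.full, direction = "both", trace = FALSE)
AIC(fit.step)                      # the value that jumps to ~1000 for some fits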