Right, I've worked this one out: the problem is that the example (quoted
below) I was using to test the package only contains three variables. Add a
fourth and it works fine:

    d <- runif(30) # a fourth dummy covariate

Then run again, telling it to use the GA:

    babs <- glmulti(y~a*b*c*d, level = 2, fitfunc = lmer.glmulti,
                    random = "+(1|x)", method = "g")

Returns:

    ...

    After 190 generations:
    Best model: y~1
    Crit= 159.374382952181
    Mean crit= 163.380382861026
    Improvements in best and average IC have been below the specified goals.
    Algorithm is declared to have converged.
    Completed.

Using glmulti out-of-the-box with a GLM gives the same result if you try to
use the GA with too few variables. This isn't really a problem in practice:
with only three variables an exhaustive search is feasible anyway. The
problem was the example, not the package.
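
If anyone wants to check this with a plain GLM (i.e. glmulti's default
fitting function, no lme4 wrapper), here is a self-contained sketch; the
object name glms is just illustrative:

    # default fitfunc is glm, so no wrapper is needed
    library(glmulti)
    y <- runif(30, 0, 10) # mock dependent variable
    a <- runif(30); b <- runif(30) # dummy covariates
    c <- runif(30); d <- runif(30) # two more
    # with four variables the GA converges here; with fewer it can loop
    # as described above
    glms <- glmulti(y ~ a*b*c*d, level = 2, method = "g")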

Thomas


------- Original message -------
From: Thomas <tf...@bath.ac.uk>
To: r-help@r-project.org
Cc:
Subject: glmulti runs indefinitely when using genetic algorithm with lme4
Date: Sun, 02 Sep 2012 11:23:23 +0100

Dear List,

I'm using glmulti for model averaging in R. There are ~10 variables in my
model, which makes exhaustive screening impractical, so I need to use the
genetic algorithm (GA) instead (method = "g").

I need to include random effects, so I'm using glmulti as a wrapper for
lme4. Methods for doing this are described at
http://www.inside-r.org/packages/cran/glmulti/docs/glmulti, and a pdf
included with the glmulti package goes into more detail. The problem is
that when I tell glmulti to use the GA in this setting, it runs
indefinitely, even after the best model has been found.

This is the example taken from the pdf included in the glmulti package:

      library(lme4)
      library(glmulti)

      # create a function for glmulti to act as a wrapper for lmer:
      lmer.glmulti <- function(formula, data, random = "", ...) {
        # paste the random-effects term onto the fixed-effects formula
        lmer(paste(deparse(formula), random), data = data, REML = FALSE, ...)
      }

      # set some random variables:
      y <- runif(30, 0, 10) # mock dependent variable
      a <- runif(30) # dummy covariate
      b <- runif(30) # another dummy covariate
      c <- runif(30) # and another one
      x <- as.factor(round(runif(30), 1)) # dummy grouping factor

      # run exhaustive screening with lmer:
      bab <- glmulti(y~a*b*c, level = 2, fitfunc = lmer.glmulti,
                     random = "+(1|x)")

The exhaustive run works fine. The problem is when I tell glmulti to use
the genetic algorithm:

      babs <- glmulti(y~a*b*c, level = 2, fitfunc = lmer.glmulti,
                      random = "+(1|x)", method = "g")

It just keeps running indefinitely and the AIC does not change:

      After 19550 generations:
      Best model: y~1
      Crit= 161.038899734164
      Mean crit= 164.13629335762
      Change in best IC: 0 / Change in mean IC: 0

      After 19560 generations:
      Best model: y~1
      Crit= 161.038899734164
      Mean crit= 164.13629335762
      Change in best IC: 0 / Change in mean IC: 0

      After 19570 generations:
      Best model: y~1
      Crit= 161.038899734164
      Mean crit= 164.13629335762

      etc.

I have tried using calls that tell glmulti when to stop (deltaB = 0,
deltaM = 0.01, conseq = 6), but nothing seems to work. I think the problem
must lie in how I have set up the wrapper function (?). It may be something
really obvious, but I'm new to R and can't work it out.
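
For reference, this is the shape of the GA call with those stopping
arguments (a sketch, same wrapper and data as above):

      babs <- glmulti(y~a*b*c, level = 2, fitfunc = lmer.glmulti,
                      random = "+(1|x)", method = "g",
                      deltaB = 0,    # goal for change in best IC
                      deltaM = 0.01, # goal for change in mean IC
                      conseq = 6)    # consecutive checks meeting the goals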

I am using R v.2.15.0, glmulti v.1.0.6 and lme4 v.0.999999-0 on Windows.

Any help with this would be much appreciated.

Thomas
