I'm trying to run bGLMM (from the GMMBoost package) on a large dataset, and it fails while trying to allocate huge vectors, with the error:
Error: cannot allocate vector of size 553.7 Gb
The dataset I'm using has around 1.5 million observations. The command is as follows:
boost1 <- bGLMM(A ~ B + C, rnd = list(D = ~1), data = train, family = binomial)
A is binary, B and C are continuous, and D is a factor with 250,000 levels that I'm trying to model as a random effect (a random intercept).
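In case it helps, here is a scaled-down, simulated stand-in for the structure of my data. The sizes below are small placeholders (the real data has ~1.5 million rows and 250,000 levels of D), so this version should not reproduce the memory failure:

## Simulated stand-in for the data layout; sizes are placeholders, not my real data
set.seed(1)
n    <- 2000   # real data: ~1,500,000 rows
nlev <- 200    # real data: 250,000 levels of D
train <- data.frame(
  A = rbinom(n, 1, 0.5),                       # binary response
  B = rnorm(n),                                # continuous covariate
  C = rnorm(n),                                # continuous covariate
  D = factor(sample(nlev, n, replace = TRUE))  # grouping factor used as the random effect
)
library(GMMBoost)
boost1 <- bGLMM(A ~ B + C, rnd = list(D = ~1), data = train,
                family = binomial(link = "logit"))  # link written out explicitly here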
A similar model (random intercept for D) runs fine with lme4, along the lines of the call below. Is my dataset simply too large for this package, or am I doing something obviously wrong?
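This is roughly the lme4 call (reconstructed from memory, so the exact syntax may differ slightly from what I ran):

## Roughly the lme4 equivalent, which fits without memory problems on the full data
library(lme4)
fit_lme4 <- glmer(A ~ B + C + (1 | D), data = train, family = binomial)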
          