Anybody?
________________________________
To: r-help@r-project.org
Sent: Tue, March 2, 2010 2:23:19 PM
Subject: bigglm Memory Issues

Hi all,

I'm somewhat of a novice in terms of programming, so I thought I'd come here to seek some help with an issue I'm having. [[elided Yahoo spam]]

Here is the particular line of code that is giving me trouble:

> mod = bigglm(Pres/wt ~ Xdes, data=dat, family=poisson(), weights = ~wt,
>              maxit=100, tol=1.e-10, chunksize=(dim(Xdes)[1] + 1))

When I attempt to run this, it gives me the following error:

> Error: cannot allocate vector of size 255.9 Mb

However, I had set the maximum memory to 4095 Mb and had used only 926 Mb to that point. If I try to use a smaller chunksize, I get this error:

> Error in model.frame.default(tt, chunk) : variable lengths differ (found for 'Xdes')

Can anybody help me with this?

Thanks,
Ian
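[Not part of the original thread: one common cause of the "variable lengths differ" error is that a term in the formula (here the matrix `Xdes`) lives outside the data frame passed as `data=`. `bigglm` builds each model frame from a chunk of `data`, so a full-length matrix referenced from the calling environment no longer matches the chunk's length. A minimal, hypothetical sketch of the usual workaround, folding the design matrix's columns into the data frame first; the column names `x1`, `x2` are illustrative stand-ins for the actual columns of `Xdes`:]

```r
library(biglm)

## Fold the design matrix's columns into the data frame so that every
## variable named in the formula is chunked together by bigglm.
## (x1, x2 are placeholder names for the columns of Xdes.)
dat2 <- cbind(dat, as.data.frame(Xdes))

## A chunksize smaller than nrow(dat2) keeps per-chunk memory bounded,
## which also avoids the "cannot allocate vector" failure above.
mod <- bigglm(Pres/wt ~ x1 + x2, data = dat2,
              family = poisson(), weights = ~wt,
              maxit = 100, tol = 1e-10,
              chunksize = 5000)
```

Whether this resolves Ian's specific error depends on how `Xdes` was constructed, so treat it as a sketch rather than a confirmed fix.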