Thomas, I tried biglm and it does not work; see
http://r.789695.n4.nabble.com/unable-to-get-bigglm-working-ATTN-Thomas-Lumley-td2276524.html#a2278381 . There are other posts from people who cannot get biglm working, and others who get strange results. Please advise if you can help. I have row-based native code that works, but it is inconvenient because it does not produce an R object that can be fed to anova() etc. I offered it to the developer forum, but the message is still awaiting moderator approval.

Regards,
Stephen B

-----Original Message-----
From: Thomas Lumley [mailto:tlum...@uw.edu]
Sent: Friday, March 30, 2012 7:32 PM
To: Bond, Stephen
Cc: r-help@r-project.org
Subject: Re: [R] ff usage for glm

On Sat, Mar 31, 2012 at 9:05 AM, Bond, Stephen <stephen.b...@cibc.com> wrote:
> Greetings useRs,
>
> Can anyone provide an example of how to use ff to feed a very large data frame to glm()?
> The data frame cannot be loaded into R with a conventional read.csv, as it is too big.
>
> glm(..., data = ff.file) ??

I shouldn't think glm() will work on data that are too big to read into R. However, bigglm() from the biglm package should work. You just need to write a function that supplies chunks of data from ff.file as requested (see the example on ?bigglm). I haven't used ff, but it looks from the documentation as though chunk() will do all the difficult parts.

   -thomas

--
Thomas Lumley
Professor of Biostatistics
University of Auckland

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
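Thomas's suggestion above, a function that hands bigglm() one chunk of data at a time, can be sketched as follows. This is a minimal sketch using base R only: it reads a plain CSV in chunks rather than using ff, and the file name, formula, and chunk size are placeholders, not anything from the thread. The callback contract is the one documented on ?bigglm: called with reset = TRUE it rewinds, otherwise it returns the next chunk as a data frame, or NULL when the data are exhausted.

```r
## Sketch: a chunk-supplying callback for bigglm(), assuming a CSV file
## too large to load at once. File path, formula, and chunk size are
## illustrative placeholders.
make_chunk_reader <- function(file, chunk_size = 10000) {
  header <- names(read.csv(file, nrows = 1))  # read column names once
  pos <- 0                                    # rows consumed so far
  function(reset = FALSE) {
    if (reset) {            # bigglm() rewinds between iterations
      pos <<- 0
      return(NULL)
    }
    chunk <- tryCatch(
      read.csv(file, skip = 1 + pos, nrows = chunk_size,
               header = FALSE, col.names = header),
      error = function(e) NULL)  # past end of file: signal "done"
    if (is.null(chunk) || nrow(chunk) == 0) return(NULL)
    pos <<- pos + nrow(chunk)
    chunk
  }
}

## Usage (requires the biglm package; y ~ x is a placeholder formula):
if (requireNamespace("biglm", quietly = TRUE)) {
  ## fit <- biglm::bigglm(y ~ x, data = make_chunk_reader("big.csv"),
  ##                      family = binomial())
}
```

With ff instead of a raw CSV, the same callback shape applies; as Thomas notes, ff's chunk() should handle the row-range bookkeeping that `pos` does by hand here.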
http://r.789695.n4.nabble.com/unable-to-get-bigglm-working-ATTN-Thomas-Lumley-td2276524.html#a2278381 . There are other posts from people who cannot get biglm working and others who get strange results. Please, advise if you can help. I have row based native code which works, but it is inconvenient as it does not produce an R object, which can be fed to anova etc. offered it to the developer forum, but message is still waiting for mod approval. regards Stephen B -----Original Message----- From: Thomas Lumley [mailto:tlum...@uw.edu] Sent: Friday, March 30, 2012 7:32 PM To: Bond, Stephen Cc: r-help@r-project.org Subject: Re: [R] ff usage for glm On Sat, Mar 31, 2012 at 9:05 AM, Bond, Stephen <stephen.b...@cibc.com> wrote: > Greetings useRs, > > Can anyone provide an example how to use ff to feed a very large data frame > to glm? > The data.frame cannot be loaded in R using conventional read.csv as it is too > big. > > glm(...,data=ff.file) ?? > I shouldn't think glm() will work on data that are too big to read into R. However, bigglm() from the biglm package should work. You just need to write a function that supplies chunks of data from ff.file as requested (see the example on ?bigglm). I haven't used ff, but it looks from the documentation as though chunk() will do all the difficult parts. -thomas -- Thomas Lumley Professor of Biostatistics University of Auckland ______________________________________________ R-help@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-help PLEASE do read the posting guide http://www.R-project.org/posting-guide.html and provide commented, minimal, self-contained, reproducible code.