You have to take some things into account:
- The maximum memory set for R might not be the maximum memory available.
- R needs memory for more than just the dataset; matrix manipulations frequently require double the amount of memory taken by the dataset.
- Memory allocation matters when dealing with large datasets; there is plenty of information about that.
- R has some packages to get around memory problems with big datasets (see the sketch below).
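A minimal sketch of how you might check this for yourself; the 1e6-row data frame is just an arbitrary example size, and the packages mentioned in the comments are only pointers, not loaded here:

    ## How much memory does an object itself take?
    x <- data.frame(a = rnorm(1e6), b = rnorm(1e6))
    print(object.size(x), units = "MB")

    ## Operations often need temporary copies, so peak usage can be roughly
    ## double the object's size. gc() reports current and maximum usage.
    y <- as.matrix(x)   # creates a full copy of the data in matrix form
    gc()                # the "max used" columns show the peak so far

    ## Packages such as 'ff' or 'bigmemory' keep the data on disk instead of
    ## in RAM and are possible workarounds for datasets that don't fit.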
Read this discussion for example: http://tolstoy.newcastle.edu.au/R/help/05/05/4507.html
This page by Matthew Keller is a good summary too: http://www.matthewckeller.com/html/memory.html

Cheers
Joris

On Sat, Jun 5, 2010 at 12:32 AM, Nathan Stephens <nwsteph...@gmail.com> wrote:
> For me, I've found that I can easily work with 1 GB datasets. This includes
> linear models and aggregations. Working with 5 GB becomes cumbersome.
> Anything over that, and R croaks. I'm using a dual quad core Dell with 48
> GB of RAM.
>
> I'm wondering if there is anyone out there running jobs in the 100 GB
> range. If so, what does your hardware look like?
>
> --Nathan

--
Ghent University
Faculty of Bioscience Engineering
Department of Applied mathematics, biometrics and process control
tel: +32 9 264 59 87
joris.m...@ugent.be