Alan,

More RAM will definitely help.  But if you have an object needing more than
2^31 - 1 (roughly 2 billion) elements, you'll hit a wall regardless.  This
could be particularly limiting for matrices.  It is less limiting for
data.frame objects, where each column is its own vector and can hold up to
roughly 2 billion elements.  But many R routines use matrices under the
hood, so you may not know up front where you could hit a limit.
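
For a rough sense of where the wall is, here is a quick sketch in base R
(the 2^31 - 1 figure is the per-object element limit on R releases of this
vintage; the 50,000 x 50,000 matrix is just an illustrative size):

# Maximum integer value, which matches the element limit for a single
# vector or matrix (2^31 - 1):
.Machine$integer.max
# [1] 2147483647

# A 50,000 x 50,000 numeric matrix already needs more elements than that:
50000 * 50000 > .Machine$integer.max
# [1] TRUE

# A data.frame only relaxes this per column: each column is a separate
# vector, so each column on its own can hold up to 2^31 - 1 elements.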

Jay

---- Original message ----
I have a Windows Server 2008 R2 Enterprise machine with 64-bit R installed,
running on 2 x quad-core Intel Xeon 5500 processors with 24 GB DDR3 1066 MHz
RAM.  I am seeking to analyse very large data sets (perhaps as much as
10 GB), without the additional coding overhead of a package such as
bigmemory.

My question is this: if we were to increase the RAM on the machine to
(say) 128 GB, would this become a possibility?  I have read the
documentation on memory limits and it seems so, but I would like some
additional confirmation before investing in any extra RAM.
---------------------------------------------


-- 
John W. Emerson (Jay)
Associate Professor of Statistics, Adjunct, and Acting Director of Graduate
Studies
Department of Statistics
Yale University
http://www.stat.yale.edu/~jay
