On 7/6/2009 3:52 PM, Scott Zentz wrote:
Hello Everyone,

We have recently purchased a server with 64 GB of memory running a 64-bit OS, and I have compiled R from source with the following configuration:

./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib --enable-BLAS-shlib --enable-shared --with-readline --with-iconv --with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib

and I would like to verify that R can use 55-60 GB of the 64 GB of memory. Does anyone know whether this is possible? Will R be able to access that much memory from a single process? I am not an R user myself, but I wanted to test this before turning the server over to the researchers.

Individual vectors are limited to 2^31-1 elements, and each element of a double-precision vector takes 8 bytes. So executing

a <- numeric(2^30)

will use up 8 GB of memory. You can try this with other variable names and see how many allocations succeed:

b <- numeric(2^30) # total now 16 GB
c <- numeric(2^30) # total now 24 GB, etc.
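
If you want to push toward the 55-60 GB target in one go, a minimal sketch (assuming a 64-bit build and no shell ulimit in the way) is to allocate 8 GB vectors in a loop, keep them in a list, and call gc() after each one to report how much memory the process is holding:

vecs <- list()
for (i in 1:7) {
  vecs[[i]] <- numeric(2^30)  # each vector adds another 8 GB
  print(gc())                 # the Vcells "(Mb)" column shows memory in use
}

Seven iterations amount to about 56 GB, so if all of them succeed, a single R process can clearly reach the range you want to verify. object.size(vecs[[1]]) should report roughly 8.6e9 bytes per vector.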

Duncan Murdoch

