On Jul 6, 2009, at 8:39 PM, Duncan Murdoch wrote:

On 06/07/2009 4:16 PM, Peter Dalgaard wrote:
Scott Zentz wrote:
Hello Everyone,

We have recently purchased a server which has 64GB of memory running a 64bit OS and I have compiled R from source with the following config

./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib --enable-BLAS-shlib --enable-shared --with-readline --with-iconv --with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib

and I would like to verify that I can use 55-60 GB of the 64 GB of memory within R. Does anyone know whether this is possible? Will R be able to access that amount of memory from a single process? I am not an R user myself, but I wanted to test this before turning the server over to the researchers.
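
A quick sanity check first (an editor's sketch, not part of the original exchange): it is worth confirming from the console that the build really is 64-bit, since a 32-bit build would cap a single process at a few GB regardless of how much RAM the server has.

# Editor's sketch: a 64-bit build of R reports an 8-byte pointer size.
> .Machine$sizeof.pointer
[1] 8
# R.version$arch should likewise show a 64-bit architecture, e.g. "x86_64".
> R.version$arch
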
Hmm, it's slightly tricky because R often duplicates objects, so you may hit the limit only transiently. Also, R has an internal 2GB limit on single vectors. But something like this
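
(Peter's example code did not survive the quoting above; what follows is an editor's sketch of the kind of test it might have been, assuming a 64-bit build on the 64 GB machine described earlier.)

# Editor's sketch only, not Peter's original code. Each numeric (double)
# vector of 2^28 elements occupies about 2 GB (8 bytes per element), and
# 2^28 is well below the per-vector limit of 2^31 - 1 elements, so a list
# of 25 such vectors exercises roughly 50 GB within a single R process.
> big <- lapply(1:25, function(i) numeric(2^28))
> gc()          # the Vcells "used"/"max used" columns should reflect ~50 GB
> rm(big); gc() # release the memory again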

Is it a 2 GB limit in size, or in the number of elements? I'm still spending almost all my time in 32 bit land, so it's hard to check.

Duncan Murdoch



I believe Peter is referring to the vector length limit rather than a RAM limit. The figure that has been referenced over the years is the maximum length of a single vector, which follows from R using signed 32-bit integers for indexing:

# 2 Gb
> 2 * 1024^3
[1] 2147483648


# Figure referenced in ?"Memory-limits"
> 2^31 - 1
[1] 2147483647


Since matrices and arrays are vectors with a 'dim' attribute, they are subject to the same limit.
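
To see the limit bite (again an editor's illustration, not something from the original messages), one can ask for a vector or matrix just past 2^31 - 1 elements; the request fails immediately, however much RAM is installed.

# Editor's illustration: requests for more than 2^31 - 1 elements are refused
# outright, regardless of available memory. The exact error text varies
# between R versions, so it is not reproduced here.
> x <- numeric(2^31)                        # one element past the limit: error
> m <- matrix(0, nrow = 2^16, ncol = 2^16)  # 2^32 elements in total: error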

HTH,

Marc
