On Tue, 30 Oct 2007, David Bickel wrote:

> To help me make choices regarding a platform for running high-memory R
> processes in parallel, I would appreciate any responses to these
> questions:
>
> 1. Does the amount of RAM available to an R session depend on the
> processor (Intel vs. Sun) or on the OS (various Linux distributions vs.
> Solaris)?

Yes.

It depends on whether R uses 64-bit or 32-bit pointers. For 64-bit R you need a 
64-bit processor, an operating system that will run 64-bit programs, and a 
compiler that will produce them.

I'm not sure what the current Intel offerings are, but you can compile and run 
64-bit on AMD Opteron (Linux) and Sun (Solaris) systems.

> 2. Does R have any built-in limitations of RAM available to a session?
> For example, could it make use of 16 GB in one session given the right
> processor/OS platform?

R does have built-in limitations even on a 64-bit system, but they are large. 
It is certainly possible to use more than 16 GB of memory.

The main limit is that the length of a vector is stored in a C int, so it can 
be no more than 2^31 - 1 elements, or about two billion. A numeric vector of 
that length would take up 16 GB on its own.
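
To see where the 16 GB figure comes from, here is a back-of-the-envelope check in R (the 8 is the number of bytes a double occupies):

    ## The maximum length of a vector is the largest value a C int can hold:
    .Machine$integer.max                # 2^31 - 1 = 2147483647
    ## A numeric (double) vector of that length needs 8 bytes per element:
    (2^31 - 1) * 8 / 2^30               # just under 16 GB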

> 3. Is there anything else I should consider before choosing a processor
> and OS?

I don't think there is anything else R-specific.


        -thomas

Thomas Lumley                   Assoc. Professor, Biostatistics
[EMAIL PROTECTED]       University of Washington, Seattle

