On Wed, 31 Oct 2007, David Bickel wrote:

> Dr. Lumley and Prof. Ripley,
>
> Thank you very much for your helpful responses.
>
> Have you found any particular distribution of Linux to work well with
> 64-bit R? For the cluster, I am currently considering Debian (since it
> seems popular) and SUSE (since Matlab runs on it), but I remain open to
> others.
These days I think there is no difference. (SuSE did much of the
development for x86_64 Linux, and Debian was one of the later ones to
support it. But that's going back ca. 4 years.) Commercial products
normally only support commercial distributions of Linux, but Matlab does
run on many others. I would find the overriding consideration to be the
availability of local support.

> Best regards,
> David
>
> -----Original Message-----
> From: Prof Brian Ripley [mailto:[EMAIL PROTECTED]
> Sent: Tuesday, October 30, 2007 4:51 PM
> To: Thomas Lumley
> Cc: David Bickel; [EMAIL PROTECTED]
> Subject: Re: [R] high RAM on Linux or Solaris platform
>
> On Tue, 30 Oct 2007, Thomas Lumley wrote:
>
>> On Tue, 30 Oct 2007, David Bickel wrote:
>>
>>> To help me make choices regarding a platform for running high-memory R
>>> processes in parallel, I would appreciate any responses to these
>>> questions:
>>>
>>> 1. Does the amount of RAM available to an R session depend on the
>>> processor (Intel vs. Sun) or on the OS (various Linux distributions vs.
>>> Solaris)?
>>
>> Yes.
>>
>> It depends on whether R uses 64-bit or 32-bit pointers. For 64-bit R you
>> need a 64-bit processor, an operating system that will run 64-bit
>> programs, and a compiler that will produce them.
>>
>> I'm not sure what the current Intel offerings are, but you can compile
>> and run 64-bit on AMD Opteron (Linux) and Sun (Solaris) systems.
>
> That is both Sparc Solaris and x86_64 Solaris (although for the latter
> you seem to need to use the SunStudio compilers).
>
> As far as I know all current desktop Intel processors run x86_64: Core 2
> Duos do, and they are commonplace in quite low-end systems. Xeons seem
> to have a price-performance edge at the moment. We have several boxes
> with dual quad-core Xeons and lots of RAM. (Not all for use with R,
> some Linux, some Windows.)
>
>>> 2. Does R have any built-in limitations of RAM available to a session?
>>> For example, could it make use of 16 GB in one session given the right
>>> processor/OS platform?
>>
>> R does have built-in limitations even in a 64-bit system, but they are
>> large. It is certainly possible to use more than 16Gb of memory.
>>
>> The main limit is that the length of a vector is stored in a C int, and
>> so is no more than 2^31-1, or about two billion. A numeric vector of
>> that length would take up 16Gb on its own.
>
> ?"Memory-limits" documents them.
>
>>> 3. Is there anything else I should consider before choosing a processor
>>> and OS?
>>
>> I don't think there is anything else R-specific.

--
Brian D. Ripley,                  [EMAIL PROTECTED]
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
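As a quick illustration of the limits discussed above, a minimal sketch in
base R: it checks whether the running build uses 64-bit pointers and works
through the 2^31-1 arithmetic behind the "16Gb" figure (the first two
values naturally depend on your own build).

    .Machine$sizeof.pointer    # 8 on a 64-bit build of R, 4 on a 32-bit build
    .Machine$integer.max       # 2147483647, i.e. 2^31 - 1

    ## A numeric (double) vector needs 8 bytes per element, so a vector of
    ## maximal length would take about 16Gb on its own:
    (2^31 - 1) * 8 / 2^30      # ~16 (gigabytes)

    ## The built-in limits themselves are documented on this help page:
    ?"Memory-limits"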