On Sun, 15 May 2011, Duncan Murdoch wrote:
On 15/05/2011 3:02 PM, Aram Fingal wrote:
On May 13, 2011, at 6:38 AM, Michael Haenlein wrote:
Dear all,
I'm currently running R on my laptop -- a Lenovo Thinkpad X201 (Intel Core
i7 M620 CPU, 2.67 GHz, 8 GB RAM). The problem is that some of my
calculations run for several days, sometimes even weeks (mainly simulations
over a large parameter space). Depending on the external conditions, my
laptop sometimes shuts down due to overheating.
You didn't mention whether you are using a 64-bit OS or not. A single
32-bit process cannot use more than 2 GB RAM.
And that is also false. For Windows, see the rw-FAQ. It is
address space (not RAM) that is limited, and it is limited to 4GB *by
definition* in a 32-bit process. Many OSes can give your process 4GB
of address space, but may reserve some of it for the OS.
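For reference, you can check from inside R which kind of build you are
running (a quick sketch; output varies by platform):

    .Machine$sizeof.pointer   # 4 = 32-bit build, 8 = 64-bit build
    R.version$arch            # e.g. "i386" or "x86_64"
    memory.limit()            # Windows only: current memory limit in Mb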
If your calculations would
benefit from the full 8 GB RAM on your machine, you need to be able to run
64-bit R. My understanding is that, on Windows, you either have to
install the OS as 32-bit and use all 32-bit software, or install 64-bit
Windows and run all 64-bit software. A Mac can run 32-bit and 64-bit
software simultaneously, and I'm not sure about Linux. In the case of
Linux, it probably doesn't matter so much, because most Linux software is
available as open source and you can compile it yourself either way.
For the record, all modern 64-bit OSes on x86_64 CPUs can do this,
provided you install 32-bit versions of the core dynamic libraries. I run
32- and 64-bit R on 64-bit Linux, Solaris, FreeBSD, Darwin (the OS of
Mac OS X), Windows .... As can AIX and IRIX on their CPUs.
No, 64-bit Windows can run either 32-bit or 64-bit Windows programs.
I'm now thinking about buying a more powerful desktop PC or laptop. Can
anybody advise me on the best configuration to run R as fast as possible?
I will use this PC exclusively for R, so any other factors are of limited
importance.
You need to evaluate whether RAM or raw processor speed is most critical
for what you're doing. In my case, I upgraded my Mac Pro to 16 GB RAM and
was able to do hierarchical clustering heatmaps overnight that previously
took more than a week to compute. Judging by the Activity Monitor utility,
some of the even larger heatmap computations would benefit from 32 GB RAM
or more.
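One rough way to gauge this from within R itself (a sketch only; the
matrix below is a made-up toy for illustration):

    big <- matrix(rnorm(2e5), ncol = 100)  # toy data: 2000 rows x 100 cols
    system.time(hc <- hclust(dist(big)))   # compare user vs elapsed time
    gc()                                   # 'max used' = peak memory so far
    print(object.size(big), units = "Mb")  # size of an individual object

If the 'max used' figures from gc() approach your physical RAM, more
memory will help; if the job fits comfortably, raw processor speed
matters more.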
Linux runs on the widest range of hardware, which gives you the greatest
ability to shop around. If RAM is the deciding factor, then you can look
for a machine that can hold as much RAM as possible. If processor
speed is the factor, then you can optimize for that. Windows runs on a
reasonable array of hardware but has the disadvantage that the OS itself
uses a lot of resources.
Nothing like as much as Mac OS X, though. (I would say the main
disadvantage of Windows for R is the slowness of the file systems.)
The Mac has the advantage of flexibility. When you download the
precompiled R package, it comes with both a 32-bit and a 64-bit executable.
This is because 32-bit processes run a little faster if you don't need
large amounts of RAM. If you do need the RAM, then you run the 64-bit
version.
The same is true for Windows binaries on CRAN.
And of e.g. the Fedora binaries.
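For completeness: with such multi-arch binaries the architecture is
chosen when R is launched, roughly along these lines (see the 'R
Installation and Administration' manual for the details on your
platform; treat the exact flags as platform-dependent):

    ## from a shell, assuming a multi-arch installation:
    ##   R --arch=x86_64   # start the 64-bit executable
    ##   R --arch=i386     # start the 32-bit executable
    ## then, inside R, confirm which one you got:
    .Platform$r_arch   # e.g. "i386" or "x86_64" ("" if single-arch)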
Duncan Murdoch
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
Mr Fingal: please do! You are clearly unfamiliar with the R manuals.
--
Brian D. Ripley,                  rip...@stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595
______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.