On 06.08.2012 09:34, David Winsemius wrote:

On Aug 5, 2012, at 3:52 PM, alan.x.simp...@nab.com.au wrote:

Dear all

I have a Windows Server 2008 R2 Enterprise machine with 64-bit R installed,
running on 2 x quad-core Intel Xeon 5500 processors with 24 GB of DDR3
1066 MHz RAM.  I am seeking to analyse very large data sets (perhaps as
much as 10 GB) without the additional coding overhead of a package such
as bigmemory.

It may depend in part on how that number is arrived at, and on what you
plan to do with the data. (Don't even consider creating a dist object;
see the sketch below.)
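Just to make that concrete -- a rough sketch with a purely hypothetical
row count, nothing taken from your data -- the lower triangle of a
distance matrix grows quadratically in the number of observations:

n <- 1e6                           # assumed number of observations, for illustration only
dist_bytes <- n * (n - 1) / 2 * 8  # lower-triangle distances stored as 8-byte doubles
dist_bytes / 2^30                  # ~3725 GiB -- far beyond any realistic amount of RAM

So even with 128 GB, the distance-matrix route is out of the question
for data of this size.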

My question is this - if we were to increase the RAM on the machine to
(say) 128GB, would this become a possibility?  I have read the
documentation on memory limits and it seems so, but would like some
additional confirmation before investing in any extra RAM.

The typical advice is that you will need memory about 3 times the size of
your largest dataset, and I find that even more headroom is needed. I have
32 GB and my larger datasets occupy 5-6 GB, and I generally have few
problems. I had quite a few problems with 18 GB, so I think the ratio
should be 4-5 x your 10 GB object.  I predict you could get by with 64 GB.
(Please send a check for half the difference in cost between 64 GB and
128 GB.)
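As a back-of-envelope check (the dimensions below are assumed purely for
illustration, to show where a ~10 GB figure might come from), and since
you are on 64-bit R for Windows, memory.limit() reports the per-session
cap and can raise it up to the installed RAM:

rows <- 125e6; cols <- 10          # hypothetical shape of a ~10 GB numeric data set
rows * cols * 8 / 1e9              # 10 GB: doubles take 8 bytes per element
memory.limit()                     # current cap in MB (Windows-only function)
## memory.limit(size = 64 * 1024)  # would raise the cap to 64 GB, if that much RAM is installed

That still leaves no room for the copies R makes during an analysis,
which is where the 3-5x rule of thumb comes from.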



10 GB objects should be fine, but note that a single vector/matrix/array
cannot exceed 2^31 - 1 elements, which caps a vector/matrix/array of
doubles/reals at roughly 17 GB.
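
You can see where that ceiling comes from directly in R (a quick sketch,
not specific to any particular data set):

.Machine$integer.max               # 2^31 - 1 = 2147483647, the per-object element cap
.Machine$integer.max * 8 / 1e9     # ~17.2 GB: largest possible vector/matrix of doubles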

Best,
Uwe Ligges

