On Jun 12, 2012, at 12:47 , Oliver Ruebenacker wrote:

>     Hello Hui,
> 
> On Tue, Jun 12, 2012 at 2:12 AM, Hui Wang <huiwang.biost...@gmail.com> wrote:
>> Dear all,
>> 
>> I've run into a question of handling large matrices in R. I'd like to
>> define a 70000*70000 matrix in R on Platform:
>> x86_64-apple-darwin9.8.0/x86_64 (64-bit), but it seems to run out of memory
>> to handle this. Is it due to R memory limiting size or RAM of my laptop? If
>> I use a cluster with larger RAM, will that be able to handle this large
>> matrix in R? Thanks much!
> 
>  Do you really mean 7e4 by 7e4? That would be 4.9e9 entries. If each
> entry takes 8 bytes (as a double typically does on a 64-bit system), you
> would need close to 40 gigabytes of storage for this matrix. I'm not sure
> there is a laptop on the market with that amount of RAM.

What's more: log2(4.9e9) = 32.19, i.e. the number of entries exceeds what a 
32-bit signed integer can index (and matrices are vectors internally, so the 
whole matrix must fit in a single vector). Vector length maxes out at 
2^31-1 = 2147483647, the maximum of a signed 32-bit integer.
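
You can see the limit directly in an R session:

```r
# R (without long-vector support) indexes vectors with signed 32-bit integers
.Machine$integer.max                    # 2147483647, i.e. 2^31 - 1
# A 70000 x 70000 matrix has more entries than that limit allows:
70000 * 70000 > .Machine$integer.max    # TRUE
```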

In current R-devel, you _can_ actually store larger vectors, but you can't do 
much with them (yet):

  > d <- double(49e8)
  > sum(d)
  Error: long vectors not supported yet: 
../../../R/src/include/Rinlinedfuns.h:100


-- 
Peter Dalgaard, Professor
Center for Statistics, Copenhagen Business School
Solbjerg Plads 3, 2000 Frederiksberg, Denmark
Phone: (+45)38153501
Email: pd....@cbs.dk  Priv: pda...@gmail.com

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.