paul s wrote:
> hi -
> 
> i just started using R as i am trying to figure out how to perform a linear 
> regression on a huge matrix.
> 
> i am sure this topic has passed through the email list before but could 
> not find anything in the archives.
> 
> i have a matrix that is 2,000,000 x 170,000 the values right now are 
> arbitrary.
> 
> i try to allocate this on a x86_64 machine with 16G of ram and i get the 
> following:
> 
>  > x <- matrix(0,2000000,170000);
> Error in matrix(0, 2e+06, 170000) : too many elements specified
>  >
> 
> is R capable of handling data of this size? am i doing it wrong?

A quick calculation reveals that a matrix of that size requires about
2.7 TERAbytes of storage, so I'm a bit confused as to how you might
expect to fit it into 16GB of RAM...
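For reference, the back-of-the-envelope calculation in R itself (the 8 is the size of one double-precision element):

```r
# A numeric matrix stores one 8-byte double per element.
n_elements <- 2e6 * 170000      # 3.4e11 elements
bytes <- n_elements * 8
bytes / 1e12                    # about 2.72, i.e. ~2.7 terabytes
```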

However, even with terabytes of memory, you would be running into the
(current) limitation that a single vector in R can have at most 2^31-1 =
ca. 2.1 billion elements -- and your matrix has 340 billion.
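You can see the limit directly, since R's vector lengths are (currently) 32-bit signed integers:

```r
.Machine$integer.max                   # 2147483647, i.e. 2^31 - 1
2e6 * 170000                           # 3.4e11 elements requested
2e6 * 170000 > .Machine$integer.max    # TRUE: far past the limit
```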

Yes, you could be doing it wrong, but what is "it"? If the matrix is
sparse, there are sparse matrix tools around...
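For instance, a sketch using sparseMatrix() from the Matrix package (which ships with R). It stores only the nonzero entries as (i, j, x) triplets, so memory scales with the number of nonzeros rather than with nrow * ncol -- the toy values below are made up purely for illustration:

```r
library(Matrix)

# Toy example: a 2,000,000 x 170,000 matrix with only three nonzero
# entries. Only the nonzeros (plus column pointers) are stored, not
# the full 3.4e11-cell grid.
m <- sparseMatrix(i = c(1, 500000, 2000000),
                  j = c(1, 90000, 170000),
                  x = c(1.5, -2, 3),
                  dims = c(2e6, 170000))
dim(m)          # 2000000  170000
object.size(m)  # kilobytes, not terabytes
```

Whether this helps depends entirely on whether the real data are mostly zeros; a dense 2e6 x 170000 problem is simply out of reach in one piece.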


-- 
Peter Dalgaard
Center for Statistics, Copenhagen Business School
Phone: (+45)38153501
Email: pd....@cbs.dk  Priv: pda...@gmail.com

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
