On Wed, Jul 14, 2010 at 4:23 PM, paul s <r-project....@queuemail.com> wrote:

> hi -
> i just started using R as i am trying to figure out how to perform a linear
> regression on a huge matrix. i am sure this topic has passed through the
> email list before, but i could not find anything in the archives. i have a
> matrix that is 2,000,000 x 170,000; the values right now are arbitrary. i
> try to allocate this on an x86_64 machine with 16 GB of RAM and i get the
> following:
>
>> x <- matrix(0, 2000000, 170000);
> Error in matrix(0, 2e+06, 170000) : too many elements specified

R stores matrices and other data objects in memory. A dense numeric matrix
of that size would require

> 2e+06 * 170000 * 8 / 2^30
[1] 2533.197

gigabytes of memory. Start looking for a machine with at least 5 terabytes
of memory (you will need to store more than one copy of the matrix) or,
probably easier, rethink your problem; one sketch of what that might look
like follows the quoted message. Results from a linear regression producing
170,000 coefficient estimates are unlikely to be useful, and the model
matrix is essentially guaranteed to be rank deficient.

> is R capable of handling data of this size? am i doing it wrong?
>
> cheers
> paul
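
Following up on the "rethink your problem" advice, here is a minimal
sketch of one direction. It assumes (purely an assumption, not anything
paul stated) that most of the 2,000,000 x 170,000 entries would be zero,
in which case sparse storage via the Matrix package keeps the object
comfortably in memory. The index and value vectors below are invented
for illustration only.

library(Matrix)  # recommended package, ships with R

## pretend one million nonzero entries at random positions --
## these indices and values are made up for illustration
nnz <- 1e6
i <- sample(2e6, nnz, replace = TRUE)   # row indices
j <- sample(17e4, nnz, replace = TRUE)  # column indices
v <- rnorm(nnz)                         # nonzero values

## a 2,000,000 x 170,000 matrix that stores only the nonzeros;
## duplicated (i, j) pairs are summed by default
m <- sparseMatrix(i = i, j = j, x = v, dims = c(2e6, 17e4))

print(object.size(m), units = "Mb")  # a dozen or so Mb, not 2533 Gb

If the data really are dense, chunked fitting is another direction worth
a look (e.g. biglm::biglm(), updating the fit incrementally with
update() over blocks of rows), though 170,000 coefficients would still
be impractical for the reasons given above.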