Are your matrices sparse?
This package (or one of its reverse dependencies/suggests) may help keep
the memory down.
http://cran.r-project.org/web/packages/SparseM/
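For illustration only (not part of the original reply), a minimal sketch of
the memory savings a sparse representation can give. It uses the Matrix
package, which the script quoted below already loads, rather than SparseM
itself, and the 1000 x 1000 toy dimensions are made up:

## Toy example: a mostly-zero matrix stored densely vs. sparsely
library(Matrix)
dense <- matrix(0, nrow = 1000, ncol = 1000)
dense[sample(length(dense), 500)] <- 1      ## only 500 non-zero entries
sparse <- Matrix(dense, sparse = TRUE)      ## compressed sparse storage
object.size(dense)                          ## ~8 MB
object.size(sparse)                         ## only a few KB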
Joal Heagney
It is probably a contiguous-memory problem. I always suggest having
3-4X the memory of your largest object so that there is room for any
copies that might be made. So make a request for about 50GB of memory.
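A quick sanity check of that figure (illustrative only; it assumes the full
3200 x 527829 matrix is stored as dense 8-byte doubles, as in the script
quoted below):

## Back-of-the-envelope arithmetic behind the ~50GB request
3200 * 527829 * 8 / 2^30        ## ~12.6 GiB for the matrix itself
3200 * 527829 * 8 * 4 / 2^30    ## ~50 GiB with the 3-4X rule for copies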
On Mon, Apr 18, 2011 at 4:10 PM, svrieze wrote:
Hello,
I'm (eventually) attempting a singular value decomposition of a 3200 x
527829 matrix in R version 2.10.1. The script is as follows:
###-Begin Script here---###
library(Matrix)
snps <- 527829 ## Number of SNPs
N <- 3200      ## Sample size
y
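(The script above is truncated in the archive.) As a toy-scale sketch of what
the post is heading toward, here is the same kind of call with made-up
dimensions (40 x 600), using base R's svd(); the shape of the output is what
drives memory use, since for an N x snps input with N < snps, u is N x N and
v is snps x N:

## Toy-scale illustration only, not part of the original script
X <- matrix(rnorm(40 * 600), nrow = 40, ncol = 600)
dec <- svd(X)
dim(dec$u)      ## 40 x 40
length(dec$d)   ## 40
dim(dec$v)      ## 600 x 40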