Hello,

I have a rather large data set to analyze: at the moment I need to work
with a 200000 x 200000 matrix. I am using the bigmemory package, but so
far I can only allocate a 66000 x 66000 matrix; when I increase those
dimensions I get the following error:


> AdjMat <- big.matrix(nrow=68000,ncol=68000)
Cannot allocate memory
BigMatrix.cpp line 225
Error in big.matrix(nrow = 68000, ncol = 68000) : 
  Error: memory could not be allocated for instance of type big.matrix
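
For context, here is my back-of-the-envelope arithmetic (assuming the
default 8-byte "double" type), which I suspect explains the failure:

> 200000^2 * 8 / 2^30  # full matrix: roughly 298 GiB
> 68000^2 * 8 / 2^30   # the size that fails: roughly 34 GiB
> 66000^2 * 8 / 2^30   # the largest that works for me: roughly 32 GiB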


As part of my analysis I need to calculate the correlation coefficients,
but when I try to do that on a "smaller" matrix I get this other error:

> A <- big.matrix(nrow=45000,ncol=45000)
> AM <- (cor(A, method="pearson"))
Error in as.vector(data) : 
  no method for coercing this S4 class to a vector
Calls: cor ... as.matrix -> as.matrix.default -> array -> as.vector
In addition: Warning message:
In is.na(x) : is.na() applied to non-(list or vector) of type 'S4'
Execution halted
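
For what it is worth, pulling a slice back into an ordinary in-RAM
matrix does let cor() run (a sketch; the slice size is arbitrary, and
this clearly cannot scale to the full problem):

> B <- A[1:5000, 1:5000]           # "[" on a big.matrix gives a base matrix
> AM <- cor(B, method="pearson")   # runs, but only on what fits in RAM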


My questions are: how big a matrix can I create with the bigmemory
package, and can I use the function cor() on a big.matrix? If not, do
you have any suggestions for how to address this problem?
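
One idea I have been considering, though I have not verified it at full
scale, is a file-backed big.matrix so the data live on disk rather than
in RAM (a sketch; the backing file names are placeholders I made up):

> library(bigmemory)
> AdjMat <- filebacked.big.matrix(nrow=200000, ncol=200000, type="double",
+                                 backingfile="AdjMat.bin",
+                                 descriptorfile="AdjMat.desc")

Even so, I assume the 200000 x 200000 output of cor() would itself need
about 300 GB, so perhaps some block-wise approach is unavoidable?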


Thank you very much.


ERV


