Dear R community,

I'm running 32-bit R on a 64-bit machine (with 16 Gb of RAM) using a PAE kernel, as you can see here:
$ uname -a
Linux mymachine 2.6.18-238.el5PAE #1 SMP Sun Dec 19 14:42:44 EST 2010 i686 i686 i386 GNU/Linux

When I try to create a large matrix:

> Q.obs <- matrix(NA, nrow=6940, ncol=9000)

I get the following error:

Error: cannot allocate vector of size 238.3 Mb

However, the amount of free memory on my machine seems to be much larger than that:

> system("free")
             total       used       free     shared    buffers     cached
Mem:      12466236    6354116    6112120          0      67596    2107556
-/+ buffers/cache:    4178964    8287272
Swap:     12582904          0   12582904

I tried to increase the memory limit available to R by using:

$ R --min-vsize=10M --max-vsize=5000M --min-nsize=500k --max-nsize=5000M

but it didn't work. Any hint about how I can get R to use all the memory available on the machine?

Thanks in advance,

Mauricio

--
===============================
Linux user #454569 -- Ubuntu user #17469

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
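P.S. For what it's worth, the 238.3 Mb in the error appears to be exactly the size of the requested matrix itself: matrix(NA, ...) creates a *logical* matrix, and R stores logicals in 4 bytes per cell. A quick check (the comment shows what R prints for this expression):

6940 * 9000 * 4 / 1024^2
## [1] 238.2662

So the message is about one single allocation of ~238 Mb failing, which suggests the limit being hit is the 32-bit process address space rather than the physical memory reported by free.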