Dear R developers,

We have a big SGI Origin computation server with 32 CPUs and 64 GB of RAM.
In R 2.0.0 we could run large jobs; allocating 8 GB of RAM was not a problem,
for example by running:

> v1 <- seq(1,2^29)
> v2 <- seq(1,2^29)
> v3 <- seq(1,2^29)
> v4 <- seq(1,2^29)

This yields an R process consuming about 8 GB of RAM:

   PID    PGRP USERNAME PRI  SIZE   RES STATE   TIME WCPU%  CPU% COMMAND
177484  177484 mirjam    20 8225M 8217M sleep   1:18  29.3  0.00 R
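Each of those vectors is 2^29 integer elements at 4 bytes each, i.e. roughly
2 GB per vector and 8 GB for all four. A minimal sketch of how this can be
checked from inside R (object.size() and gc() are standard base R functions;
the exact byte counts will differ slightly because of object headers):

> v1 <- seq(1,2^29)    # 2^29 integers, roughly 4 * 2^29 bytes = 2 GB
> object.size(v1)      # size of one such vector, in bytes
> gc()                 # Ncells/Vcells summary of the heap currently in use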
After upgrading from R 2.0.0 to R 2.2.1, we cannot allocate more than about
1300 MB of memory, as shown below:

> v1 <- seq(1,2^29)
Error: cannot allocate vector of size 2097152 Kb
> v1 <- seq(1,2^28)
> v2 <- seq(1,2^27)
Error: cannot allocate vector of size 524288 Kb
> v2 <- seq(1,2^25)
> v3 <- seq(1,2^24)
> v4 <- seq(1,2^23)
> v5 <- seq(1,2^22)
Error: cannot allocate vector of size 16384 Kb
> v5 <- seq(1,2^21)
> v6 <- seq(1,2^20)
> v7 <- seq(1,2^19)
> v8 <- seq(1,2^18)
> q()
Save workspace image? [y/n/c]: n

Upgrading to R 2.3.0 yields the same results. The R process above ends up at
1284 MB of RAM and refuses to allocate more, even though about 30 GB are free
on the machine.

Is there any special configuration option I should turn on to make it
possible to use more memory? The OS memory limits (ulimit -a) are set
appropriately:

data seg size (kbytes, -d)    unlimited
max memory size (kbytes, -m)  63385824
stack size (kbytes, -s)       65536
virtual memory (kbytes, -v)   unlimited

If it is not some special (compile-time) option that I should have set, I
think this is a bug.

With kind regards,
Mirjam van Vroonhoven

--
Dr. Mirjam van Vroonhoven
System administrator/programmer, dept. of Bioinformatics
Erasmus Medical Center, Rotterdam, The Netherlands
Room Number Ee 15.32, phone +31-10-463 81 11
Web: http://www.erasmusmc.nl/bioinformatics/
E-mail: [EMAIL PROTECTED]
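P.S. In case it is relevant, a minimal sketch of the startup heap limits
documented in ?Memory (the values below are illustrative only, not limits
verified on this machine):

$ R --min-vsize=10M --max-vsize=16000M   # illustrative ceilings for the vector heap
> mem.limits()                           # inside R: current nsize/vsize limits (NA = no limit)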