Hey Whit,

    That worked! I was able to consume all the memory on the server!

Thanks!

-scz

Whit Armstrong wrote:
> Seems strange. I can go all the way up to 50GB on our machine which
> has 64GB as well.  It starts swapping after that, so I killed the
> process.
>
> try this:
> ans <- list()
> for(i in 1:100) {
>     ans[[ i ]] <- numeric(2^30/2)
>     cat("iteration: ",i,"\n")
>     print(gc())
> }
>
>   
>> source("scripts/test.memory.r")
> iteration:  1
>             used   (Mb) gc trigger   (Mb)  max used   (Mb)
> Ncells    109171    5.9     350000   18.7    350000   18.7
> Vcells 536990895 4097.0  592369806 4519.5 536992095 4097.0
> iteration:  2
>              used   (Mb) gc trigger   (Mb)   max used   (Mb)
> Ncells     109276    5.9     350000   18.7     350000   18.7
> Vcells 1073861851 8193.0 1184270079 9035.3 1073861858 8193.0
> ...
> ...
> ...
> iteration:  13
>              used    (Mb) gc trigger    (Mb)   max used    (Mb)
> Ncells     109287     5.9     350000    18.7     350000    18.7
> Vcells 6979441897 53249.0 7458515495 56904.0 6979442076 53249.0
>
> you might want to check your kernel.shmmax setting, although I'm not
> sure if it will help R.
>
> -Whit
>
>
> On Tue, Jul 7, 2009 at 8:39 AM, Scott Zentz<ze...@email.unc.edu> wrote:
>> Hello Everyone!
>>
>>   Thanks for all your replies! This was very helpful! I found that there
>> seems to be a limit of about 32GB of memory, which I think will be fine.
>> I was able to consume the 32GB with the following:
>> Start R with the following command: R --max-vsize=55000M
>> then within R run
>> x <- rep(0.1, 2.141e9)
>> and watch the process with top and R will consume about 32GB of memory...
>> Hopefully this will be enough for the researchers ;)
>>
>> Thanks!
>> -scz
>>
>> Scott Zentz wrote:
>>> Hello Everyone,
>>>
>>>   We have recently purchased a server which has 64GB of memory running a
>>> 64bit OS and I have compiled R from source with the following config
>>>
>>> ./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib
>>> --enable-BLAS-shlib --enable-shared --with-readline --with-iconv --with-x
>>> --with-tcltk --with-aqua --with-libpng --with-jpeglib
>>>
>>> and I would like to verify that I can use 55GB-60GB of the 64GB of memory
>>> within R. Does anyone know whether this is possible? Will R be able to
>>> access that amount of memory from a single process? I am not an R user
>>> myself, but I just wanted to test this before I turned the server over to
>>> the researchers.
>>>
>>> Thanks!
>>> -scz
>>>
>>> ______________________________________________
>>> R-help@r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
>>> http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.

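For reference, the sizes quoted in this thread can be sanity-checked with a little
arithmetic. This is only a sketch (in Python, since the math is language-neutral): it
assumes each element of an R numeric vector is one 8-byte double (one Vcell), which
matches the gc() output above; the suggestion that rep() makes a transient second copy,
which would roughly double the ~17 GB to the ~32 GB Scott observed, is an assumption,
not something established in the thread.

```python
# Sanity-check the memory figures quoted in this thread.
# Assumption: each R numeric element is an 8-byte double (one Vcell).

BYTES_PER_DOUBLE = 8
MIB = 2**20

# Whit's loop allocates numeric(2^30/2) doubles per iteration.
doubles_per_iter = 2**30 // 2                      # 536,870,912 doubles
mib_per_iter = doubles_per_iter * BYTES_PER_DOUBLE / MIB
print(mib_per_iter)                                # 4096.0 MiB per iteration

# After 13 iterations gc() reported Vcells used = 53249.0 Mb,
# which matches 13 * 4096 MiB to within R's own overhead.
print(13 * mib_per_iter)                           # 53248.0

# Scott's test: rep(0.1, 2.141e9) is about 17 GB of doubles; the ~32 GB
# seen in top would be consistent with a transient copy (an assumption).
gb = 2.141e9 * BYTES_PER_DOUBLE / 1e9
print(round(gb, 1))                                # 17.1
```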
