Hi Zack.

I don't know how helpful this is, but I use R on 64-bit Ubuntu for the analysis 
of large microarray datasets. R may well already be taking advantage of all your 
memory; the objects it is creating are simply too big to fit. You could run 'top' 
in another terminal window to watch R's memory use while the model runs. If R is 
genuinely using all your memory, then I suppose you'll have to install some more.
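You can also check from inside R itself. A minimal sketch (here 'x' is just a 
placeholder for whatever data object or fitted model you are working with):

    gc()                                  # report R's current memory use
    format(object.size(x), units = "Mb")  # size of a single object 'x'
    ## sizes of everything in the workspace, largest first
    sort(sapply(ls(), function(n) object.size(get(n))), decreasing = TRUE)

If the input data alone run to several gigabytes, the working copies R makes 
while fitting can easily push you past 8 GB.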

Cheers

iain

zack holden <[EMAIL PROTECTED]> wrote: 
Dear list,
 
I've recently installed R on a 64-bit machine with 8 GB of RAM. I set this 
computer up as a dual-boot system, with Windows XP 64 and Ubuntu 7.10. I 
downloaded the 64-bit Linux version of R and installed it. 
 
I'm trying to run rather large random forest models and was running into severe 
memory limitations on my old 32-bit Windows machines.
 
When I run random forest models on my new machine, I get the error message: 
Error: cannot allocate vector of size 2.1 GB. 
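For reference, a minimal sketch of the kind of call that fails (assuming the 
randomForest package; 'preds' and 'resp' are placeholders for my real predictor 
matrix and response vector, which are much larger):

    library(randomForest)
    ## fit a forest; the allocation error appears during this call
    rf <- randomForest(x = preds, y = resp, ntree = 500)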
 
I'm a new Linux user, but I thought that R under Linux would by default take 
advantage of all available memory. I've searched previous posts and found only 
one thread (posted by David Katz), which didn't really help.
 
Do I need to tell Ubuntu/Linux to use all 8 GB of RAM? Could I have done 
something wrong when I downloaded and installed R? 
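In case it matters, these are the only checks I know of to confirm that the 
build itself is 64-bit (is this the right way to verify it?):

    .Machine$sizeof.pointer  # 8 on a 64-bit build, 4 on 32-bit
    sessionInfo()            # platform should read x86_64-pc-linux-gnu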
 
I'd appreciate any advice. 
 
Thanks in advance,
 
Zack



______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
