Hello,
 
I am running an R job on a Windows 7 machine (4 cores, 16 GB RAM, R 3.0.1),
and it takes 1.5 hours to complete.
I am running the same job in R on a Linux environment (Platform:
x86_64-redhat-linux-gnu (64-bit)) with far more resources: 40 cores and
0.5 TB RAM. There, the job takes 3 hours and 15 minutes to complete (no
other concurrent jobs). The job uses the glmnet package to
perform model selection on a simulated data set having 1 million records and 
150 variables.
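For concreteness, here is a minimal sketch of the kind of job described
above, assuming cv.glmnet is used for the model selection (the variable
names, fold count, and simulated response are illustrative, not my exact
code):

library(glmnet)

set.seed(1)
n <- 1e6; p <- 150
x <- matrix(rnorm(n * p), nrow = n, ncol = p)   # ~1.2 GB numeric matrix
y <- drop(x[, 1:5] %*% rnorm(5)) + rnorm(n)     # simulated response
fit <- cv.glmnet(x, y, nfolds = 10)             # cross-validated lasso path
plot(fit)                                       # inspect the selected lambda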
My questions are:
1. Why doesn't R take advantage of the available RAM?
2. Are there any changes we can apply to the R configuration file in order
to get better performance? My expectation was that the Linux environment
would perform much better than the Windows environment.
 
Any help in sorting out these issues is much appreciated.
 
Thank you in advance!
Alina 
