I have a question about multiple cores and CPUs for running R.  I've
been running various tests on different types of hardware and operating
systems (64-bit, 32-bit, Solaris, Linux, Windows; R v.10, .12, .15,
.15.1).  Generally speaking, it seems that for a single user and a
single process, R prefers to have as many resources as possible,
especially memory.

I've looked at some of the R-SIG groups, and most threads about
multiple cores or CPUs deal with using packages such as parallel or
snow, leaving it up to the user to decide what can be parallelized and
how, which makes perfect sense to me.

My question is about the advantage of machines with multiple cores or
CPUs.  It seems to me that until R itself is parallelized (or unless I
am writing my own code that can run in parallel), a single user running
a single, non-parallel process would do just as well on a single-core,
single-CPU box as on, for example, a quad-core machine with 24 threads.
Is my observation off, or am I missing something?
Of course, I know there are other things going on in most modern
systems, such as the operating system and other processes, that in
practice affect this observation a bit.  In addition, an R environment
where multiple people log in and run R queries would, I assume, likely
benefit from multiple CPUs and a multitasking OS.
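To illustrate what I mean, here is a minimal sketch (assuming the parallel package that has shipped with R since 2.14; the worker count of 2 is just an arbitrary example): the plain lapply() call runs on one core no matter how many the machine has, and only the explicit cluster version can use more.

```r
library(parallel)

# Serial: uses a single core, regardless of the hardware
serial <- lapply(1:4, function(i) i^2)

# Explicitly parallel: only here do extra cores come into play.
# makeCluster() works on Windows as well as Unix-alikes.
cl <- makeCluster(2)  # hypothetical 2-worker cluster
par <- parLapply(cl, 1:4, function(i) i^2)
stopCluster(cl)

identical(serial, par)  # TRUE: same result, different core usage
```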

I'm just trying to get a feeling and validate what I think I am seeing.
Thoughts and comments are appreciated.


Thanks,

Anthony


______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
