On our local cluster, users fairly routinely crash the head node by
doing simple, silly things that inadvertently consume all available
memory.  I have read a bit about memory limits, but I am still unclear
whether memory limits can be imposed at the R level under Linux.
Currently, I see this under help(Memory):

"R has a variable-sized workspace. Prior to R 2.15.0 there were
(rarely-used) command-line options to control its size, but it is now
sized automatically."

What approaches can I suggest to our cluster administrators to limit R
memory usage (and only R) on the head node (the only node with an
outside internet connection)?  The goal is to allow a small R instance
to run on the head node for package installation, but nothing more
complicated than that.
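
One approach I have seen suggested for this sort of thing is a small
wrapper script that sets an address-space limit with ulimit before
launching R, so the limit applies only to R and its children.  A
rough, untested sketch of what I have in mind (the 4 GB figure and
the path to the real R binary are placeholders):

  #!/bin/sh
  # Wrapper placed ahead of the real R binary in users' PATH.
  # ulimit -v caps virtual memory (in kB) for this shell and its
  # children, so only R launched through the wrapper is affected.
  ulimit -v 4194304    # 4 GB; adjust to taste
  exec /usr/lib/R/bin/R "$@"   # placeholder path to the real binary

If that is a sensible direction, I would also be interested in whether
cgroups would be a more robust way for the administrators to enforce
the same limit.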

Thanks,
Sean
