Hi Lachlan,

Thank you for sharing your environment. Everyone has their own set of rules, and I appreciate everyone's input. It sounds as if the NFS share is a great place to start.
Best,
Eric
_____________________________________________________
Eric F. Alemany
System Administrator for Research
Division of Radiation & Cancer Biology
Department of Radiation Oncology
Stanford University School of Medicine
Stanford, California 94305
Tel: 1-650-498-7969 (No Texting)
Fax: 1-650-723-7382

On May 10, 2018, at 4:23 PM, Lachlan Musicman <data...@gmail.com> wrote:

On 11 May 2018 at 01:35, Eric F. Alemany <ealem...@stanford.edu> wrote:

> Hi All,
>
> I know this might sound like a very basic question: where in the cluster should I install Python and R? Headnode? Execute nodes? And is there a particular directory (path) where I need to install Python and R?
>
> Background:
> - SLURM on Ubuntu 18.04
> - 1 headnode
> - 4 execute nodes
> - NFS shared drive among all nodes

Eric,

To echo the others: we have a /binaries NFS share that uses the standard Environment Modules software, so researchers can manipulate their $PATH on the fly with module load/module unload. That share is mounted on all the nodes.

For Python, I use virtualenvs, but instead of activating them, the path is changed by the module file. Personally, I find conda doesn't work very well in a shared environment; it's fine on a personal level.

For R, we have resorted to installing only the main point releases, because we have >700 libraries installed within R and I don't want to reinstall them every time. We also have packrat installed, so researchers can install their own libraries locally as well.

Cheers
L.
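[For readers of the archive: the setup Lachlan describes — a virtualenv whose bin/ directory is put on $PATH by a modulefile instead of by sourcing activate — might look roughly like the sketch below. The /binaries paths, version numbers, and module name are hypothetical examples, not Lachlan's actual layout.]

```tcl
#%Module1.0
## Hypothetical modulefile, e.g. /binaries/modulefiles/python/3.6
## Assumes a virtualenv was created on the NFS share with:
##   python3 -m venv /binaries/python/3.6-venv

set venv /binaries/python/3.6-venv

# Putting the venv's bin/ first on PATH is all "activation" really does;
# setting VIRTUAL_ENV mirrors what the activate script would export.
prepend-path PATH   $venv/bin
setenv       VIRTUAL_ENV $venv
```

A researcher on any node (since /binaries is mounted everywhere) would then run `module load python/3.6` to get that interpreter, and `module unload python/3.6` to drop it from $PATH again.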