The singleton dependency seems exactly what I need!

However, does it really matter to the network if I upload five 1 GB files 
sequentially or all at once? I am not too savvy on how routers operate, but 
don't they already do some kind of load balancing to make sure enough 
bandwidth is available to other users?

On Monday, March 23, 2020, 11:36:46 AM EDT, Renfro, Michael <ren...@tntech.edu> 
wrote: 

Rather than configuring it to only run one job at a time, you can use job 
dependencies to make sure only one job of a particular type runs at a time. A 
singleton dependency [1, 2] should work for this. From [1]:

  #SBATCH --dependency=singleton --job-name=big-youtube-upload

in any job script would ensure that only one job with that job name runs 
at a time.
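As a rough sketch, each pi's hourly upload job could look something like the 
script below. The upload command and file path here are placeholders, not 
something from your setup:

```shell
#!/bin/bash
#SBATCH --dependency=singleton
#SBATCH --job-name=big-youtube-upload
#SBATCH --nodes=1

# Placeholder upload step -- substitute your actual upload command and file.
# Because every job submitted with this script shares the same job name and
# the singleton dependency, Slurm will hold each new job until the previous
# one with that name has finished, so only one upload runs at a time.
youtube-upload /path/to/local-file.mp4
```

Submitting this with sbatch from each pi (e.g. from an hourly cron entry) 
would queue the uploads, and Slurm would serialize them for you.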

[1] https://slurm.schedmd.com/sbatch.html
[2] https://hpc.nih.gov/docs/job_dependencies.html

-- 
Mike Renfro, PhD / HPC Systems Administrator, Information Technology Services
931 372-3601    / Tennessee Tech University

> On Mar 23, 2020, at 10:00 AM, Faraz Hussain <faraz_huss...@yahoo.com> wrote:
> 
> External Email Warning
> 
> This email originated from outside the university. Please use caution when 
> opening attachments, clicking links, or responding to requests.
> 
> ________________________________
> 
> I have a five node cluster of raspberry pis. Every hour they all have to 
> upload a local 1 GB file to YouTube. I want it so only one pi can upload at a 
> time so that network doesn't get bogged down.
> 
> Can slurm be configured to only run one job at a time? Or perhaps some other 
> way to accomplish what I want?
> 
> Thanks!
> 
