Not sure if it works, but you can try using "\${SLURM_ARRAY_JOB_ID}".
The "\" escapes early evaluation of the environment variable, so it is
expanded inside the job by Slurm rather than by the submitting shell.
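A minimal sketch of the escaping idea, using plain echo instead of sbatch
(the variable name X and the commands are illustrative, not from your
script): in a double-quoted shell string, "$VAR" is expanded immediately by
the shell that runs the subprocess, while "\$VAR" is passed through as a
literal for Slurm to expand later inside the job.

```python
import subprocess

# Expanded immediately: the submitting shell substitutes the value of X.
expanded = subprocess.run(
    'X=now; echo "$X"', shell=True, capture_output=True, text=True
).stdout.strip()

# Escaped with "\$": the backslash keeps the variable literal, so the
# subprocess shell prints "$X" unchanged -- the same way \${SLURM_ARRAY_JOB_ID}
# would survive until Slurm defines and expands it at job runtime.
literal = subprocess.run(
    'X=now; echo "\\$X"', shell=True, capture_output=True, text=True
).stdout.strip()

print(expanded)  # the shell already substituted the value: now
print(literal)   # the backslash kept it literal: $X
```

The same backslash trick applies when you build the sbatch script text as a
Python string: any SLURM_* variable that must be evaluated inside the job
needs to reach sbatch unexpanded.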

On Thu, Nov 10, 2022 at 6:53 PM Chase Schuette <chaseschue...@gmail.com> wrote:
>
> Due to needing to support existing HPC workflows, I need to pass a 
> bash script within a Python subprocess. It was working great with OpenPBS, 
> but now I need to convert it to SLURM. I have it largely working in SLURM 
> hosted on Ubuntu 20.04, except that the job array is not being populated.
>
> I've read from another user that Bash may try to evaluate variables before 
> they are defined by the SLURM job. I've also seen that errors in SBATCH 
> directives, such as a non-alphanumeric job name, can cause SLURM to stop 
> evaluating the following directives. Can someone advise me on when SLURM 
> populates these variables?
>
> I have a StackOverflow post here 
> https://stackoverflow.com/questions/74323372/slurm-array-job-bash-scripting-within-python-subprocess
>
> Regards,
> --
>
> Chase Schuette Pronouns: He/Him/His | Caterpillar
>
> Autonomy High Performance Computing | Iowa State University Relations
>
> Mobile: 507-475-1949 | Email: chase.schue...@gmail.com | LinkedIn
> Schedule 15mins here: https://calendly.com/chaseschuette
