Hi everyone,

Spark supports concurrent job execution within a single application by using
pools and Spark's fair scheduler (not to be confused with YARN's fair
scheduler).
link:
https://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
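
For reference, this is roughly the setup I have in mind, per the page above
(the pool name and file path below are just placeholders on my side):

    <!-- fairscheduler.xml, referenced via spark.scheduler.allocation.file -->
    <allocations>
      <pool name="pool1">
        <schedulingMode>FAIR</schedulingMode>
        <weight>1</weight>
        <minShare>2</minShare>
      </pool>
    </allocations>

    # enable the fair scheduler at submit time
    spark-submit --master yarn \
      --conf spark.scheduler.mode=FAIR \
      --conf spark.scheduler.allocation.file=/path/to/fairscheduler.xml \
      ...

    # inside the application, bind jobs submitted from a thread to a pool
    sc.setLocalProperty("spark.scheduler.pool", "pool1")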

Is this feature supported when YARN is used as the cluster manager? Are there
any special configurations I need to set, or common pitfalls I should be
aware of?

Thanks,
Anton
