Balaji Reddy - 1 month ago
Scala Question

Is FAIR available for Spark Standalone cluster mode?

I have a 2-node cluster running the Spark standalone cluster manager. I'm submitting more than one job through the same sc using Scala multithreading. What I found is that my jobs are scheduled one after another because of the default FIFO scheduling, so I tried to switch to FAIR scheduling:

conf.set("spark.scheduler.mode", "FAIR")
conf.set("spark.scheduler.allocation.file", sys.env("SPARK_HOME") + "/conf/fairscheduler.xml")
sc.setLocalProperty("spark.scheduler.pool", "mypool")

and this is the pool definition in my fairscheduler.xml:

<pool name="mypool">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
</pool>
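
Roughly, this is how I'm triggering the jobs from multiple threads (a simplified sketch; the job bodies are placeholders for my real work):

// Both threads share the same SparkContext (sc) and submit to the same pool
val threads = (1 to 2).map { i =>
  new Thread {
    override def run(): Unit = {
      sc.setLocalProperty("spark.scheduler.pool", "mypool")
      sc.parallelize(1 to 1000000).map(_ * 2).count() // placeholder long-running job
    }
  }
}
threads.foreach(_.start())
threads.foreach(_.join())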


Even after setting these properties, my jobs are still handled in FIFO order.
Is FAIR scheduling available for Spark standalone cluster mode? Is there a
page where it's described in more detail? I can't seem to find much about
FAIR and standalone in the Job Scheduling docs. I'm following this SO question. Am I missing anything here?

Answer

I don't think standalone mode is the problem. You described creating only one pool, so I think the issue is that you need at least one more pool, and to assign each job to a different pool.

FAIR scheduling happens across pools; within a single pool, jobs run in FIFO order by default.

This is based on the documentation here: https://spark.apache.org/docs/latest/job-scheduling.html#default-behavior-of-pools
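
For example (a sketch with hypothetical pool names; adapt the job bodies to your own), fairscheduler.xml would define one pool per concurrent job:

<allocations>
  <pool name="pool1">
    <schedulingMode>FIFO</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
  <pool name="pool2">
    <schedulingMode>FIFO</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
</allocations>

and each thread would assign its jobs to a different pool before triggering an action:

// spark.scheduler.pool is a thread-local property, so set it inside the submitting thread
def runInPool(pool: String)(job: => Unit): Thread = {
  val t = new Thread {
    override def run(): Unit = {
      sc.setLocalProperty("spark.scheduler.pool", pool)
      job
    }
  }
  t.start()
  t
}

val t1 = runInPool("pool1") { sc.parallelize(1 to 1000000).count() }
val t2 = runInPool("pool2") { sc.parallelize(1 to 1000000).count() }
Seq(t1, t2).foreach(_.join())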