Krishna Reddy - 1 month ago
Python Question

Spark - How many Executors and Cores are allocated to my spark job

Spark architecture revolves entirely around the concepts of executors and cores. I would like to see, in practice, how many executors and cores are running for my Spark application in a cluster.

I was trying to use the snippet below in my application, but with no luck.

val conf = new SparkConf().setAppName("ExecutorTestJob")
val sc = new SparkContext(conf)

Is there any way to get those values using the SparkContext object or the SparkConf object, etc.?


Scala :

getExecutorStorageStatus and getExecutorMemoryStatus on SparkContext both report the executors including the driver, so you can derive the executor count from them, as in the example snippet below.

/** Method that just returns the current active/registered executors
  * excluding the driver.
  * @param sc The spark context to retrieve registered executors.
  * @return a list of executors each in the form of host:port.
  */
def currentActiveExecutors(sc: SparkContext): Seq[String] = {
  // Keys of getExecutorMemoryStatus are "host:port" strings,
  // one per executor plus one entry for the driver.
  val allExecutors = sc.getExecutorMemoryStatus.map(_._1)
  val driverHost: String = sc.getConf.get("spark.driver.host")
  // Drop the entry whose host part matches the driver's host.
  allExecutors.filter(! _.split(":")(0).equals(driverHost)).toList
}
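
For example, assuming the helper above is in scope:

val executors = currentActiveExecutors(sc)
println(s"Active executors (excluding driver): ${executors.size}")
executors.foreach(println)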

The configured executor count is also available straight from the conf:

sc.getConf.getInt("spark.executor.instances", 1)

Similarly, you can get all the properties and print them as below; the cores information is in there as well.
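
For example, a minimal sketch that dumps every entry (getAll and toDebugString are standard SparkConf methods):

sc.getConf.getAll.foreach { case (key, value) => println(s"$key = $value") }

// Or as one formatted string:
println(sc.getConf.toDebugString)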

Mostly, spark.executor.cores gives the number of cores per executor, and spark.driver.cores the cores allotted to the driver.
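
As a sketch, reading those keys directly (the defaults passed to getInt here are assumptions; the effective defaults depend on your cluster manager):

val coresPerExecutor  = sc.getConf.getInt("spark.executor.cores", 1)   // assumed default
val driverCores       = sc.getConf.getInt("spark.driver.cores", 1)     // assumed default
val executorInstances = sc.getConf.getInt("spark.executor.instances", 1)
println(s"Total executor cores: ${executorInstances * coresPerExecutor}")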

Python :

The above methods, getExecutorStorageStatus and getExecutorMemoryStatus, are not implemented in the Python API.
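
They can still be reached through the Py4J gateway that PySpark exposes. A minimal sketch; note that sc._jsc is a private attribute, so this is a fragile workaround rather than a stable API:

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("ExecutorTestJob")
sc = SparkContext(conf=conf)

# Configured values are always readable from the conf;
# the "not set" fallback is just an illustrative default.
print(sc.getConf().get("spark.executor.instances", "not set"))
print(sc.getConf().get("spark.executor.cores", "not set"))

# Fragile workaround: call the JVM SparkContext through the private
# _jsc handle. size() counts registered executors including the driver.
num_executors = sc._jsc.sc().getExecutorMemoryStatus().size()
print("Registered executors (including driver):", num_executors)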