Make42 - 1 month ago
Scala Question

How do I run a local Spark 2.x Session?

For testing purposes I want to run Spark 2.x in local mode. Can I do this, and if so, how? Currently I write the following in my `main`:

val spark = SparkSession
.builder
.appName("RandomForestClassifierExample")
.getOrCreate()


and run the main in IntelliJ, but I get the error

org.apache.spark.SparkException: A master URL must be set in your configuration


I guess I need to have some local instance running or set a local mode or something like that. What should I do exactly?

Answer

You need to configure a master URL via .master(...) before calling getOrCreate:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder
  .master("local")
  .appName("RandomForestClassifierExample")
  .getOrCreate()

"local" means all of Spark's components (master, executors) will run locally within your single JVM running this code (very convenient for tests, pretty much irrelevant for real world scenarios). Read more about other "master" options here.
