Make42 - 2 months ago
Scala Question

How do I run a local Spark 2.x Session?

For testing purposes I want to run Spark 2.x in local mode. How can I do this? Can I do this at all? Currently I write


val spark = SparkSession.builder.getOrCreate()

and run the main in IntelliJ, but I get the error

org.apache.spark.SparkException: A master URL must be set in your configuration

I guess I need to have some local instance running or set a local mode or something like that. What should I do exactly?


You should configure a .master(...) on the builder before calling getOrCreate:

val spark = SparkSession.builder
  .master("local")
  .getOrCreate()

"local" means all of Spark's components (master, executors) will run locally within your single JVM running this code (very convenient for tests, pretty much irrelevant for real world scenarios). Read more about other "master" options here.