For testing purposes I want to run Spark 2.x in local mode. How can I do this? Can I do this at all? Currently I write:

    val spark = SparkSession

and get the following error:

    org.apache.spark.SparkException: A master URL must be set in your configuration
You should configure `.master(..)` before calling `getOrCreate()`:
    val spark = SparkSession.builder
      .master("local")
      .appName("RandomForestClassifierExample")
      .getOrCreate()
`"local"` means that all of Spark's components (master, executors) run locally within the single JVM running this code, which is very convenient for tests but pretty much irrelevant for real-world scenarios. Read more about the other "master" options here.
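The master URL also controls how many worker threads the local session gets. A minimal, self-contained sketch (the object name and app name are arbitrary placeholders) illustrating the common local variants:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical test harness; only the master URL matters here.
object LocalModeExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder
      // "local"    -> one worker thread (no parallelism)
      // "local[4]" -> four worker threads
      // "local[*]" -> one worker thread per available core
      .master("local[*]")
      .appName("LocalModeExample")
      .getOrCreate()

    // Quick sanity check that the in-JVM session actually works.
    println(spark.range(100).count())

    spark.stop()
  }
}
```

For unit tests, `local[2]` is a common choice: it is cheap to spin up yet still exercises code paths that only appear with more than one task running at a time.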