
How to change SparkContext properties in Interactive PySpark session

How can I change spark.driver.maxResultSize in the PySpark interactive shell? I have used the following code:

from pyspark import SparkConf, SparkContext
conf = (SparkConf()
        .set("spark.driver.maxResultSize", "10g"))

but it gives me the error

AttributeError: 'SparkConf' object has no attribute '_get_object_id'


What you're seeing is that the SparkConf isn't a Java object. The error happens because the SparkConf is being passed to SparkContext as the first positional parameter (which SparkContext interprets as the master URL). If you instead do sc = SparkContext(conf=conf), it should use your configuration. That being said, you might be better off just starting a regular Python program rather than stopping the default SparkContext and restarting it, but you'll need to use the named-parameter technique to pass in the conf object either way.
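
For reference, here is a minimal sketch of the named-parameter approach inside the interactive shell, assuming sc is the SparkContext that the pyspark shell created for you:

from pyspark import SparkConf, SparkContext

# Stop the SparkContext the shell created automatically
sc.stop()

# Build the desired configuration
conf = (SparkConf()
        .set("spark.driver.maxResultSize", "10g"))

# Pass it by name so it isn't mistaken for the master URL
sc = SparkContext(conf=conf)

In a standalone script you would skip the sc.stop() and simply create the context with conf=conf from the start; either way, passing conf by keyword is what avoids the _get_object_id error.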