Raj Raj - 11 months ago 58
Scala Question

How to get default property values in Spark

I am using this version of Spark:

I want to check a few default properties, so I ran the following statement in the Spark shell:

scala> sqlContext.getConf("spark.sql.hive.metastore.version")

I was expecting the call to getConf to return the default value described in this link, but instead I got the exception below:

java.util.NoSuchElementException: spark.sql.hive.metastore.version
at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)

Am I retrieving the properties in the right way?

Answer Source

You can use

sqlContext.getAllConfs

which returns all values that have been explicitly set; some defaults, however, exist only in the code. Your specific example is one of those cases:

getConf(HIVE_METASTORE_VERSION, hiveExecutionVersion)

where the default is defined as:

val hiveExecutionVersion: String = "0.13.1"

So getConf first tries to pull the metastore version from the config and only falls back to the hard-coded default inside Spark's own code, which is why the key is never listed in the conf itself and the one-argument lookup throws.
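This fallback behaviour can be sketched outside Spark. MiniConf below is a hypothetical stand-in for SQLConf, written just for illustration; it is not Spark code:

```scala
// Minimal sketch (not Spark's actual class) of how SQLConf resolves a key:
// explicitly-set values live in a map, while some defaults exist only as
// constants in the code, so the one-argument lookup can throw.
class MiniConf {
  private val settings = scala.collection.mutable.Map[String, String]()

  def setConf(key: String, value: String): Unit = settings(key) = value

  // Mirrors sqlContext.getConf(key): throws when the key was never set.
  // getOrElse takes its default by name, so the throw only fires on a miss.
  def getConf(key: String): String =
    settings.getOrElse(key, throw new java.util.NoSuchElementException(key))

  // Mirrors getConf(key, default): falls back to the supplied default.
  def getConf(key: String, default: String): String =
    settings.getOrElse(key, default)
}
```

If your Spark version exposes the two-argument overload on sqlContext, passing the default yourself, e.g. sqlContext.getConf("spark.sql.hive.metastore.version", "0.13.1"), avoids the NoSuchElementException entirely.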