
How to get default property values in Spark

I am using this version of Spark: spark-1.4.0-bin-hadoop2.6. I want to check a few default properties, so I ran the following statement in spark-shell:


scala> sqlContext.getConf("spark.sql.hive.metastore.version")


I was expecting the call to getConf to return a value of 0.13.1, as described in this link, but instead I got the exception below:

java.util.NoSuchElementException: spark.sql.hive.metastore.version
at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)
at org.apache.spark.sql.SQLConf$$anonfun$getConf$1.apply(SQLConf.scala:283)


Am I retrieving the properties in the right way?

Answer

You can use

sc.getConf.toDebugString

or

sqlContext.getAllConfs

which will return all values that have been explicitly set; some defaults, however, exist only in the code. A quick way to inspect both from the shell is sketched below.
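For instance, a minimal spark-shell sketch against a 1.4.x build that dumps both sets of explicitly set values:

// SparkConf entries, one "key=value" pair per line.
println(sc.getConf.toDebugString)

// SQL-level configuration entries that have been set on this SQLContext.
sqlContext.getAllConfs.foreach { case (k, v) => println(s"$k=$v") }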

In your specific example, the lookup in the Spark source is:

getConf(HIVE_METASTORE_VERSION, hiveExecutionVersion)

where the fallback is the hard-coded execution version:

val hiveExecutionVersion: String = "0.13.1"

So getConf will attempt to pull the metastore version from the config and fall back to that default, but the default is never listed in the conf itself.
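If you just want the effective value without hitting the exception, SQLContext.getConf also accepts an explicit default; here is a minimal sketch, where the "0.13.1" literal simply mirrors hiveExecutionVersion above:

// Passing a default avoids the NoSuchElementException for keys
// that have never been explicitly set.
val metastoreVersion =
  sqlContext.getConf("spark.sql.hive.metastore.version", "0.13.1")
println(metastoreVersion)  // prints 0.13.1 unless the key was overridden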
