Scala Question

Why does submitting a job fail with "NoSuchMethodError: scala.runtime.VolatileObjectRef.zero()Lscala/runtime/VolatileObjectRef;"?

I'm trying to submit a Spark job.

It starts this way:

import javax.xml.parsers.{SAXParser, SAXParserFactory}

import org.apache.spark
import org.apache.spark.graphx.{Graph, Edge, VertexId}
import org.apache.spark.rdd.{PairRDDFunctions, RDD}
import org.apache.spark.storage.StorageLevel
import org.apache.spark.{SparkContext, SparkConf}
import scala.util.Try
import org.apache.log4j.{Level, Logger}


object MyApp {

  def main(args: Array[String]) {

    val sparkConf = new SparkConf().setAppName("MyApp")
    val sc = new SparkContext(sparkConf)


And when I launch it, I get the following error:

App > Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.VolatileObjectRef.zero()Lscala/runtime/VolatileObjectRef;
App > at MyApp$.main(MyApp.scala)
App > at MyApp.main(MyApp.scala)
App > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
App > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
App > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
App > at java.lang.reflect.Method.invoke(Method.java:606)
App > at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
App > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
App > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


What am I doing wrong?

EDIT: Included the full stack trace.
I'm using Scala 2.10 and Spark 1.2.0.
What's weird is that my jar contains two classes. When I spark-submit one of them, it works (it's a 4-line dummy job), but when I run the longer one (about 40 lines), it fails with the error above.

Answer

zero() on scala.runtime.VolatileObjectRef was introduced in Scala 2.11. You probably have a library compiled against Scala 2.11 running on a Scala 2.10 runtime.
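One quick way to confirm the runtime half of the mismatch is to print the Scala version the cluster actually runs. A minimal sketch (the VersionCheck object name is just an illustration, not part of the question's code):

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints e.g. "version 2.10.4" -- the Scala runtime used by the driver.
    // Every class in the jar you submit (and every library bundled with it)
    // must be compiled against the same binary version, 2.10.x here.
    println(scala.util.Properties.versionString)
  }
}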

See the definitions of VolatileObjectRef in Scala 2.10 versus 2.11: the zero() factory method exists only in the 2.11 version.
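If the job is built with sbt, one way to rule this out is to pin the build to the 2.10 line and let %% select the matching Spark artifacts. A minimal build.sbt sketch, assuming sbt and the Spark 1.2.0 / Scala 2.10 combination from the question (the patch version 2.10.4 is only an example):

// build.sbt -- keep the compiler and every dependency on Scala 2.10,
// the binary version that Spark 1.2.0 ships with.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.10) to the artifact name,
  // so a _2.11 build of Spark cannot be pulled in by accident.
  "org.apache.spark" %% "spark-core"   % "1.2.0" % "provided",
  "org.apache.spark" %% "spark-graphx" % "1.2.0" % "provided"
)

Any third-party library that ends up inside the assembled jar also has to be a _2.10 artifact; a single _2.11 dependency is enough to trigger the NoSuchMethodError above.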