Scala Question

How to run external jar functions in spark-shell

I created a jar package from a project with this file tree:

build.sbt
src/main
src/main/scala
src/main/scala/Tester.scala
src/main/scala/main.scala


where Tester is a class with one function (named print()) and main has an object that runs it and prints "Hi!" (based on the Spark documentation).
I built the jar file with sbt successfully, and it worked well with spark-submit.
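
For reference, the two source files are roughly like this (a simplified sketch; the real main may also set up a SparkContext as in the Spark quick-start example):

// src/main/scala/Tester.scala (sketch)
class Tester {
  // the method I want to call from spark-shell
  def print(): Unit = println("Hi!")
}

// src/main/scala/main.scala (sketch)
object main {
  def main(args: Array[String]): Unit = {
    new Tester().print()   // prints "Hi!" when run with spark-submit
  }
}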

Now I want to add it to spark-shell and use the Tester class to create objects, etc.
I added the jar file to spark-defaults.conf, but:

scala> val t = new Tester();
<console>:23: error: not found: type Tester
val t = new Tester();

Answer

You can try passing your jars to spark-shell with the --jars argument, as below:

./spark-shell --jars pathOfJarsCommaSeparated
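
For example, assuming sbt produced the jar at target/scala-2.11/tester_2.11-1.0.jar (the actual name depends on your build.sbt), you would start the shell with that path and the class should resolve; if Tester lives in a package, import it first:

./spark-shell --jars /full/path/to/target/scala-2.11/tester_2.11-1.0.jar

scala> val t = new Tester()   // resolves now that the jar is on the classpath
scala> t.print()              // prints "Hi!"

Multiple jars are passed as a single comma-separated list after --jars.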

Or you can add the following configuration to your spark-defaults.conf (make sure you have removed the .template suffix from the end of the spark-defaults.conf.template file name):

spark.driver.extraClassPath  pathOfJarsColonSeparated
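
As a concrete sketch (the jar path below is only a placeholder), the entries in conf/spark-defaults.conf could look like the following. Note that extraClassPath is a normal Java classpath, so multiple jars are joined with : on Linux/macOS (; on Windows) rather than commas, and if the class is also used inside executor tasks you would add spark.executor.extraClassPath as well, or use spark.jars, which does take a comma-separated list:

spark.driver.extraClassPath     /opt/jars/tester_2.11-1.0.jar
spark.executor.extraClassPath   /opt/jars/tester_2.11-1.0.jar
# alternatively, ship the jar with the application
spark.jars                      /opt/jars/tester_2.11-1.0.jar

After editing the file, restart spark-shell so the new classpath is picked up.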