reza reza - 1 year ago 173
Scala Question

How to run external jar functions in spark-shell

I created a jar package from a project with this file tree:

(file-tree image not included)

Tester is a class with a print() method, and main contains an object that prints "Hi!" (taken from the Spark documentation). I built the jar with sbt successfully, and it worked fine with spark-submit.

Now I want to load it into spark-shell and use the Tester class to create objects there. I added the jar file to spark-defaults.conf, but:

scala> val t = new Tester();
<console>:23: error: not found: type Tester
val t = new Tester();
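For reference, here is a minimal sketch of the project described above. Only the name `Tester` and the "Hi!" output appear in the question; the method body and the `Main` object are assumed reconstructions.

```scala
// Tester.scala - hypothetical reconstruction of the class from the question
class Tester {
  // print() writes a greeting to stdout, as described in the question
  def print(): Unit = println("Hi!")
}

// Main.scala - hypothetical entry point used with spark-submit
object Main {
  def main(args: Array[String]): Unit = {
    val t = new Tester()
    t.print() // prints "Hi!"
  }
}
```

Building this with sbt produces a jar whose classes are only visible in spark-shell once the jar is put on the shell's classpath, which is what the answer below addresses.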

Answer

You can try providing the jars with the --jars argument, as below:

./spark-shell --jars pathOfJarsCommaSeparated
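Once spark-shell starts with the jar on its classpath, the class should resolve. A session sketch (the jar path is hypothetical):

```
$ ./spark-shell --jars /path/to/tester.jar
scala> val t = new Tester()
scala> t.print()   // should print "Hi!" per the question's description
```

Note that --jars also distributes the jar to the executors, so the class is usable inside tasks as well as on the driver.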

Alternatively, add the following line to your spark-defaults.conf (make sure you have removed the .template suffix from the file name first). Note that spark.driver.extraClassPath entries are joined with the JVM classpath separator (':' on Linux), not commas:

spark.driver.extraClassPath  pathOfJarsColonSeparated
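Note that extraClassPath entries use the JVM classpath separator rather than commas. A fuller spark-defaults.conf sketch (jar paths are placeholders):

```
# spark-defaults.conf sketch -- paths are hypothetical
# extraClassPath uses the JVM classpath separator (':' on Linux, ';' on Windows)
spark.driver.extraClassPath    /opt/jars/tester.jar:/opt/jars/dep.jar
spark.executor.extraClassPath  /opt/jars/tester.jar:/opt/jars/dep.jar
# spark.jars, by contrast, takes a comma-separated list and also ships
# the jars to the executors for you
spark.jars                     /opt/jars/tester.jar,/opt/jars/dep.jar
```

Using spark.jars is usually the simpler option, since it both adds the jar to the driver classpath and distributes it to executors.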