Todor Markov - 4 months ago
Java Question

Importing my jar to spark shell

I have a simple Scala Maven module which is part of a larger project (I created it as described here):

package com.myorg.simplr

import [...]

case class Simplr() {
  // class code
}

I am trying to use this class in spark shell, so I built a jar file "simplr-1.0.jar" and launched the spark shell with --jars simplr-1.0.jar.
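For reference, the build-and-launch sequence described above can be sketched as follows; the target/ path is the usual Maven output location and is an assumption here:

```shell
# Build the module jar with Maven, then hand it to spark-shell.
mvn package                               # produces target/simplr-1.0.jar (assumed path)
spark-shell --jars target/simplr-1.0.jar  # adds the jar to driver and executor classpaths
```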

Then, when I try to import, I get the following

scala> import com.myorg.simplr.Simplr
<console>:25: error: object myorg is not a member of package com
import com.myorg.simplr.Simplr

How can I make the import work?

I used maven to build, and here's my pom.xml:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns=""

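The pom.xml above is truncated in the post. For context, a minimal sketch of a Scala Maven module that would produce such a jar might look roughly like this; the coordinates and plugin version are illustrative assumptions, not taken from the question:

```xml
<!-- Illustrative sketch only; the original pom is truncated above.
     Coordinates and plugin versions are assumptions. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.myorg</groupId>
  <artifactId>simplr</artifactId>
  <version>1.0</version>
  <build>
    <plugins>
      <!-- scala-maven-plugin compiles Scala sources so the compiled
           classes end up under com/myorg/simplr/ inside the jar -->
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>4.8.1</version>
        <executions>
          <execution>
            <goals><goal>compile</goal></goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
```

Without a Scala compiler plugin, Maven will build the jar but skip the .scala sources, which produces exactly the "object myorg is not a member of package com" symptom at import time.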
Please check the points below and it should work:

1. Start the spark shell with the jar on the classpath: ./spark-shell --jars jar_path
2. Make sure the class file is actually in the jar under the same package you import — open the jar and check.
3. After Spark starts, go to http://localhost:4040/environment/ and verify your jar appears under the classpath entries.
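Point 2 can be checked from the command line; this sketch assumes the jar name from the question:

```shell
# List the jar contents and look for the compiled class.
# The package com.myorg.simplr must appear as a directory path,
# i.e. a correct jar contains an entry like:
#   com/myorg/simplr/Simplr.class
jar tf simplr-1.0.jar | grep Simplr
```

If no such entry appears, the jar was built without compiling the Scala sources, and the import will fail no matter how the shell is launched.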