Sasha - 9 months ago
Scala Question

spark: how to include dependencies with build/sbt compile

I am new to Spark but am trying to do some development. I am following the "Reducing Build Times" instructions from the Spark developer page. After creating the normal assembly, I wrote some classes that depend on one specific jar. I can test my package in the spark-shell, where I am able to include that jar, but the problem lies in actually compiling my code. What I want is to include that jar when compiling my added package (with build/sbt compile). Could I do that by adding a path to my jar in one of the build configuration files, and if so, how?
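For reference, one way to make an extra jar visible inside the spark-shell session (the launch-time --jars flag, which exists in Spark 1.4) looks like this; the jar path below is a placeholder, not taken from the question:

```
bin/spark-shell --jars /path/to/your-extra.jar
```

This only affects the shell's classpath at runtime, which is why it does not help with build/sbt compile.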

(Side note: I do not yet want to include the jar in the assembly, because I am still making changes to it as I go, so rebuilding the assembly each time would be inconvenient. I am using Spark 1.4.)

Any help is appreciated!

Answer

Based on the answer in the comments above, it looks like you are trying to add your jar as a dependency of the mllib project while you do development on mllib itself. You can accomplish this by modifying the pom.xml file in the mllib directory within the Spark distribution.

You can find instructions on how to add a local file as a dependency here - I haven't used this approach to include a local file as a dependency myself, but I think it should work.
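As a rough sketch of what such an entry might look like, here is a dependency block using Maven's system scope. All of the coordinates (groupId, artifactId, version) and the systemPath are hypothetical placeholders; adjust them to match your actual jar:

```xml
<!-- Hypothetical local-jar dependency for mllib/pom.xml.
     system scope resolves the jar from a fixed path on disk
     instead of a repository, so it is easy to swap out while
     the jar is still changing. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-local-lib</artifactId>
  <version>0.1.0-SNAPSHOT</version>
  <scope>system</scope>
  <systemPath>${basedir}/../lib/my-local-lib.jar</systemPath>
</dependency>
```

An alternative, if you prefer to avoid system scope, is to install the jar into your local Maven repository (mvn install:install-file) and declare it as an ordinary dependency; either way, build/sbt compile should then see it, since Spark's sbt build reads the Maven POMs.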