I am working on a Spark-Scala application built with sbt. My directory structure is:
projectFilms/src/main/scala/AppFilms
and my data files are on HDFS:
hdfs/tmp/projetFilms/<my_3_Files>
When I run the application, I get this error:
java.lang.IllegalArgumentException: java.net.UnknownHostException: tmp
[trace] Stack trace suppressed: run last compile:run for the full output.
ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException
This is my code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd._

object appFilms {

  val conf = new SparkConf().setAppName("system of recommendation").setMaster("local[*]")
  val sc = new SparkContext(conf)

  def main(args: Array[String]) {
    val files = sc.wholeTextFiles("hdfs://tmp/ProjetFilm/*.dat")
    //val nbfiles = files.count
    println("Hello my application!")
    sc.stop()
  }
}
The files do exist on HDFS; I can read them with:

root@sandbox projectFilms# hadoop fs -cat /tmp/ProjetFilms/*
The error IllegalArgumentException: java.net.UnknownHostException: tmp occurs because of the URI you pass to wholeTextFiles. In hdfs://tmp/ProjetFilm/*.dat, the segment right after hdfs:// is parsed as the NameNode hostname, so Hadoop tries to resolve a host named "tmp". Replace the value with hdfs:///tmp/ProjetFilm/*.dat (note the three slashes): the empty authority makes Hadoop fall back to the default filesystem configured in fs.defaultFS, and /tmp/ProjetFilm/*.dat is then treated as the path.