
java.lang.IllegalArgumentException: java.net.UnknownHostException: tmp

I am working on a Spark/Scala application built with sbt. My project tree is:

projectFilms/src/main/scala/AppFilms

I have 3 files in HDFS, under this directory:
hdfs/tmp/projetFilms/<my_3_Files>
When I run my code with "sbt run", it generates this error:

java.lang.IllegalArgumentException: java.net.UnknownHostException: tmp


and this:

[trace] Stack trace suppressed: run last compile:run for the full output.
ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext
java.lang.InterruptedException


This is my code:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd._

object appFilms {

  val conf = new SparkConf().setAppName("system of recommandation").setMaster("local[*]")
  val sc = new SparkContext(conf)

  def main(args: Array[String]) {

    val files = sc.wholeTextFiles("hdfs://tmp/ProjetFilm/*.dat")
    //val nbfiles = files.count
    println("Hello my application!")
    sc.stop()
  }
}


I can't read my files from HDFS through Spark, yet the following command displays their contents without any problem:

root@sandbox projectFilms# hadoop fs -cat /tmp/ProjetFilms/*


How can I read the content of all my files from HDFS, given that I always launch the application with the same command?

Please can you answer me!

Answer

The error IllegalArgumentException: java.net.UnknownHostException: tmp occurs because, in the URI you pass to wholeTextFiles, tmp is being interpreted as the hostname: in a URI of the form hdfs://host:port/path, whatever follows the two slashes is taken as the namenode host. Replace the value with hdfs:///tmp/ProjetFilm/*.dat (three slashes, i.e. an empty host, so the default filesystem from your Hadoop configuration is used).
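For reference, here is a minimal sketch of the corrected program. The commented-out alternative with an explicit namenode uses a placeholder address; take the real value from fs.defaultFS in your cluster's core-site.xml.

import org.apache.spark.{SparkConf, SparkContext}

object appFilms {

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("system of recommandation").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Three slashes: no host component, so Spark falls back to the default
    // filesystem (fs.defaultFS) from the Hadoop configuration on the classpath.
    val files = sc.wholeTextFiles("hdfs:///tmp/ProjetFilm/*.dat")

    // Alternatively, name the namenode explicitly; "namenode-host:8020" is a
    // placeholder, not a value taken from the question.
    // val files = sc.wholeTextFiles("hdfs://namenode-host:8020/tmp/ProjetFilm/*.dat")

    println(s"Number of files read: ${files.count()}")
    sc.stop()
  }
}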