Aviral Kumar - 2 months ago
Scala Question

Hive table not found in Spark-SQL

I am facing a weird problem.
I am trying to query a table from my Spark SQL Java code with:

JavaSparkContext js = new JavaSparkContext();
SQLContext sc = new SQLContext(js);

DataFrame mainFile = sc.sql("Select * from db.table");


It gives me a table not found exception.
But when I run the same query in spark-shell using Scala, it works fine: the table gets accessed and I can print out the data as well.
Any inputs on this issue?

Answer

The spark-shell provides a HiveContext out of the box, which is why the query works there. A plain SQLContext does not talk to the Hive metastore, so `db.table` cannot be resolved from your Java code. To use HiveContext in a Java application, add the spark-hive dependency and create a HiveContext instead of a SQLContext. Please refer to http://spark.apache.org/docs/1.6.2/sql-programming-guide.html#hive-tables

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
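With that dependency on the classpath, the Java side might look like the sketch below (Spark 1.6 API; the query is the one from the question, while the app name and the SparkConf setup are illustrative assumptions for your environment):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.hive.HiveContext;

public class HiveQueryExample {
    public static void main(String[] args) {
        // App name is a placeholder; master/config come from spark-submit
        SparkConf conf = new SparkConf().setAppName("HiveQueryExample");
        JavaSparkContext js = new JavaSparkContext(conf);

        // HiveContext (unlike SQLContext) resolves tables via the Hive metastore.
        // It wraps the underlying SparkContext, obtained here with js.sc().
        HiveContext hc = new HiveContext(js.sc());

        // Same query as before; "db.table" is now looked up in the metastore
        DataFrame mainFile = hc.sql("SELECT * FROM db.table");
        mainFile.show();

        js.stop();
    }
}
```

Note that the application also needs access to your `hive-site.xml` (e.g. on the classpath or in Spark's conf directory) so that it points at the same metastore the spark-shell uses; otherwise Spark falls back to a local metastore and the table will still not be found.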