satyambansal117 - 27 days ago
Scala Question

Transforming two DataFrames in Spark SQL

I have two DataFrames in Spark Scala, registered as the following two tables.

Table 1:

+-----+--------+
| id  | values |
+-----+--------+
|  0  |   v1   |
|  0  |   v2   |
|  1  |   v3   |
|  1  |   v1   |
+-----+--------+


Table 2:

+-----+----+----+----+
| id  | v1 | v2 | v3 |
+-----+----+----+----+
|  0  | a1 | b1 | -  |
|  1  | a2 | -  | c2 |
+-----+----+----+----+


I want to generate a new table using the above two tables.

Table 3:

+-----+--------+--------+
| id  | values | field  |
+-----+--------+--------+
|  0  |   v1   |   a1   |
|  0  |   v2   |   b1   |
|  1  |   v3   |   c2   |
|  1  |   v1   |   a2   |
+-----+--------+--------+


I am using Spark SQL in Scala.

Is it possible to produce this result by writing a SQL query or by using Spark functions on the DataFrames?

Answer

Here is sample code that will generate that output:

// In the spark-shell the implicits for toDF are pre-imported;
// in a standalone application you also need: import spark.implicits._
val df1 = sc.parallelize(Seq((0, "v1"), (0, "v2"), (1, "v3"), (1, "v1"))).toDF("id", "values")
val df2 = sc.parallelize(Seq((0, "a1", "b1", "-"), (1, "a2", "-", "c2"))).toDF("id", "v1", "v2", "v3")

// Join the two tables on id, then for each row look up the column
// whose name is stored in the "values" column.
val joinedDF = df1.join(df2, "id")
val resultDF = joinedDF.rdd.map { row =>
  val id = row.getAs[Int]("id")
  val values = row.getAs[String]("values")
  val field = row.getAs[String](values) // dynamic lookup: the column name comes from the data
  (id, values, field)
}.toDF("id", "values", "field")
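
Since you asked about doing this with Spark functions on DataFrames directly, here is a sketch of the same lookup built as a chained when expression, which avoids the round trip through the RDD API. It assumes the set of value columns (v1, v2, v3 here) is known up front:

import org.apache.spark.sql.functions._

// Build the equivalent of:
// CASE WHEN values = 'v1' THEN v1 WHEN values = 'v2' THEN v2 ... END
val valueCols = Seq("v1", "v2", "v3") // assumed to be known in advance
val fieldExpr = valueCols.tail.foldLeft(
  when(col("values") === valueCols.head, col(valueCols.head))
) { (expr, c) => expr.when(col("values") === c, col(c)) }

val resultDF2 = df1.join(df2, "id")
  .select(col("id"), col("values"), fieldExpr.as("field"))

This keeps everything in the DataFrame API, so Catalyst can optimize the whole plan instead of treating the per-row mapping as an opaque function.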

Testing the same code in the spark-shell:

scala> val df1=sc.parallelize(Seq((0,"v1"),(0,"v2"),(1,"v3"),(1,"v1"))).toDF("id","values")
df1: org.apache.spark.sql.DataFrame = [id: int, values: string]

scala> df1.show
+---+------+
| id|values|
+---+------+
|  0|    v1|
|  0|    v2|
|  1|    v3|
|  1|    v1|
+---+------+


scala> val df2=sc.parallelize(Seq((0,"a1","b1","-"),(1,"a2","-","c2"))).toDF("id","v1","v2","v3")
df2: org.apache.spark.sql.DataFrame = [id: int, v1: string ... 2 more fields]

scala> df2.show
+---+---+---+---+
| id| v1| v2| v3|
+---+---+---+---+
|  0| a1| b1|  -|
|  1| a2|  -| c2|
+---+---+---+---+


scala> val joinedDF=df1.join(df2,"id")
joinedDF: org.apache.spark.sql.DataFrame = [id: int, values: string ... 3 more fields]

scala> joinedDF.show
+---+------+---+---+---+                                                        
| id|values| v1| v2| v3|
+---+------+---+---+---+
|  1|    v3| a2|  -| c2|
|  1|    v1| a2|  -| b2|
|  0|    v1| a1| b1|  -|
|  0|    v2| a1| b1|  -|
+---+------+---+---+---+


scala> val resultDF=joinedDF.rdd.map{row=>
     | val id=row.getAs[Int]("id")
     | val values=row.getAs[String]("values")
     | val field=row.getAs[String](values)
     | (id,values,field)
     | }.toDF("id","values","field")
resultDF: org.apache.spark.sql.DataFrame = [id: int, values: string ... 1 more field]


scala> resultDF.show
+---+------+-----+
| id|values|field|
+---+------+-----+
|  1|    v3|   c2|
|  1|    v1|   a2|
|  0|    v1|   a1|
|  0|    v2|   b1|
+---+------+-----+
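
Since you also asked about a plain SQL query: assuming the two DataFrames are registered as temporary views named t1 and t2 (via Spark 2.x's createOrReplaceTempView; the view names here are just for illustration), an equivalent query would be a sketch like this:

df1.createOrReplaceTempView("t1")
df2.createOrReplaceTempView("t2")

// `values` is backtick-quoted because VALUES is a SQL keyword.
val resultSQL = spark.sql("""
  SELECT t1.id, t1.`values`,
         CASE t1.`values` WHEN 'v1' THEN t2.v1
                          WHEN 'v2' THEN t2.v2
                          WHEN 'v3' THEN t2.v3
         END AS field
  FROM t1 JOIN t2 ON t1.id = t2.id
""")

The CASE expression hard-codes the column names, so like the DataFrame sketch above it only works when the possible values are known in advance.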

I hope this solves your problem. Thanks!