JKC - 6 months ago
Scala Question

Updating Dataframe Column name in Spark - Scala while performing Joins

I have two dataframes aaa_01 and aaa_02 in Apache Spark 2.1.0.

I perform an inner join on these two dataframes, selecting a few columns from each to appear in the output.

The join works perfectly fine, but the output dataframe keeps the column names unchanged from the input dataframes. This is where I am stuck: I need new column names in my output dataframe instead of the original ones.

Sample code is given below for reference:

DF1.alias("a").join(DF2.alias("b"),DF1("primary_col") === DF2("primary_col"), "inner").select("a.col1","a.col2","b.col4")

I am getting the output dataframe with the column names "col1, col2, col4". I tried to modify the code as below, but in vain:

DF1.alias("a").join(DF2.alias("b"),DF1("primary_col") === DF2("primary_col"), "inner").select("a.col1","a.col2","b.col4" as "New_Col")

Any help is appreciated. Thanks in advance.


I browsed and found similar posts, listed below, but none of them answer my question:

Updating a dataframe column in spark

Renaming Column names of a Data frame in spark scala

The answers in this post: Spark Dataframe distinguish columns with duplicated name are not relevant to me, since they relate more to PySpark than to Scala, and they explain how to rename all the columns of a dataframe, whereas I only need to rename one or a few columns.

Answer Source

You want to rename columns of the dataset; the fact that your dataset comes from a join changes nothing. You can try any example from that answer, for instance:

DF1.alias("a").join(DF2.alias("b"), DF1("primary_col") === DF2("primary_col"), "inner")
  .select(col("a.col1"), col("a.col2"), col("b.col4").as("New_Col"))

The key difference from your attempt is that .as("New_Col") is called on a Column (obtained via col("b.col4")), not on a plain String, which is why "b.col4" as "New_Col" did not compile.
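Put together, here is a minimal self-contained sketch of the approach. The toy data, the column names (primary_col, col1, col2, col4), and the local SparkSession setup are assumptions for illustration only; only the join-then-select-with-alias pattern comes from the answer.

```scala
// Script-style Scala sketch; requires spark-sql on the classpath.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder()
  .master("local[*]")            // local mode, for illustration
  .appName("rename-joined-cols")
  .getOrCreate()
import spark.implicits._

// Toy stand-ins for aaa_01 and aaa_02 (hypothetical data)
val DF1 = Seq((1, "a1", "a2")).toDF("primary_col", "col1", "col2")
val DF2 = Seq((1, "b4")).toDF("primary_col", "col4")

val joined = DF1.alias("a")
  .join(DF2.alias("b"), DF1("primary_col") === DF2("primary_col"), "inner")
  // .as("New_Col") on a Column renames just that one output column
  .select(col("a.col1"), col("a.col2"), col("b.col4").as("New_Col"))

joined.show()
```

An equivalent alternative is to rename after the select with joined.withColumnRenamed("col4", "New_Col"); the inline .as keeps the rename in one step.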