I want to import a few tables into a DB that (at the moment) doesn't support decimal precision greater than 18. So I tried ALTER TABLE in Spark like this to lower the precision. How the values are rounded isn't important.
hiveContext.sql("ALTER TABLE rt_rel_acc_acc CHANGE REL001 REL001 decimal(18, 9)")
hiveContext.read.table("rt_rel_acc_acc").write.jdbc(url, "rt_rel_acc_acc", properties)
But the write still fails with:

PSQLException: ERROR: invalid numeric size '30,15'
You can cast any column in the DataFrame, like this:
val df2 = df.withColumn("New_Column_Name", df.col("Actual_Column_name").cast("Decimal(18, 9)"))
In this case you will end up with both the old and the new column. You can drop the old one (or select only the columns you need) and then apply saveAsTable, or use the result for other tables.
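Putting it together for your case, here is a minimal sketch: cast, drop the original column, rename, and write over JDBC. It assumes the table and column names from your question and that `url` and `properties` are already defined; `DecimalType(18, 9)` is the typed equivalent of the string cast above.

```scala
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.DecimalType

// Read the Hive table, cast the wide decimal down to (18, 9),
// then swap the casted column in place of the original.
val df = hiveContext.read.table("rt_rel_acc_acc")

val df2 = df
  .withColumn("REL001_casted", col("REL001").cast(DecimalType(18, 9)))
  .drop("REL001")
  .withColumnRenamed("REL001_casted", "REL001")

// Write the narrowed schema to the target DB.
df2.write.jdbc(url, "rt_rel_acc_acc", properties)
```

If several columns are affected, you can loop over `df.schema.fields`, check for `DecimalType` fields whose precision exceeds 18, and apply the same cast to each.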
Let me know if this works. If I've misunderstood the problem, I'd be happy to try other solutions based on your response.
Thank You, Regards, Srini.