
Aggregate a Spark data frame using an array of column names, retaining the names

I would like to aggregate a Spark data frame using an array of column names as input, and at the same time retain the original names of the columns.
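For example (an illustrative sketch; the data frame df, the grouping column id, and the columns x and y are assumed names), mapping each column name to an aggregate function runs the aggregation but renames the output columns to sum(x) and sum(y):

// Assumed setup; requires import spark.implicits._ in scope for toDF
val df = Seq((1, 2.0, 3.0), (1, 4.0, 5.0)).toDF("id", "x", "y")
val colNames = Array("x", "y")

// Map every column name to the "sum" aggregate; the results come back named sum(x), sum(y)
df.groupBy("id").agg(colNames.map(c => c -> "sum").toMap)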


This works but fails to preserve the names. Inspired by the answer found here, I unsuccessfully tried the following.
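A sketch of that kind of attempt (expanding the whole mapped sequence straight into agg, whose first parameter is a single Column rather than a vararg; sum comes from org.apache.spark.sql.functions):

import org.apache.spark.sql.functions.sum

// Hypothetical attempt: agg(expr: Column, exprs: Column*) expects a leading Column,
// so expanding the entire sequence with `: _*` into that slot fails to compile
val exprs = colNames.map(c => sum(c).alias(c))
df.groupBy($"id").agg(exprs: _*)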

error: no `: _*' annotation allowed here

It does work if I take just a single element.
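For instance, a single-column call along these lines (continuing the sketch above; colNames(0) stands in for whichever column is aggregated):

// Aggregate one column and alias the result back to its original name
df.groupBy($"id").agg(sum(colNames(0)).alias(colNames(0)))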


How can I make this happen for the entire array?


Just provide a sequence of columns with aliases:

import org.apache.spark.sql.functions.sum

val colNames: Seq[String] = ???
// One aliased aggregate per column; agg takes a Column plus varargs, hence head/tail
val exprs = colNames.map(c => sum(c).alias(c))
df.groupBy($"id").agg(exprs.head, exprs.tail: _*)
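Put together as a complete, runnable sketch (the SparkSession setup and the sample data below are assumptions for illustration):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.sum

val spark = SparkSession.builder().master("local[*]").appName("agg-keep-names").getOrCreate()
import spark.implicits._  // enables $"..." and toDF

val df = Seq((1, 2.0, 3.0), (1, 4.0, 5.0), (2, 1.0, 1.0)).toDF("id", "x", "y")
val colNames: Seq[String] = Seq("x", "y")

// One aliased aggregate per column, then split into head + varargs tail for agg
val exprs = colNames.map(c => sum(c).alias(c))
val aggregated = df.groupBy($"id").agg(exprs.head, exprs.tail: _*)

aggregated.printSchema()  // columns: id, x, y (the aliases keep the original names)

The alias call is what preserves each name; without it Spark would label the results sum(x) and sum(y).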