I'm trying to register some UDFs (user-defined functions) for Spark SQL, and when I try to compile I get a type mismatch error for the following code:
csc.udf.register("DATEADD", (datePart: String, number: Int, date: Timestamp) => {
  val time: Calendar = Calendar.getInstance
  datePart match {
    case "Y" => time.add(Calendar.YEAR, number)
    case "M" => time.add(Calendar.MONTH, number)
    case "D" => time.add(Calendar.DATE, number)
    case "h" => time.add(Calendar.HOUR, number)
    case "m" => time.add(Calendar.MINUTE, number)
    case "s" => time.add(Calendar.SECOND, number)
    case _ => 0
  }
}: Int)
[error] /vagrant/SQLJob/src/main/scala/sqljob/context/CassandraSQLContextFactory.scala:111: type mismatch;
[error] found : Unit
[error] required: Int
[error] case "Y" => time.add(Calendar.YEAR: Int, number)
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
[error] Total time: 7 s, completed Jul 2, 2015 4:19:29 PM
The type of your function is (String, Int, Timestamp) => Int, so it must return an Int. In every match arm except the default, Calendar.add is the last expression and therefore the value the function returns, but its return type is Unit, not Int (it mutates the Calendar in place). That's why the compiler reports "found: Unit, required: Int".
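One way to fix it is to keep the mutation but make the last expression the value you actually want to return. A minimal sketch (the helper name dateAdd and the choice to return a Timestamp rather than an Int are my assumptions, not from the original code; note it also calls setTime so the arithmetic starts from the input date instead of "now"):

```scala
import java.sql.Timestamp
import java.util.Calendar

// Hypothetical corrected version: resolve the Calendar field first,
// mutate the Calendar (add returns Unit), then return a Timestamp
// built from the result, so the function's last expression has the
// declared return type.
def dateAdd(datePart: String, number: Int, date: Timestamp): Timestamp = {
  val time: Calendar = Calendar.getInstance
  time.setTime(date) // start from the input date, not the current time
  val field = datePart match {
    case "Y" => Calendar.YEAR
    case "M" => Calendar.MONTH
    case "D" => Calendar.DATE
    case "h" => Calendar.HOUR
    case "m" => Calendar.MINUTE
    case "s" => Calendar.SECOND
    case other => throw new IllegalArgumentException(s"Unknown date part: $other")
  }
  time.add(field, number)
  new Timestamp(time.getTimeInMillis)
}
```

You could then register it with something like `csc.udf.register("DATEADD", dateAdd _)`, assuming `csc` is your SQLContext as in the question.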