dreddy dreddy - 10 months ago
Scala Question

Postgres bool_or equivalent in Spark

I'm trying to convert a Postgres query to Spark:

select
  bool_or(case when col_1 is null then false else true end)
from fct_table

Below is the DataFrame I am trying to work with, which has col_1 and col_2:

val df = spark.table("fct_table")
bool_or(when(col("col_1").isNull, false).otherwise(true))

I'm doing this in Scala, and bool_or does not exist as an aggregate function in Spark.
Any help is appreciated.

Answer Source

With sum:

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{count, lit, sum, when}

def bool_or(expr: Column): Column = sum(expr.cast("integer")) > 0

With count:

def bool_or(expr: Column): Column = count(when(expr, lit(1))) > 0
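Both helpers rely on the same trick: map each row's boolean to a numeric marker and check whether any marker appeared. A minimal plain-Scala sketch of that logic, using a Seq[Boolean] as a stand-in for the column values (the object and method names here are illustrative, not Spark API):

```scala
// Plain-Scala sketch of the two aggregation tricks above.
// A Seq[Boolean] stands in for the column; names are illustrative only.
object BoolOrSketch {
  // sum trick: cast each boolean to 0/1 and test whether the sum is positive
  def boolOrViaSum(xs: Seq[Boolean]): Boolean =
    xs.map(b => if (b) 1 else 0).sum > 0

  // count trick: count the true rows and test whether any were seen
  def boolOrViaCount(xs: Seq[Boolean]): Boolean =
    xs.count(identity) > 0
}
```

One edge case to keep in mind: Postgres bool_or returns NULL over an empty group, and Spark's sum(...) > 0 is likewise null there, while this plain-Scala sketch simply returns false for an empty input.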