mithrix - 26 days ago
Scala Question

Round Down Double in Spark

I have some Cassandra data of type double that I need to round down to 1 decimal place in Spark.

The problem is how to extract it from Cassandra, convert it to a decimal, round it down to 1 decimal place, and then write it back to a Cassandra table. My rounding code is as follows:

BigDecimal(number).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble


This works great if the number going in is already a decimal, but I don't know how to convert the Double to a decimal before rounding. My Double also needs to be divided by 1000000 prior to rounding.

For example, 510999000 would become 510.999 before being rounded down to 510.9.
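Assuming the raw Cassandra value arrives as a `Long` (or anything convertible to `Double`), the whole step above can be sketched as a small helper — the name `roundDown` is mine, not from the original code:

```scala
// Sketch: scale the raw value down by 1,000,000, then truncate
// (round toward zero) to 1 decimal place via BigDecimal.
def roundDown(number: Long): Double =
  BigDecimal(number.toDouble / 1000000)
    .setScale(1, BigDecimal.RoundingMode.DOWN)
    .toDouble

roundDown(510999000L) // 510.9
```

`RoundingMode.DOWN` truncates toward zero rather than rounding to the nearest value, which matches the "round down" requirement.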

EDIT: I was able to get it to do what I wanted with the following command.

BigDecimal(number.toDouble / 1000000).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble


Not sure how good this is, but it works.
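One caveat with the command above: `number.toDouble / 1000000` performs the division in Double arithmetic, which can lose precision for very large values. A hedged alternative (my own variant, not from the question) keeps the value in `BigDecimal` until the very end:

```scala
// Alternative sketch: divide inside BigDecimal so no Double rounding
// error can creep in before the truncation step.
def roundDownExact(number: Long): Double =
  (BigDecimal(number) / 1000000)
    .setScale(1, BigDecimal.RoundingMode.DOWN)
    .toDouble

roundDownExact(510999000L) // 510.9
```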

Answer

The answer I was able to work with was:

BigDecimal(number.toDouble / 1000000).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble