mithrix - 1 year ago 173

Scala Question

I have some Cassandra data of type double that I need to round down to 1 decimal place in Spark.

The problem is how to extract it from Cassandra, convert it to a decimal, round it down to 1 decimal place, and then write it back to a table in Cassandra. My rounding code is as follows:

`BigDecimal(number).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble`

This works great if the number going in is a decimal, but I don't know how to convert the double to a decimal before rounding. My double also needs to be divided by 1000000 prior to rounding.

For example, 510999000 would be 510.999 before being rounded down to 510.9.

`BigDecimal(number.toDouble / 1000000).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble`

Not sure how good this is, but it works.


Answer Source

The answer I was able to work with was:

```
BigDecimal(number.toDouble / 1000000).setScale(1, BigDecimal.RoundingMode.DOWN).toDouble
```
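For clarity, the expression can be wrapped in a small helper function. This is a minimal sketch (the name `scaleAndRoundDown` is mine, not from the question) showing the divide-then-truncate behaviour on the example value, assuming the raw Cassandra value arrives as a `Long`:

```scala
// Hypothetical helper: divides the raw value by 1,000,000 and
// truncates (rounds toward zero) to 1 decimal place.
def scaleAndRoundDown(number: Long): Double =
  BigDecimal(number.toDouble / 1000000)
    .setScale(1, BigDecimal.RoundingMode.DOWN)
    .toDouble

// Example from the question: 510999000 -> 510.999 -> 510.9
println(scaleAndRoundDown(510999000L))
```

Note that `RoundingMode.DOWN` truncates rather than rounds to nearest, so 510.999 becomes 510.9, not 511.0; in a Spark job this helper can simply be applied inside a `map` over the extracted rows before writing back.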
