Mayank Patel - 11 months ago

SQL Question

I have 4 records in an SQL table, which has two columns, hours1 and hours2:

- Record1 : hours1- 1.00,hours2- 1.00
- Record2 : hours1- 9.00,hours2- 5.30
- Record3 : hours1- 9.00,hours2- 9.00
- Record4 : hours1- 3.00,hours2- 3.00

So I get the sums of these columns as:

`SUM of hours1 : 22.00`

`SUM of hours2 : 18.30`

These values are decimals, but I treat them as times (hours.minutes).

Now I want to subtract the hours2 sum from the hours1 sum, i.e.

`22.00 - 18.30`

So I will get

`3.70`

I want the final answer to be 3.30. (Justification for the requirement: when I add this result back to 18.30, I get 18.30 + 3.30 = 21.60, which equals 22.00 hours since 60 minutes = 1 hour, so the total comes back to 22.00.)
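The subtraction above can be sketched directly by working in whole minutes (a minimal illustration of the desired arithmetic, not the final solution; the `ToMinutes` helper is made up for this sketch):

```
// Treat 22.00 and 18.30 as hours.minutes, subtract in whole
// minutes, then convert the difference back to hh.mm notation.
decimal a = 22.00m, b = 18.30m;

int ToMinutes(decimal v)
{
    var h = (int)decimal.Truncate(v);  // whole hours
    var m = (int)((v - h) * 100);      // .30 means 30 minutes
    return h * 60 + m;
}

int diff = ToMinutes(a) - ToMinutes(b);          // 1320 - 1110 = 210 minutes
decimal result = diff / 60 + (diff % 60) / 100m; // 3 hours + .30
Console.WriteLine(result.ToString("0.00"));      // 3.30
```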

Can anyone help me get

`3.30`

Answer Source

As everyone said in the comments: conversion is needed, and it must be done before summing takes place.

Here is how:

```
var times1 = new decimal[] { 1.00m, 9.00m, 9.00m, 3.00m };
var times2 = new decimal[] { 1.00m, 5.30m, 9.00m, 3.00m };

// Convert each hh.mm value to actual hours first, then sum.
var hours1 = ActualHours(times1).Sum(); // Sum() requires: using System.Linq;
var hours2 = ActualHours(times2).Sum();
var diff = hours1 - hours2;

var result = "Actual: hours1 = " + hours1 +
             ", hours2 = " + hours2 +
             ", diff = " + diff + "\r\n" +
             "Printed: hours1 = " + Print(hours1) +
             ", hours2 = " + Print(hours2) +
             ", diff = " + Print(diff);
```

Methods used:

```
// Converts hh.mm notation to actual decimal hours,
// e.g. 5.30 (5 h 30 min) becomes 5.5.
public IEnumerable<decimal> ActualHours(params decimal[] values)
{
    foreach (var v in values)
    {
        var t = Math.Truncate(v);        // whole hours
        yield return t + (v - t) / 0.6m; // minutes -> fraction of an hour
    }
}

// Converts actual decimal hours back to hh.mm notation,
// e.g. 3.5 becomes "3.30".
public string Print(decimal v)
{
    var t = Math.Truncate(v);
    return (t + (v - t) * 0.6m).ToString("0.00");
}
```

Output:

```
Actual: hours1 = 22.0, hours2 = 18.5, diff = 3.5
Printed: hours1 = 22.00, hours2 = 18.30, diff = 3.30
```