
C# decimal and double

From what I understand, decimal is used for precision and is recommended for monetary calculations. Double gives a better range but less precision, and is a lot faster than decimal.

What if I have a time and a rate? I feel like double is suited for time and decimal for rate, but I can't mix the two in a calculation without casting, which is yet another performance bottleneck. What's the best approach here? Just use decimal for both time and rate?
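For example (the numbers are made up), this is the kind of cast I mean:

    double hours = 7.5;         // time
    decimal rate = 42.50m;      // money per hour
    // decimal pay = hours * rate;        // won't compile: no implicit conversion
    decimal pay = (decimal)hours * rate;  // needs an explicit cast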


The rule of thumb is to use the type most suitable for the values you will handle. This means you should use DateTime or TimeSpan for time, unless you only care about a single unit (seconds, days, etc.), in which case any integer type will do. For time you usually need precision and don't want any rounding error, so I wouldn't use a floating-point type like float or double.
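As a minimal sketch of that idea (the billing scenario and numbers are my own): keep the time in a TimeSpan, convert to decimal once at the boundary, and do the money math entirely in decimal:

    using System;

    class Billing
    {
        static void Main()
        {
            TimeSpan worked = new TimeSpan(7, 30, 0); // 7 h 30 min, stored exactly
            decimal hourlyRate = 42.50m;

            // TotalHours is a double, so convert once, then stay in decimal.
            decimal pay = (decimal)worked.TotalHours * hourlyRate;

            Console.WriteLine(pay); // 318.750
        }
    }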

For anything related to money, of course you don't want any rounding error either, so you should really use decimal here.
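The classic example of the rounding error in question: double can't represent most base-10 fractions exactly, while decimal stores them exactly:

    using System;

    class Rounding
    {
        static void Main()
        {
            double d = 0.1 + 0.2;
            Console.WriteLine(d == 0.3);        // False
            Console.WriteLine(d.ToString("R")); // 0.30000000000000004

            decimal m = 0.1m + 0.2m;
            Console.WriteLine(m == 0.3m);       // True
        }
    }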

Finally, only if you have a very specific requirement for absolute speed, in a calculation that runs millions of times and for which decimal turns out not to be fast enough, would I consider a faster type. I would first try integer values (multiplying your value by a power of 10 if it has decimal places) and divide by that power of 10 only at the end. If that can't be done, only then would I think about using double. Don't optimize prematurely if you're not sure it's needed.
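For illustration, the scaled-integer idea could look like this (the cents scale and numbers are my assumption):

    using System;

    class ScaledIntegers
    {
        static void Main()
        {
            // Hold money as cents (the value times 10^2) and compute in long.
            long priceCents = 1999;   // 19.99 scaled by 100
            long quantity = 1_000_000;

            long totalCents = priceCents * quantity;

            // Divide by the power of 10 only once, at the end.
            decimal total = totalCents / 100m;

            Console.WriteLine(total); // 19990000
        }
    }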