Mikhail T. - 3 months ago
C# Question

C#: Double comparison precision loss. Accuracy loss when adding and subtracting doubles

Just started learning C#. I plan to use it for heavy math simulations, including numerical solving. The problem is that I get precision loss when adding and subtracting doubles, as well as when comparing them. The code and what it returns (in comments) is below:

using System;

namespace ex3
{
    class Program
    {
        static void Main(string[] args)
        {
            double x = 1e-20, foo = 4.0;

            Console.WriteLine(x + foo);          // prints 4
            Console.WriteLine(x - foo);          // prints -4
            Console.WriteLine((x + foo) == foo); // prints True BUT THIS IS FALSE!!!
        }
    }
}


Dear ladies and gentlemen, I would appreciate any help and clarification. Thank you in advance!

EDIT: What puzzles me is that (x + foo) == foo returns True. Thanks!

Answer

Take a look at the MSDN reference for double: https://msdn.microsoft.com/en-AU/library/678hzkk9.aspx

It states that a double has a precision of 15 to 16 digits.

But 1e-20 is about 20 orders of magnitude (decimal digits) smaller than 4.0. The mere act of adding 1e-20 to, or subtracting it from, 4.0 means the 1e-20 is lost, because it cannot be represented within the 15 to 16 significant digits that a double carries.
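To see roughly where the cutoff lies, here is a minimal sketch (not from the original question) that adds quantities of different magnitudes to 4.0. A value that still falls within the 15 to 16 significant digits survives the addition; a value 20 orders of magnitude smaller is rounded away:

using System;

class PrecisionDemo
{
    static void Main()
    {
        double foo = 4.0;

        // 1e-15 is still within double's 15-16 significant decimal digits
        // relative to 4.0, so the sum is a different value than 4.0:
        Console.WriteLine(foo + 1e-15 == foo); // False

        // 1e-20 is roughly 20 orders of magnitude smaller than 4.0,
        // far below its precision, so it is rounded away entirely:
        Console.WriteLine(foo + 1e-20 == foo); // True
    }
}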

So, as far as double is concerned, 4.0 + 1e-20 == 4.0 and 4.0 - 1e-20 == 4.0.
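For numerical work this means exact == comparisons on doubles are rarely what you want. A common workaround is to compare within a tolerance chosen for the problem at hand. The sketch below assumes a hypothetical NearlyEqual helper (it is not part of the .NET libraries):

using System;

class ToleranceComparison
{
    // Hypothetical helper: treats two doubles as equal when their difference
    // is small relative to the larger of their magnitudes.
    static bool NearlyEqual(double a, double b, double relativeTolerance = 1e-12)
    {
        double scale = Math.Max(Math.Abs(a), Math.Abs(b));
        return Math.Abs(a - b) <= relativeTolerance * scale;
    }

    static void Main()
    {
        double x = 1e-20, foo = 4.0;

        // x + foo and foo differ by far less than the tolerance, so they
        // are treated as equal, which is usually the intended behaviour:
        Console.WriteLine(NearlyEqual(x + foo, foo)); // True
        Console.WriteLine(NearlyEqual(4.0, 4.1));     // False
    }
}

The appropriate tolerance depends on the scale of the quantities in your simulation; a single hard-coded value will not suit every problem.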