Python Question

Why does `0.4/2` equal `0.2` while `0.6/3` equals `0.19999999999999998` in Python?

I know these are floating point divisions, but why do the two expressions behave differently?

I did some more investigation, and the results confuse me even more:

>>> 0.9/3
0.3

>>> 1.2/3
0.39999999999999997

>>> 1.5/3
0.5


What's the logic that decides whether the result is printed with one decimal place or with more?

PS: I used Python 3.4 for the experiments above.

Answer

Because the exact values of the floating point results are slightly different.

>>> '%.56f' % 0.4
'0.40000000000000002220446049250313080847263336181640625000'
>>> '%.56f' % (0.4/2)
'0.20000000000000001110223024625156540423631668090820312500'
>>> '%.56f' % 0.6
'0.59999999999999997779553950749686919152736663818359375000'
>>> '%.56f' % (0.6/3)
'0.19999999999999998334665463062265189364552497863769531250'
>>> '%.56f' % 0.2
'0.20000000000000001110223024625156540423631668090820312500'
>>> (0.2 - 0.6/3) == 2.0**-55
True

As you can see, the result that is printed as "0.2" really is slightly closer to 0.2: `0.4/2` produces exactly the same value as the literal `0.2`, while `0.6/3` lands just below it. The comparison at the end shows the exact difference between the two results, one unit in the last place (2**-55). (In case you're curious, the representations above are the exact values; printing more digits beyond this would only add more zeroes.)
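
That also answers the second half of the question: Python 3 prints a float using the shortest decimal string that parses back to exactly the same value, so `0.4/2` shows up as `0.2` only because it *is* the same double as the literal `0.2`, while `0.6/3` needs 17 digits before a string will round-trip to it. A quick way to check this, and to see the exact stored values without a format string, is the standard `decimal` module (a small illustrative session, run under the same Python 3.4):

>>> from decimal import Decimal
>>> Decimal(0.4/2)          # exact value of the stored double
Decimal('0.200000000000000011102230246251565404236316680908203125')
>>> Decimal(0.6/3)
Decimal('0.1999999999999999833466546306226518936455249786376953125')
>>> float('0.2') == 0.4/2   # "0.2" round-trips to this result...
True
>>> float('0.2') == 0.6/3   # ...but not to this one,
False
>>> float('0.19999999999999998') == 0.6/3   # which needs 17 digits
True

So `0.4/2` happens to land exactly on the double closest to 0.2, while `0.6/3` lands one step (2**-55) below it, and that one-ulp difference is enough to change how the result prints.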
