I need to test two simple numbers with two decimal places for equality. For example:
if (10 + 15.99) == 25.99:
    do some stuff...
The preceding expression should evaluate to TRUE, but Python evaluates it as FALSE, which is wrong.
Perhaps Python translates "25.99" into something like "25.98999999999999998" rather than exactly "25.99", and that is the reason for this error (I'm guessing...). If that's the case, how do I force Python to use only 2 decimal places, and not "make up" superfluous decimals? Or, if that's not the cause of the problem, how do I make Python see my expression as TRUE (as it "should" be)?
Cheers,
Vio
PS. If it's of any help, I'm using Python 2.3 (GCC 2.95.4 20011002
(Debian prerelease))