Dear JavaScripters:
I need to do work with fixed-decimal quantities (mainly dollar
amounts but others as well). I need to be able to do reliable
arithmetic with them.
Mathematically, 0.1 + 0.1 + 0.1 equals 0.15 + 0.15, but not in
floating point. I need those sums to compare equal.
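Here is the kind of mismatch I mean; any JavaScript console shows it:

    // Both sums are 0.3 mathematically, but the binary doubles differ:
    console.log(0.1 + 0.1 + 0.1);                  // 0.30000000000000004
    console.log(0.15 + 0.15);                      // 0.3
    console.log(0.1 + 0.1 + 0.1 === 0.15 + 0.15);  // false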
So I am cheating. I am storing fixed-decimal amounts internally
as integers. When I need to output a value, I scale it, but any
arithmetic or comparison operations will be on the integers.
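As a sketch of the scheme (the formatDollars helper is just my own
illustration, not from any library):

    // Store dollar amounts as integer cents; all arithmetic stays integral.
    var priceA = 1015;             // $10.15 held as 1015 cents
    var priceB = 490;              // $4.90 held as 490 cents
    var total  = priceA + priceB;  // exact integer addition: 1505

    // Scale only when producing output.
    function formatDollars(cents) {
      return "$" + (cents / 100).toFixed(2);
    }
    console.log(formatDollars(total));  // "$15.05"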
My first thought was that I was safe for nine digits' worth,
because ECMAScript does a lot of 32-bit operations. That is on the
edge of what I need: the non-JavaScript system that I maintain now
generates nine-digit amounts in some of its reports.
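The nine-digit figure comes from the signed 32-bit range: 2^31 - 1 =
2,147,483,647, so every nine-digit integer fits. It is the bitwise
operators that actually truncate to 32 bits:

    // Bitwise operators convert their operands to 32-bit signed integers.
    console.log(2147483647 | 0);   // 2147483647  (2^31 - 1, intact)
    console.log(2147483648 | 0);   // -2147483648 (wrapped around)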
I did some experimenting, and it appears that I might be able to
get 15 digits of precision, but not 16. After determining this, I
referred to the ECMAScript standard (ECMA-262 5.1 Edition, June
2011) to see if this matched. That standard says numbers use the
IEEE 754 double-precision (binary64) floating-point format, and some
documentation on that format says it is good for 15.95 decimal digits
of precision (the significand holds 53 bits, and log10(2^53) is about
15.95).
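My experiment is easy to reproduce; the boundary is 2^53 =
9,007,199,254,740,992:

    // 15 digits: every integer below 10^15 < 2^53 is exactly representable.
    console.log(999999999999999 + 1);   // 1000000000000000 (exact)

    // 16 digits: past 2^53 adjacent doubles are 2 apart, so precision fails.
    console.log(9007199254740992 + 1);  // 9007199254740992 (the +1 is lost)
    console.log(9999999999999999);      // 10000000000000000 (rounded on parse)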
So far, so good.
But am I safe?
Can I count on exact arithmetic with integers of up to 15 digits
of precision?
If yes, can you please point to a reference? If no, please give
me a counterexample.
I would rather not have to mess around like this, but one of
JavaScript's nasty bits is that it has only one number type.
Sincerely,
Gene Wirchenko