I write in JavaScript
a = 17770000000000019
and the value in a actually becomes 17770000000000020. Why is that?
JavaScript's (single) numeric primitive type is stored as an IEEE 754
64 bit double precision floating point number. Such a value is
versatile but still limited in what can be accommodated in its 64
bits. For example, the _contiguous_ range of integers that it can
represent exactly runs from -9007199254740992 to +9007199254740992
(plus or minus 2 to the power of 53). Not all integers outside of that
range can be represented, and those that cannot are approximated by
the nearest value that can be. Your number is outside that range, so
it is stored as the nearest available value: between 2 to the power of
53 and 2 to the power of 54 only even integers are representable, and
17770000000000019 falls exactly halfway between 17770000000000018 and
17770000000000020, with the tie resolved (round half to even) to
17770000000000020.
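You can see this in your own case (a minimal sketch; Number.MAX_SAFE_INTEGER
and Number.isSafeInteger assume a reasonably recent engine, and the variable
name a follows your question):
var a = 17770000000000019;       // nearest representable double is ...020
alert(a);                        // 17770000000000020
alert(Number.MAX_SAFE_INTEGER);  // 9007199254740991 (one less than 2 to the power of 53)
alert(Number.isSafeInteger(a));  // false - a lies outside the range where every integer is exact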
Note that a numeric literal entered as source text will produce a
number value that is an approximation of the number it denotes
whenever that number cannot be exactly represented as an IEEE 754
64 bit floating point value. Similarly, mathematical operations on
JavaScript number values that do exactly represent some number may
produce results that cannot be represented, and those results will
also be approximated. E.g.:-
alert(9007199254740992);     // 9007199254740992 (precise)
alert(9007199254740992 + 1); // 9007199254740992 (approximated result)
alert(9007199254740992 + 2); // 9007199254740994 (precise)
alert(9007199254740992 + 3); // 9007199254740996 (approximated result)
alert(9007199254740992 + 4); // 9007199254740996 (precise)
alert(9007199254740992 + 5); // 9007199254740996 (approximated result)
alert(9007199254740992 + 6); // 9007199254740998 (precise)
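The same roundings can be confirmed with comparisons rather than by
reading the output (a small sketch; the results follow directly from the
approximations shown above):
alert(9007199254740992 + 1 === 9007199254740992); // true - the addition rounds back down
alert(17770000000000019 === 17770000000000020);   // true - both literals parse to the same double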
Richard.