> some JSON parsers (include JS.eval) parse it to 9223372036854776000 and continue on their merry way
This is correct behavior though...? Every number in JSON is implicitly a double-precision float. JSON doesn’t distinguish other number types.
If you want that big a string of digits in JSON, put it in a string.
Edit: let me make a more precise statement since several people seem to have a problem with the one above:
Every number that you send to a typical JavaScript JSON parser is implicitly a double-precision float, and it is correct behavior for a JavaScript JSON parser to treat a long string of digits as a double-precision float, even if that results in lost precision.
The JSON specification itself punts on the precise semantic meaning of numbers, leaving it up to producers and consumers of the JSON to coordinate their number interpretation.
> This specification allows implementations to set limits on the range
> and precision of numbers accepted. Since software that implements
> IEEE 754 binary64 (double precision) numbers [IEEE754] is generally
> available and widely used, good interoperability can be achieved by
> implementations that expect no more precision or range than these
> provide, in the sense that implementations will approximate JSON
> numbers within the expected precision. A JSON number such as 1E400
> or 3.141592653589793238462643383279 may indicate potential
> interoperability problems, since it suggests that the software that
> created it expects receiving software to have greater capabilities
> for numeric magnitude and precision than is widely available.

> Note that when such software is used, numbers that are integers and
> are in the range [-(2^53)+1, (2^53)-1] are interoperable in the
> sense that implementations will agree exactly on their numeric
> values.
Or to paraphrase: if you pretend that every JSON number is a JavaScript number (a double-precision float), you will generally be fine. If you don’t, you are responsible for the inevitable interoperability problems you’ll have with most current JSON parsers.
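That [-(2^53)+1, (2^53)-1] window from the spec is exactly what JavaScript exposes as Number.MAX_SAFE_INTEGER. A quick check shows where exact integer arithmetic stops:

```javascript
// Integers up to 2^53 - 1 round-trip exactly through a double.
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1); // true
console.log(Number.isSafeInteger(2 ** 53 - 1));       // true

// One past the limit, adjacent integers collapse to the same double.
console.log(2 ** 53 === 2 ** 53 + 1);                 // true
console.log(Number.isSafeInteger(2 ** 53 + 1));       // false
```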
You can write an arbitrary precision number in there, say 4.203984572938457290834572098345787564e+20, but if the vast majority of JSON parsers interpret it as a double-precision float, there’s not much point.
If you control both the producer and consumer of the serialized data, you can of course do whatever you like. But I would recommend people who want more extensive data types use something other than JSON.
> If you control both the producer and consumer of the serialized data, you can of course do whatever you like.
Don't forget all the unintended intermediate producers and consumers due to microservices or even otherwise well-written tools that convert to float64 internally.