
> some JSON parsers (including JS eval) parse it to 9223372036854776000 and continue on their merry way

This is correct behavior though...? Every number in JSON is implicitly a double-precision float. JSON doesn’t distinguish other number types.

If you want that big a string of digits in JSON, put it in a string.
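To make the rounding concrete, here's a quick sketch (runnable in Node or a browser console) of both the silent precision loss and the put-it-in-a-string workaround:

```javascript
// JSON.parse maps every number token to an IEEE 754 binary64 double.
// 2^63 - 1 is not representable, so it silently rounds to the nearest double:
const parsed = JSON.parse('{"id": 9223372036854775807}').id;
console.log(parsed);             // 9223372036854776000 (i.e. the double 2^63)
console.log(parsed === 2 ** 63); // true

// The workaround: ship big integers as strings and convert explicitly.
const asString = JSON.parse('{"id": "9223372036854775807"}').id;
console.log(BigInt(asString));   // 9223372036854775807n — exact
```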

Edit: let me make a more precise statement since several people seem to have a problem with the one above:

Every number that you send to a typical JavaScript JSON parser is implicitly a double-precision float, and it is correct behavior for a JavaScript JSON parser to treat a long string of digits as a double-precision float, even if that results in lost precision.

The JSON specification itself punts on the precise semantic meaning of numbers, leaving it up to producers and consumers of the JSON to coordinate their number interpretation.




Every number in JavaScript is a double-precision float; JSON itself does not specify.


Update: The IETF version is a bit more explicit. https://tools.ietf.org/html/rfc8259#section-6

> This specification allows implementations to set limits on the range and precision of numbers accepted. Since software that implements IEEE 754 binary64 (double precision) numbers [IEEE754] is generally available and widely used, good interoperability can be achieved by implementations that expect no more precision or range than these provide, in the sense that implementations will approximate JSON numbers within the expected precision. A JSON number such as 1E400 or 3.141592653589793238462643383279 may indicate potential interoperability problems, since it suggests that the software that created it expects receiving software to have greater capabilities for numeric magnitude and precision than is widely available.

> Note that when such software is used, numbers that are integers and are in the range [-(2^53)+1, (2^53)-1] are interoperable in the sense that implementations will agree exactly on their numeric values.
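That RFC range is exactly what JavaScript exposes as `Number.MIN_SAFE_INTEGER`/`Number.MAX_SAFE_INTEGER`. A quick demonstration of where interoperability ends:

```javascript
// The RFC's interoperable range [-(2^53)+1, (2^53)-1] is JavaScript's
// "safe integer" range:
console.log(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1); // true

// Inside the range, a double represents every integer exactly:
console.log(Number.isSafeInteger(JSON.parse('9007199254740991'))); // true

// One past it, distinct integers collapse onto the same double:
console.log(JSON.parse('9007199254740992') === JSON.parse('9007199254740993')); // true
```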

Or to paraphrase: if you pretend that every JSON number is a JavaScript number (double-precision float), you will generally be fine. If you don’t, you are responsible for the inevitable interoperability problems you’ll have with most current JSON parsers.


“JavaScript Object” is right there in the name.


Java is also in the name JavaScript, but we know how much that has to do with it.


Indeed, we've all read that history. And we all know how much JavaScript has to do with "JavaScript Object Notation", too, right? Basically everything?


Other than shared syntax, JSON is its own thing.


>Every number in JSON is implicitly a double-precision float

Is it? I was under the impression every number in JSON is implicitly arbitrary precision.


You can write an arbitrary precision number in there, say 4.203984572938457290834572098345787564e+20, but if the vast majority of JSON parsers interpret it as a double-precision float, there’s not much point.
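You can see the digit loss directly by round-tripping that literal through a typical parser (sketch below uses the standard `JSON.parse`/`JSON.stringify`):

```javascript
const original = '4.203984572938457290834572098345787564e+20';

// A double carries only ~17 significant decimal digits, so most of the
// mantissa above is discarded at parse time:
const n = JSON.parse(original);
const reserialized = JSON.stringify(n);
console.log(reserialized);                   // far fewer digits than we sent
console.log(reserialized === original);      // false: precision was lost
console.log(JSON.parse(reserialized) === n); // true: the double round-trips
```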

If you control both the producer and consumer of the serialized data, you can of course do whatever you like. But I would recommend people who want more extensive data types use something other than JSON.


> If you control both the producer and consumer of the serialized data, you can of course do whatever you like.

Don't forget all the unintended intermediate producers and consumers due to microservices or even otherwise well-written tools that convert to float64 internally.



