
I have to admit, I was not familiar with this limitation. Thanks for pointing it out.

And I'd go from "kind of horrible" to just "horrible".

Integer bit-width limits are an implementation detail that every language has a way to overcome - they shouldn't be baked into an encoding spec.
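For concreteness, a minimal sketch of the failure mode in a JS engine, where every JSON number is parsed as an IEEE 754 double:

  // 2^53 + 1 cannot be represented exactly as a double,
  // so it silently rounds during parsing.
  const raw = '{"id": 9007199254740993}';
  const parsed = JSON.parse(raw);
  console.log(parsed.id);               // 9007199254740992 (off by one)
  console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991, i.e. 2^53 - 1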




Again, not saying this is a good thing, but it's an understandable consequence of the standardization partially involving "let's document how the browsers' existing JS implementations deal with running eval on a string". It's easier to write the standard so that existing implementations fit it, rather than requiring changes from everyone.


I imagine it depends on your aims. If your aim is just to document existing behavior, that's one thing. But if your aim is to create a new, simple, data interchange format, documenting existing implementations is only going to limit its usefulness. The spec could have forced the existing implementations to be fixed instead of letting them set the bar.

As pointed out upstream, we're dealing with a lot of data these days, and such limitations will only cause JSON to become marginalized or encumbered with implementation-dependent workarounds, like encoding integers as strings (sketched below).
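The string workaround, roughly (the field name "id" here is just illustrative):

  // Producers emit 64-bit values as strings; consumers convert
  // explicitly, e.g. with BigInt, to keep full precision.
  const wire = '{"id": "9007199254740993"}';
  const id = BigInt(JSON.parse(wire).id); // exactly 9007199254740993n
  console.log(id + 1n);                   // 9007199254740994n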


The point is that JSON started out as "data you can eval() in JavaScript", pretty much. It gained the traction it did because it is literally a subset of JavaScript object notation, so it was trivial to support.
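A sketch of what that looked like in early clients (unsafe for untrusted input, which is part of why JSON.parse later replaced it):

  // Wrap in parentheses so the braces parse as an object literal
  // rather than a block statement.
  const body = '{"name": "example", "count": 3}';
  const data = eval('(' + body + ')');
  console.log(data.count); // 3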


By the way, is this where the phrase "stringly typed" originated, or are there earlier instances?



