Hacker News

No matter how technically correct that may be, people just don't like doing work for a theoretical future benefit that may never materialize. Supporting 32-bit imposes a cost I have to pay right now, today, for the benefit of very few people (and a shrinking pool at that). Why not spend the effort on something more productive?

It is also likely that general-purpose CPUs will use 64-bit integer registers and address spaces for the rest of our lifetimes. Computing has already (for the most part) converged on a standard: von Neumann architecture, 8-bit bytes, two's-complement integers, IEEE 754 floating point, little-endian byte order. The number of oddballs has dramatically decreased over the past few decades, and the non-conforming CPUs manufactured each year are already a rounding error.
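Most of the converged defaults listed above can be probed directly from a running interpreter. A minimal sketch (these are informal spot checks on the host machine, not exhaustive conformance tests):

```python
import struct
import sys

# 8-bit bytes: struct sizes are defined in 8-bit units.
assert struct.calcsize('B') == 1  # one unsigned char == 1 byte

# Two's complement: packing -1 as a signed 32-bit int yields
# the all-ones bit pattern that two's complement implies.
assert struct.pack('<i', -1) == b'\xff\xff\xff\xff'

# IEEE 754 double: 1.0 has the well-known encoding 0x3FF0000000000000.
assert struct.pack('<d', 1.0) == b'\x00\x00\x00\x00\x00\x00\xf0\x3f'

# Byte order of this machine; most commodity hardware reports 'little'.
print(sys.byteorder)
```

All four checks pass on essentially any commodity desktop or server CPU sold today, which is the convergence the comment is describing.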

FWIW, if a transition to 128-bit is ever made, it will be the last. There aren't 2^128 atoms' worth of addressable matter in the visible universe.




Google says "There are between 10^78 to 10^82 atoms in the observable universe." Even the lower number is (just slightly) greater than 2^256.

(Imagine installing 2^256 sound cards in a system ...)
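The comparison above is easy to verify with arbitrary-precision integers; a quick sketch using the parent's low estimate of 10^78 atoms:

```python
# Even the low estimate of 10**78 atoms exceeds 2**256, the number of
# distinct values a 256-bit integer can hold, so a 256-bit address
# space could label every atom in the observable universe.
atoms_low = 10**78
addresses_256 = 2**256

print(atoms_low > addresses_256)             # True
print(round(atoms_low / addresses_256, 1))   # roughly 8.6x larger
```

So "just slightly greater" is right: the low estimate is within one order of magnitude of 2^256.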


That seems non-responsive, as none of those things mean that a pointer and a `long` have the same size. Did you intend to respond to someone else?
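The pointer-vs-`long` distinction is observable from Python via `ctypes`. A small illustrative check (the ABI names below are standard terminology, not something from this thread):

```python
import ctypes

# "64-bit" alone doesn't pin down C's `long`. On LP64 ABIs
# (Linux, macOS) both long and pointers are 8 bytes; on LLP64
# (64-bit Windows) long stays 4 bytes while pointers are 8.
print("sizeof(long):  ", ctypes.sizeof(ctypes.c_long))
print("sizeof(void*): ", ctypes.sizeof(ctypes.c_void_p))
```

On 64-bit Windows the two lines print different sizes, which is exactly why converging on 64-bit hardware doesn't make `long` and pointers interchangeable.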



