
That result happens for the same reason that 1 / 3 * 3 != 1 if you compute with three decimal digits: 1 / 3 = 0.333, and 0.333 * 3 = 0.999, which is not 1.000.
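
A minimal sketch of that rounding in Python (using the decimal module with three-digit precision is my choice of illustration, not part of the original comment):

    from decimal import Decimal, getcontext

    getcontext().prec = 3              # three significant digits, matching 0.333
    third = Decimal(1) / Decimal(3)    # Decimal('0.333')
    print(third * 3)                   # Decimal('0.999'), not 1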

0.1 is the same as 1 / 10, which does not have a finite representation in binary notation, just as 1 / 3 does not have a finite representation in binary or decimal notation.
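
A quick sketch of what that looks like for binary doubles (Python shown here, but any IEEE 754 language behaves the same; the 0.1 + 0.2 comparison is my choice of example):

    print(f"{0.1:.20f}")      # 0.10000000000000000555 -- the stored value isn't exactly 1/10
    print(0.1 + 0.2)          # 0.30000000000000004
    print(0.1 + 0.2 == 0.3)   # False, because the rounding errors don't cancel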


