
Yes, but Mo is the more technically correct term, where "byte = 8 bits" is merely de facto correct. There were (and likely still are) designs with bytes of 6, 7, 9, and other numbers of bits, whereas an octet is always 8 bits.

Then again, if we cared about being "technically correct" as much as we claim to when correcting other people, we'd use KiB/MiB/... instead of KB/MB/... when working in powers of 1024 rather than 1000!
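
To make the 1000-vs-1024 gap concrete, here's a minimal C sketch (names like kB/KiB are just illustrative labels, nothing standard):

    #include <stdio.h>

    int main(void) {
        long long kB  = 1000LL;      /* kilobyte, SI prefix (power of 1000)  */
        long long KiB = 1024LL;      /* kibibyte, IEC prefix (power of 1024) */
        long long MB  = kB * kB;     /* 1,000,000 bytes */
        long long MiB = KiB * KiB;   /* 1,048,576 bytes */
        printf("1 MB  = %lld bytes\n", MB);
        printf("1 MiB = %lld bytes\n", MiB);
        printf("gap   = %lld bytes (~%.1f%%)\n",
               MiB - MB, 100.0 * (double)(MiB - MB) / (double)MB);
        return 0;
    }

The gap grows with each prefix step (~2.4% at kilo, ~4.9% at mega, ~7.4% at giga), which is why disk-size marketing and OS file managers so often disagree.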

My understanding is that a byte is defined as "the smallest unit, other than a single bit, that a processing unit naturally deals with". Similarly, a "word" is the largest unit of data the instruction set natively operates on (i.e. the architecture's register size).
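
For what it's worth, standard C exposes both notions: CHAR_BIT in <limits.h> is the number of bits in a byte (guaranteed to be at least 8, not exactly 8), while sizeof over size_t or a pointer is only a rough proxy for the word/register size; treating it as "the word" is an assumption on my part, not a rule:

    #include <stdio.h>
    #include <limits.h>
    #include <stddef.h>

    int main(void) {
        /* bits in the platform's byte: 8 on mainstream hardware,
           but the standard only guarantees at least 8 */
        printf("CHAR_BIT       = %d\n", CHAR_BIT);
        /* rough stand-ins for the native word size */
        printf("sizeof(size_t) = %zu bytes\n", sizeof(size_t));
        printf("sizeof(void *) = %zu bytes\n", sizeof(void *));
        return 0;
    }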

Obviously these definitions are only general, and many exceptions "break" them: for instance, many CISC designs with 8-bit bytes have some instructions that work on data in 4-bit chunks (nibbles/nybbles), multiply instructions need an output twice the input width to be usefully precise, and so on. Also, "word" has in some places become synonymous with 32 bits rather than its more general definition, in much the same way "byte" became synonymous with 8 bits.
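
Both of those exceptions are easy to show in C (assuming the common 8-bit byte; the values here are arbitrary):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* nibbles: the 4-bit halves of a byte, via shift and mask */
        uint8_t byte = 0xAB;
        unsigned hi = (byte >> 4) & 0x0F;   /* 0xA */
        unsigned lo = byte & 0x0F;          /* 0xB */
        printf("nibbles of 0x%02X: 0x%X, 0x%X\n", (unsigned)byte, hi, lo);

        /* widening multiply: a 32x32-bit product needs 64 bits,
           otherwise the high half of the result is lost */
        uint32_t a = 0xFFFFFFFFu, b = 0xFFFFFFFFu;
        uint64_t product = (uint64_t)a * b;
        printf("0x%08X * 0x%08X = 0x%016llX\n",
               (unsigned)a, (unsigned)b, (unsigned long long)product);
        return 0;
    }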


