everyone keeps asking me why x86 uses an 8-bit byte, but I'm struggling to find an explanation that makes sense to me. Can any of you help?
what I've found so far:
- it looks like x86 evolved from the intel 8008 (from 1972), which was an 8-bit CPU
- the 8008 came after the 4004, which was 4-bit
some questions I have:
- was the reason to build an 8-bit CPU to increase the size of the instruction set? or something else?
- did x86 really directly evolve from the intel 8008?
would love any links!
@b0rk I don't have any useful links other than the one you probably looked at first:
https://en.wikipedia.org/wiki/Byte
I remember (but can't find) a list of alternative historical byte sizes. My favorite was the 13-bit "baker's byte".
Just-so stories I can't back up with sources:
A power of two is convenient as a byte size because of addressing: it takes exactly three bits to address the bits in an 8-bit byte, but it would take four (with six of the sixteen index values wasted) to address the bits in a 10-bit byte.
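A quick sketch of that arithmetic (my own illustration, not from any historical source):

```python
import math

# How many bits does it take to index every bit in a byte of width w,
# and how many index values go unused?
for width in (8, 10):
    index_bits = math.ceil(math.log2(width))
    wasted = 2**index_bits - width
    print(f"{width}-bit byte: {index_bits} index bits, "
          f"{wasted} of {2**index_bits} index values unused")

# 8-bit byte: 3 index bits, 0 of 8 index values unused
# 10-bit byte: 4 index bits, 6 of 16 index values unused
```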
If you want a character set that includes upper- and lower-case English letters, digits, and some punctuation marks, you're going to need at least 7 bits: letters and digits alone are 62 symbols, which nearly fills the 64 values of 6 bits. I believe the 8th bit was originally used for error detection (a parity bit). A rough sketch of both claims below.
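To make that concrete, here's a quick Python sketch. The punctuation count is my own assumption, just enough to show the arithmetic, and the parity function illustrates one common use of the spare 8th bit on 7-bit ASCII links:

```python
# Back-of-the-envelope character count: how many symbols force 7 bits?
letters = 26 + 26        # upper- and lower-case English
digits = 10
punctuation = 15         # assumed: a modest set of marks plus space
total = letters + digits + punctuation
print(f"{total} symbols; 2**6 = {2**6} is too small, 2**7 = {2**7} fits")
# -> 77 symbols; 2**6 = 64 is too small, 2**7 = 128 fits (ASCII's code space)

# The spare 8th bit as even parity, for simple error detection:
def with_even_parity(code7: int) -> int:
    """Set bit 7 so the full byte has an even number of 1 bits."""
    parity = bin(code7).count("1") % 2
    return code7 | (parity << 7)

print(hex(with_even_parity(ord("A"))))  # 'A' = 0x41, two 1 bits  -> 0x41
print(hex(with_even_parity(ord("C"))))  # 'C' = 0x43, three 1 bits -> 0xc3
```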