A Grandchild's Guide to Using Grandpa's Computer, a.k.a. "If Dr. Seuss were a Technical Writer", was written in 1994 and mentions microcode.
Microcode updates are always discussed when talking about microarchitectural security vulnerabilities (and other scary CPU errata like https://lkml.org/lkml/2023/3/8/976).
Microcode is always mentioned when discussing CPU design evolution.
It's funny that it's "always" mentioned, yet it's not familiar to me. Also curious the Wikipedia article for CPU design doesn't mention it, since it's "always" referenced.
Just because something is familiar to you, or even large swaths of a given population, doesn't mean everyone should be expected to know it.
I love learning new things. I love discovering topics I know nothing about, and I love picking the brains of those passionate about them. But the condescension from a certain type of tech nerd sucks all the fun out of learning. I've certainly been guilty of this in the past.
> It's funny that it's "always" mentioned, yet it's not familiar to me. Also curious the Wikipedia article for CPU design doesn't mention it, since it's "always" referenced.
You're not going to convince others that microcode is some kind of foreign concept to CPUs just because you yourself were unfamiliar with it.
Yes, it can be a downer to discover that you're less well-versed in a subject than you had previously thought.
>Also curious the Wikipedia article for CPU design doesn't mention it, since it's "always" referenced.
Microcode is something implemented by CPUs that are too big and expensive to replace -- it's not something fundamental to processor design. It's something we now live with to prevent things like the 'Pentium bug' from costing Intel many, many dollars in a forced consumer recall/replacement.
At this point in history I think that if someone wants to consider themselves to be well-versed or knowledgeable about consumer CPUs then learning about microcode is a hard requirement. It's a false metaphor now to consider a CPU to be an unchanging entity, and that's important to at least be aware of -- it's literally one of the only ways the behavior of a CPU can change after it leaves the factory.
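For the curious, you can see this for yourself: on Linux x86, the kernel reports the currently loaded microcode revision per logical CPU in /proc/cpuinfo. A minimal sketch in Python (assumes a Linux x86 machine; other OSes expose this differently):

    # Print the microcode revision the kernel reports for each logical CPU.
    # The "microcode" field in /proc/cpuinfo is a Linux x86 specific.
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith(("processor", "microcode")):
                print(line.rstrip())

If that revision changes after a kernel or BIOS update, that's the "unchanging" CPU being patched in the field.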
When did I say it's a foreign concept? I said it's not common knowledge for five-year-olds, and in reply someone stated it's "always" mentioned. I was simply demonstrating that it's not "always" mentioned.
> At this point in history I think that if someone wants to consider themselves to be well-versed or knowledgeable about consumer CPUs then learning about microcode is a hard requirement.
This statement strikes me as hyperbolic. For a CPU/hardware engineer, or even a security-conscious software engineer, sure. But I can't see why an ordinary consumer needs to care.