Just as meaningful as a full course. This is just a "depth-first search" through the content, getting from top to bottom in one pass; you're complaining it's not a "breadth-first search" covering everything on one level. Done this way, you get the gist of how it all does, in fact, go from NAND gate to games - yes, a lot is glossed over or missed, but once the student sees the vertical structure he can see how each layer connects, and how expanded knowledge of each serves not just one layer but those above and below (and how to leverage each in sync).
I appreciate how my education ran from sand to Skyrim (so to speak), and I find it hard to see how anyone can really function in computing without such a vertical understanding.
This is actually closer to breadth-first (since the alternative is an 'in-depth' course) in my mind, but I get your meaning. The thing is, do you actually take anything away? If you don't talk about caching in the CPU, scheduling in the OS, or propagation delay in the gates, how does that help your understanding of how to write software?
I'd be curious to know a) how deep your education actually went (since you've implied it was broad, from logic gates up to OSes and high-level programming) and b) what you actually do day-to-day that makes you think this high-level overview is indispensable.
> The thing is, do you actually take anything away? If you don't talk about caching in the CPU, scheduling in the OS, or propagation delay in the gates, how does that help your understanding of how to write software?
What materials and courses structured like this excel at is very rapid demystification. They quickly allow the student to shed the "this layer is black magic" notion and give them a structure on which they can locate the limits of their own knowledge - they learn to know what they don't know. With this sort of foundation they are better equipped to teach themselves.
Materials and courses like this are not vocational, and don't pretend to be. They are very much the opposite.
We all start out not understanding how it's possible to make a CPU, write a compiler, communicate over wires, draw text into a framebuffer, and so on. These things seem like magic. This is a problem: as long as they seem like magic, we're deprived of engagement with them. You get a family of systematic errors:
* Magic is supposed to work. So you see people calling for functionality to be moved from whatever they're doing (their user-level code, say) into the magic: build something into the language, compile it to machine code instead of interpreting, do it in hardware, etc. Because of course if it's done by magic, it doesn't cost anything and it works perfectly!
* Magic is out of your control. So if it breaks, there's nothing you can do. If your operating system is crashing your program, or downloading updates you don't want, you're out of luck.
* Magic is easy. So the people who make the magic happen don't get the credit.
* Magic is memorized, not understood. So you need to memorize the incantations that squeeze performance from your database/OS/CPU/whatever instead of being able to derive them yourself.
You don't need to understand how to use Karnaugh maps to understand that putting more multipliers on your chip is going to cost you real estate. You don't need to understand the different possible scheduling policies to understand that making your program multithreaded will slow it down, not speed it up, unless you have more than one core. Even a shallow understanding is sufficient to be very useful, and to enable you to question things.
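To make the single-core point concrete, here's a minimal sketch (Linux-specific, all constants arbitrary) that pins itself to one core with sched_setaffinity and times the same CPU-bound job run on one thread versus split across four - the threads just take turns on the core, and pay extra for creation and context switches:

```c
/* Minimal sketch: a CPU-bound job run sequentially vs. split across
   threads while the whole process is pinned to a single core.
   Linux-specific; build with: gcc -O2 -pthread demo.c */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>
#include <time.h>

#define ITERS (1L << 26)   /* arbitrary amount of work */
#define NTHREADS 4

static volatile unsigned long sink; /* keeps results live; benign race in a demo */

static void *spin(void *arg) {
    long n = (long)arg;
    unsigned long x = 1;
    /* LCG steps: data-dependent work the compiler can't fold away */
    for (long i = 0; i < n; i++)
        x = x * 6364136223846793005UL + 1442695040888963407UL;
    sink += x;
    return NULL;
}

static double now(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    cpu_set_t one;
    CPU_ZERO(&one);
    CPU_SET(0, &one);
    sched_setaffinity(0, sizeof one, &one);    /* pretend we have one core */

    double t0 = now();
    spin((void *)ITERS);                       /* all work on this thread */
    double seq = now() - t0;

    pthread_t tid[NTHREADS];
    t0 = now();
    for (int i = 0; i < NTHREADS; i++)
        pthread_create(&tid[i], NULL, spin, (void *)(ITERS / NTHREADS));
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(tid[i], NULL);
    double par = now() - t0;

    printf("sequential: %.3fs   %d threads on one core: %.3fs\n",
           seq, NTHREADS, par);
    return 0;
}
```

On a multi-core box the threaded run wins instead - which is exactly the point: the answer lives a layer below your code.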
It helps immensely in knowing WHY such things are useful, how they can improve (or screw up!) another layer, and where the correct solution should be implemented.
There's an old joke that the difference between computer science and computer engineering is that in the former one assumes infinite speed and infinite storage. Understanding that there are limitations, and why they exist and to what degree, is important.
As already noted, it demystifies the surrounding "magic". There's a confidence and freedom which comes from knowing that nothing in the system is beyond you.
My education indeed went from "sand to Skyrim": from basic physics & chemistry to electrochemistry to discrete electronics to quantum mechanics to semiconductor doping to hand-layout of integrated circuits to automated layout of ICs (writing the automators, that is) to hardware languages (acronym escapes me) to logic to gate theory to basic CPU design to machine language to assembler to compiler design to C/APL/Pascal/Prolog/Lisp/C++ to OS design to discrete math to graph theory to raster graphics to 3D graphics, and a bunch of other stuff throughout. It's indispensable because I can look at any problem and grok what's happening all the way down to silicon: able to work with someone writing Windows printer drivers one day and proving a linked crossover bug in the USB driver IC the next, discussing circuit design in between; to see why an elegant recursive solution causes a "drive full" error under certain conditions, or why error handling in a certain protocol is pointless (it's already handled six layers down the network stack) - to name just a few real cases.
Knowing propagation delay in the gates can explain (even reveal) the limits of scheduling in the OS. Understanding drive rotation speeds provided the breakthrough of on-the-fly compression as an OS-level storage acceleration technique.
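A back-of-envelope on that second point, with every number invented for illustration: the platters can only hand you so many bytes per second, so if the CPU can decompress faster than the disk can read, a 2:1 compression ratio nearly doubles effective read throughput.

```c
/* Back-of-envelope: compressed reads pipeline the disk and the CPU,
   so the slower stage sets the effective rate. All numbers assumed. */
#include <stdio.h>

int main(void) {
    double disk_mbps   = 5.0;   /* raw platter throughput (illustrative) */
    double decomp_mbps = 40.0;  /* CPU decompression rate (illustrative) */
    double ratio       = 2.0;   /* compression ratio (illustrative)      */

    /* Each physical MB read expands to `ratio` logical MB, so the disk
       stage delivers disk_mbps * ratio logical MB/s; the CPU stage
       caps the pipeline at decomp_mbps. */
    double disk_stage = disk_mbps * ratio;
    double effective  = disk_stage < decomp_mbps ? disk_stage : decomp_mbps;

    printf("uncompressed read: %4.1f MB/s\n", disk_mbps);
    printf("compressed read:   %4.1f MB/s effective\n", effective);
    return 0;
}
```

You only see that trade if you know both layers: the mechanical limits of the drive and the spare cycles in the CPU.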
Take anything away? Just a sensible understanding of how everything works, and the ability to drill into detail where and when needed. All learned in about 6 years, and I even came out understanding why Aristophanes' plays have survived for two and a half millennia (to wit: dirty jokes endure).
What do I do day to day (now)? Writing an iPad app for mobile enterprise data, working under a genius who crafts the many layers of abstraction that make it fast & flexible. He can describe (and has) a new way to represent very high-level data, hand me a rough description of a virtual machine to process it efficiently, and I'll instantly see how it runs on the server hardware. I can't imagine not having this view. As a part-time teacher, I'm trying to get students from zero to binary to writing object-oriented games in 12 weeks flat; to do less is to deprive them of the joy and rewards of knowing how things work - at every level.
"A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects."
— Robert Heinlein, Time Enough for Love