
There's something the "old hackers" know that many CS students don't learn from the start (in the introductory classes), and that is actually how a computer operates. Don't spend a lot of time on this unless he's really interested, but it's a good idea to sit down with him for a few hours and teach him what really goes on inside a computer.

I'm talking about the concept of instructions, registers, addressable memory, etc. - with block diagrams, not with full circuits unless he's into that kind of stuff.

Then when he moves on to one of the high-level languages the other commenters are talking about, the concepts will naturally stick. E.g. a variable is stored in memory; when we add two variables in the HLL we are copying those values to registers, using an ADD instruction, and then writing the result back to memory. When there is an IF statement, the computer jumps to one block of code or the other depending on the result of a comparison. Pointers become intuitive. And so on.

I've explained this to laymen and people who are interested in computers and have gotten a pretty positive reaction from them - so it's worth a shot. For a programmer who generally focuses on HLL it also allows them to get some idea of how their code is executed in the real world under all that abstraction.



It depends whether his motivation is to build something or to learn stuff. Personally, when I was that age, I couldn't care less about learning anything unless it directly helped me achieve my immediate goals for whatever I was building. It turned out that was quite a lot. I learned about registers because I had to in order to optimize part of a game I was making. I wouldn't have sat down and studied some dry theoretical crap just for the sake of it. Block diagrams were a strong turn-off because you knew there would be no actionable information in them. Why do I care how this unit is connected to that unit? I just want to use the units! I'd naturally infer the structure of things once I was using them.


What formal CS education doesn't cover how computers work at a low level? It would seem like the opposite. Most self-taught programmers are probably missing foundational knowledge that doesn't directly apply to their day to day.


Every CS program of middling quality or above covers that material, but some may shove that coursework into mostly elective courses. Also, students can pass courses without really learning anything, so if it's just one or two courses over their academic career that really dive that deep, they can learn enough to pass and then brain-dump it.


Can confirm, I studied CS and don't feel like I really understand any of the low-level stuff.


You didn't have a course in machine structures? Including programming in assembly language?


I had one, and I got an A in it (one of my best CS classes, actually), but I don't remember much because I never applied it outside of the class.


I have met more CS graduates who have little clue what's going on beyond what they copy-pasted from Stack Overflow than graduates who do, both in the Indian subcontinent and right here in the US. Self-taught folks actually have the capacity to research and teach themselves what's going on under the hood.


Well, it was the fifth course in my department's sequence, so a lot of students wouldn't take it until junior year. By that time they were decently proficient at writing code.


That's what I learned in CS, along with many other topics.

There's no way in hell I would learn about registers or assembly languages if I was learning about programming in 2022 outside of a CS program.


You'd have a good chance of being exposed to it if you wanted to mess around with microcontrollers, like for robotics, sensors, automation, etc.


I highly recommend "The Secret Life of Programs" for this purpose.

It starts at the hardware level and works up - it definitely massively improved my understanding of everything that goes on in a machine.


It's a first-year CS assignment at my university to modify the VHDL source of a MIPS CPU to add an instruction, as well as to program in assembly. It's not that uncommon either.


This is actually quite interesting, as this seems like something that would be taught in a CE curriculum rather than in a first-year CS course. The people I know who studied CS may have had exposure to HDL and digital logic in the upper years if they, say, elected to take a computer architecture course, but their introductory courses mostly focused on programming and algorithms.

I guess it really depends on the school - I have an EE/CE background so I know these things, but the CS people I know mostly don't.


Non-programmer here. I've been interested in learning more about exactly this. Where should I start?


I would recommend the book "Inside the Machine" by Jon Stokes. I'm a programmer, so I'm not going in with fresh eyes necessarily, but it definitely felt like it began at the start, and the author was a tech journalist for Ars Technica (with a computer systems background) - I think it does a really good job. It also happens to be one of my favorite books, especially if you're interested in older hardware.


I'll check it out. Thank you!


When I was a kid I'd often spend an evening on a deep dive reading How Stuff Works (as Wikipedia didn't exist then).

https://computer.howstuffworks.com/microprocessor.htm


Thanks for this, looks like a solid introduction!


First find a small problem that you would like to solve, then search for a programming solution.



