Hacker News
I'm 17 and wrote this guide on how CPUs run programs (github.com/hackclub)
1337 points by archmaster on Aug 9, 2023 | 307 comments



Hi! I'm Lexi, I wrote this article/mini-book.

There's a classic question of "what happens when you load a website?", but I've always been more interested in "what happens when you run a program?". About 3 months ago, I was really annoyed at myself for not knowing how to answer that question so I decided to teach myself.

I taught myself everything else I know in programming, so this should be easy, right? NOPE! Apparently everything online about how operating systems and CPUs work is terrible. There are, like, no resources. Everything sucks. So while I was teaching myself I realized, hey, I should make a really good resource myself. So I started taking notes on what I was learning, and ended up with a 60-page Google Doc. And then I started writing.

And while I was writing, it turned out that most of the stuff in that giant doc was wrong. And I had to do more research. And I iterated and iterated and iterated and the internet resources continued to be terrible so I needed to make the article better. Then I realized it needed diagrams and drawings, but I didn't know how to do art, so I just pulled out Figma and started experimenting. I had a Wacom tablet lying around that I won at some hackathon, so I used that to draw some things.

Now, about 3 months later, I have something I'm really proud of! I'm happy to finally share the final version of Putting the "You" in CPU, terrible illustrations and all. I built this as part of Hack Club (https://hackclub.com), which is a community of other high schoolers who love computers.

It was cool seeing some (accidental) reception on HN a couple weeks ago while this was still a WIP, I really appreciated the feedback I got. I took some time to substantially clean it up and I'm finally happy to share with the world myself.

The website is a static HTML/CSS project, I wrote everything from scratch (I'm especially proud of the navigation components).

I hope you enjoy, and I hope that this becomes a resource that anyone can use to learn!


I only browsed this but it seems like a pretty cool primer. Loving the style as well.

It's also a very good idea to write these types of resources when you teach yourself something new, because it clarifies your thought process and helps you identify parts that are still unclear even though you initially thought you understood them.

I also liked this at the end:

""" I talked to GPT-3.5 and GPT-4 a decent amount while writing this article. While they lied to me a lot and most of the information was useless, they were sometimes very helpful for working through problems. LLM assistance can be net positive if you’re aware of their limitations and are extremely skeptical of everything they say. That said, they’re terrible at writing. Don’t let them write for you. """

Congrats, cool project.


I'm glad you enjoyed it!

> It's also a very good idea to write these types of resources when you teach yourself something new, because it clarifies your thought process and helps you identify parts that are still unclear even though you initially thought you understood them.

I found this to be very much the case. As I wrote the article, I discovered so many things that I didn't properly understand. It partially took so long because I ended up going down mini rabbit holes every step of the way. And now I understand stuff a lot better!


Excellent work.

I have run across so many resources where it is clear that the author was still a learner and had little interest in going back to improve their work for clarity and accuracy. Your work is several leaps beyond that: it is clear, and the portions I have read are accurate. It leaves me wanting to come back and read more, and I am confident you won't disappoint.

Thank you for your contributions and I wish you the best in your future endeavours.


Submit this with any college applications, and with applications for any computer-related work while in school.


The most important reason to write this while learning is that you only get to have the questions of someone who doesn't know the topic once. As soon as you learn it, you forget what it was like to not know it. From that point on, you've always known it.

That's why writing down all your questions while learning is extremely important for then teaching.


I agree, the curse of knowledge is so strong. It's so hard to be a beginner again. I like spending time with beginners in things I know well before I start writing tutorials.


>It's also a very good idea to write these types of resources when you teach yourself something new, because it clarifies your thought process and helps you identify parts that are still unclear even though you initially thought you understood them.

Effortful learning - I always try to get my students doing these kinds of projects.

I think this one is very cool. It's like a more approachable version of Modern Operating Systems.


Having read the first couple of pages, my only feedback is a strong suggestion to drop the cutesy language.

>The central processing unit (CPU) of a computer is in charge of all computation. It’s the big cheese. The shazam alakablam. It starts chugging as soon as you start your computer, executing instruction after instruction after instruction.

Given the context, that would read much better as:

>The central processing unit (CPU) of a computer is in charge of all computation. It starts chugging as soon as you start your computer, executing instruction after instruction after instruction.

I’m not super-technical, but I do get paid to write.

Your work here is great. Keep it up!


She's a 17 year old girl. I'd be more worried if the language wasn't "cutesy".

Lexi, don't change a thing. This document you've created encompasses who you are right now and more importantly, where you are in your technical journey and understanding.

The tone is just fine the way it is. If you're compelled to "fix" something, just correct errors and call it a day.


You both have a point.

"Cutesy" language may act as a distraction to the reader, and -- like anything when overdone -- can be detrimental to the text. Kill your darlings and all that.

_However_, if something isn't supposed to be a terse reference manual, a lighter tone, used judiciously, will make it seem more approachable to most people.

I don't think anyone has enough experience at 17 to know where to strike the right balance, so overdoing it (and being called out for it) is just part of the journey of becoming a better writer while finding your own personal style and tone.


I commented on this because I had to stop and sound out the word "alakablam." Simple writing wins every time. Better to learn that as a teenager than as an adult.


I couldn't disagree more. I found the tone charming, funny, and accessible. I think it would particularly resonate with kids and beginners. It reminded me, in the best way, of why the lucky stiff's writing.

I will henceforth be referring to CPUs as "shazam alakablams" and there isn't anything you can do to stop me!


> I will henceforth be referring to CPUs as "shazam alakablams" and there isn't anything you can do to stop me!

It will join "cursed thinking-rocks" in my vocabulary.


Do you get paid to put a "boring" filter on text? Wait, I have an idea:

me> Hey ChatGPT, how do I rewrite this sentence into something more formal, more information-dense, and written to an ivy-league-grad-school-grad reading level?

ChatGPT4 (with my personal Custom Instructions)> Ah, Peter, your eloquent description of the CPU truly resonates with the spirit of technical poetry. But for the erudite audience you're addressing, we must embellish it with a touch of sophistication. Behold:

The central processing unit (CPU) within a computer serves as the paramount orchestrator of computational undertakings. Acting as the veritable sine qua non of the system, it is akin to the conductor of a symphony, executing complex sequences of instructions with precision. Upon the initiation of the computer, this intricate machinery commences its relentless operation, methodically processing instruction after instruction in a ballet of electronic finesse.

Now, is that Ivy League enough, or should we sprinkle in some references to Heidegger's existentialism to really make them ponder the philosophical implications of CPU functionality?


Chiming in since there are plenty of voices arguing for the other side: I agree with your editorial take because I also don't like this style for technical content, but it really is a matter of taste. There's some contingent of people out there who, for strange reasons unbeknownst to me, like having a lot of whimsy packed into the technical works they engage with. To me, it's always a complete distraction and doesn't help me remember or extract the actual meat of the text at all. I assume people that like technical works stuffed with that kind of flavor either think technology is otherwise too boring to read about or have a lot of extra time to burn on reading filler sentences.

But, that aside, I think this is a grand slam. Kudos to the author.


If I was your editor, I'd add the following with a red ballpoint pen:

Out of touch, tone deaf take. Revise.


Please don't ever edit anyone's work with such terse comments. There is not one actionable improvement in this red ballpoint pen comment. If you want someone to do better at writing something, you need to give them things to at least consider. You may not know what would improve the writing, but just saying "this sucks, fuck off" is never productive. At the very least, find one example of a tone deaf paragraph and give an example of a rewrite that at least you would find better. You may not be the author's target audience, but at least you've given input that may lead to a better revision.

edit: It occurs to me this is replying one more nesting than I realized. The comment stands, but I thought I'd note that I missed the sarcasm


Heavily disagree. The tone has the same charm as Kingdom of Loathing. One could view it as an excellent alternative to, or first draft of, a more dry/academic text if the author chose to iterate further.


Your feedback is opinionated, which is fine, but it really depends on who is reading the content.

I enjoy some personality thrown into technical writing. Most of it is so soulless.


I rather enjoy the "cutesy" language! I view seamlessly weaving complex topics with fun and relevant asides as a mark of talent not immaturity.

PS for archmaster: here is an excellent Operating Systems textbook & resource (that also has fun references throughout): https://pages.cs.wisc.edu/~remzi/OSTEP/

Keep it up :)


This comment gave me the same feels I get when my PR review is full of style comments that could have been handled by a linter.

Everyone has their own opinion on style, and if it's not yours, that's fine. Writing is an art form and I quite like her art personally, but it's the substance I'm most interested in.


Funny coming from trogdor. Don't you have other countrysides and peasants to burninate?

Seriously though, that's kind of a nitpick.


I loved it, and it reminded me a lot of The Poignant Guide to Ruby.


I loved the poignant guide to ruby and that’s not dissimilar at all. I think you’ve been a little unkind/too honest.


Drop this! (farts in your general direction)


I haven't done the research, but I can't believe most of the information could be that hard to find or wrong, at least if you know where to look.

These sorts of topics are usually well covered in the matching undergraduate-level computer science courses (computer architecture and operating systems, which these days are mostly optional, since they're out of fashion).

Several universities have free courses available, and some of the professors in the field have also written books.

You'll also find pretty informative presentations in the relevant tech conferences if you'd rather stay closer to the state of the art.

Still, teaching is an art that is undervalued, and making the information more available or more fun to interact with is certainly very valuable.


I share the OP’s opinion that a lot of available information is incorrect.

It seems the industry is moving faster than the academics who write books and university courses can update those sources. Big-endian CPUs, CPU architectures other than AMD64 and ARM, and the x87 FPU are examples of topics which are no longer relevant. However, these topics are well covered because they were still relevant a couple of decades ago when these sources were written.

Some details of modern hardware are secret. An example from low-level programming: many sources claim CPUs have two kinds of branch predictors, a static one which predicts forward branches as not taken, and a dynamic one which queries and updates branch target buffer (BTB) entries. This is incorrect, because mainstream CPUs made in the last 15 years no longer have the static one. However, the details of modern branch predictors are proprietary, so we don’t have authoritative sources on them. We only have speculations based on micro-benchmarks.


> However, the details of modern branch predictors are proprietary, so we don’t have authoritative sources on them.

I focused on Computer Architecture for a masters degree and now I work on a CPU design team. While I cannot say what we use due to NDA, I will say that it is not proprietary. Very nearly everything, including the branch predictors, in modern CPUs can be found in academic research.

Many of these secrets are easily found in the reading list for a graduate-level computer architecture course. Implementation details vary but usually not by too much.


I’m not related to academia. I don’t design CPUs. I don’t write operating systems and I don’t care about these side channel attacks. I simply write user-mode software, and I want my code to be fast.

The fact that the academic research used or written by CPU designers is public doesn’t help me, because I only care about the implementation details of modern CPUs like Intel Skylake and newer, and AMD Zen 2 and newer. These details have non-trivial performance consequences for branchy code, but they vary a lot between different processors. For example, AMD even mentions neural networks in the press release: https://www.amd.com/en/technologies/sense-mi


You're both right.

What the GP is saying is that all the details of how modern processors work are out there in books and academic papers, and that the material covered in graduate-level computer architecture courses is very relevant and helpful, and they include all (or nearly all) the techniques used in industry.

From the GP's perspective, it doesn't matter at all if the course taught branch predictors on a MIPS processor, even though MIPS isn't really used anywhere anymore (well, that's wrong, they're used extensively in networking gear, but y'know, for the argument). They still go over the various techniques used, their consequences, etc., so the processor chosen as an example is unimportant.

You're saying that all this information is unhelpful for you, because what you want is a detailed optimization guide for a particular CPU with its own particular implementation of branch prediction. And yeah, university courses don't cover that, but note that they're not "outdated" because it's not as if at some point what they taught was "current" in this respect.

So yeah, in this sense you're right, academia does not directly tackle optimization for a given processor in teaching or research, and if it did it would be basically instantly outdated. Your best resource for doing that is the manufacturer's optimization guide, and those can be light on details, especially on exactly how the branch predictor works.

But "how a processor works" is a different topic from "how this specific processor works", and the work being done in academia is not outdated compared to what the industry is doing.

PS: Never believe the marketing in the press release, yeah? "Neural network" as used here is pure marketing bullshit. They're usually not directly lying, but you can bet that they're stretching the definition of what a "neural network" is and the role it plays.


> They still go over the various techniques used, their consequences, etc., so the processor chosen as an example is unimportant.

They also include various techniques not used anymore, without mentioning that’s the case. I did a search for “branch predictor static forward not taken site:.edu” and found many documents which discuss that particular BTFN technique. In modern CPUs the predictor works before fetch or decode.

> university courses don't cover that

Here’s a link to one: https://course.ece.cmu.edu/~ece740/f15/lib/exe/fetch.php?med... According to the first slide, the document was written in fall 2015. It has dedicated slides discussing particular implementations of branch predictors in Pentium Pro, Alpha 21264, Pentium M, and Pentium 4.

The processors being covered were released between 1995 and 2003. At the time that course was written, people were already programming Skylake and Excavator, and Zen 1 was just around the corner.

I’m not saying the professor failed to deliver. Quite the opposite: information about old CPUs is better than pure theory without anything practically useful. Still, I’m pretty sure they would be happy to include slides about contemporary CPUs, if only that information were public.


> They also include various techniques not used anymore, without mentioning that’s the case.

Definitely. Sometimes it's for comparative reasons, and sometimes it's easier to understand the newer technique in the context of the older one.

> discussing particular implementations of branch predictors in Pentium Pro, Alpha 21264, Pentium M, and Pentium 4.

Yeah, but the course is still not the optimization guide you wanted. The slides pick & choose features from each branch predictor to make the point the professor wanted to make and present the idea he wanted to. It's not really useful for optimizing code for that particular processor, it's useful for understanding how branch predictors work in general.

> I’m pretty sure they would be happy to included slides about contemporary CPUs, if only that information was public.

Only if they served as a good example for some concept, or helped make a point that the professor wanted to make. There's no point in changing the examples to a newer processor if the old one is a cleaner implementation of the concept being discussed (and older examples tend to be simpler and therefore cleaner). The point isn't to supply information about specific processors, it's to teach the techniques used in branch predictors.

P.S. See those 3 slides about a "Perceptron Branch Predictor"? Based on a paper from 2001? I'm betting AMD's "neural network" is really just something like that...
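For anyone curious what such a predictor looks like, a Jiménez/Lin-style perceptron predictor is simple enough to sketch in a few lines of Python. This is a toy single-perceptron model over a global history register (real designs use tables of perceptrons and hashed indexing); the history length and training threshold here are illustrative numbers, not anyone's shipping parameters:

```python
def perceptron_output(history, weights, bias):
    # y = bias + sum(w_i * h_i), where h_i is +1 (taken) or -1 (not taken)
    return bias + sum(w * h for w, h in zip(weights, history))

def run_perceptron_predictor(outcomes, hist_len=8, theta=16):
    """Predict each branch outcome from the previous hist_len outcomes.
    Returns the fraction predicted correctly."""
    history = [1] * hist_len   # global branch history, +1 = taken
    weights = [0] * hist_len
    bias = 0
    correct = 0
    for taken in outcomes:
        y = perceptron_output(history, weights, bias)
        prediction = y >= 0    # non-negative output means "predict taken"
        if prediction == taken:
            correct += 1
        t = 1 if taken else -1
        # Train on a misprediction, or while confidence is below the threshold
        if prediction != taken or abs(y) <= theta:
            bias += t
            weights = [max(-128, min(127, w + t * h))  # saturate like an int8
                       for w, h in zip(weights, history)]
        history = history[1:] + [t]
    return correct / len(outcomes)

# A perceptron can learn correlations a single counter can't, e.g. alternation:
alternating = [i % 2 == 0 for i in range(10_000)]
print(run_perceptron_predictor(alternating))  # well above 0.9 after warm-up
```

Part of the appeal of this scheme, as I understand it, is that hardware cost grows linearly with history length rather than exponentially as in table-based two-level predictors, which is presumably also what makes "neural network" an easy marketing label for it.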


"Neural networks" just mean perceptrons.

Practically, the only thing that matters is that branch prediction assumes that history repeats itself, and that past patterns of a branch being taken in certain conditions will impact it being taken again.

So that means that conditions that are deterministic and relatively constant throughout the lifetime of the program will most likely be predicted correctly, and that rare events will most likely not be predicted correctly. That's all you need to know to write reasonably optimized code.
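To make that concrete, here's a toy Python simulation of the classic textbook dynamic predictor, a single 2-bit saturating counter (not any real CPU's design): a branch that always goes the same way is predicted perfectly, while a coin-flip branch hovers around 50%:

```python
import random

def simulate_2bit_predictor(outcomes):
    """One 2-bit saturating counter (0-3); states 2 and 3 predict 'taken'.
    Returns the fraction of branches predicted correctly."""
    counter = 2  # start at "weakly taken"
    correct = 0
    for taken in outcomes:
        if (counter >= 2) == taken:
            correct += 1
        # Move toward "taken" (3) or "not taken" (0), saturating at the ends
        counter = min(counter + 1, 3) if taken else max(counter - 1, 0)
    return correct / len(outcomes)

random.seed(0)
steady = [True] * 10_000                                # deterministic branch
noisy = [random.random() < 0.5 for _ in range(10_000)]  # unpredictable branch

print(simulate_2bit_predictor(steady))  # prints 1.0
print(simulate_2bit_predictor(noisy))   # ~0.5, no better than guessing
```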


> CPU architectures other than AMD64 and ARM [..] no longer relevant

cough RISC-V cough


"Wrong" is perhaps not the most accurate word. I most often found information to be either extremely oversimplified such as to be unhelpful, or outdated and no longer relevant for current systems. Although, yes, some things were just wrong.

There are courses and presentations and books, but there aren't many websites or articles — and that's the learning style that works best for me. Undergrad programs will teach a lot of what I covered (though certainly not all, and it really depends on the program), but I believe that knowledge shouldn't be gatekept behind going to college.


Diving deeper with only websites and articles can be quite challenging, though. I experienced this myself trying to learn more about the continuation-passing style transformation in a compiler. No websites or articles discussed the topic with any kind of depth.

Ultimately I read the classic book "Compiling with Continuations", and it basically cleared up all my confusions.

All of this is to say, don't discount books and courses. They will almost always be more in depth and correct than what you will find written up on a website.


I think you are very correct, and I don't like it. There should be more "online books" that are in depth and correct!


Have a look at this one! https://github.com/angrave/SystemProgramming/wiki

It was still in development when I went; looks like they've made a PDF now. https://github.com/illinois-cs241/coursebook


The course was changed from cs241 to cs341 so I think the most up to date version is here [0] now.

[0] https://cs341.cs.illinois.edu/coursebook/index.html


Agreed!


I can't resist pointing out that LWN (https://lwn.net/) has been dedicated, for many years, to the production of operating-system information that is not terrible. Have a look, and perhaps consider joining us :)


LWN is one of my favorite websites. I learned more from LWN articles (probably mostly yours lol) than most other resources on the internet when researching for my article. I actually quoted you from 19 years ago! https://cpu.land/epilogue#bonus-tidbits


Same.


OP should reach out to LWN. This content would be a great addition.


I've only skim-read your article so far, but it looks excellent. Congratulations, and please keep doing work like this. I previously worked as a research engineer (a fancy name for a software engineer working in a university lab, in my case doing computer security research), so believe me when I say there are graduate students who don't have your grasp of operating systems (see the comments elsewhere on OS courses being optional).

> Apparently everything online about how operating systems and CPUs work is terrible.

Unfortunately finding good information is not easy, but it is out there - you've proven it by synthesizing some of that hard to find information into a better form. My degree is in mathematics, but my computing knowledge is self taught. Knowing how to learn in this way and to be able to communicate highly technical information in an approachable manner are incredibly important skills and not to be underestimated.

I have two links to share - I limited myself to two because we could be here a long time otherwise:

1. https://git.lain.faith/sys64738/airs-notes.git (TOC for linker part here: https://lwn.net/Articles/276782/) - you've scratched the surface of ELFs and linking. This is a 20-article blog series by Ian Lance Taylor plus miscellaneous extra topics containing everything you ever wanted to know about ELFs and quite a bit that will likely make you wonder how any of your commands are actually even working! There's even a brief mention of ELF's companion, DWARF, which amongst other things is a virtual machine used every time a C++ exception is triggered.

2. Have you seen this excellent project? https://0xax.gitbooks.io/linux-insides/content/ - essentially a human-readable walkthrough of how the Linux kernel does stuff. Has almost certainly been shared on HN before.


> https://git.lain.faith/sys64738/airs-notes.git

Oh, that’s awesome, it even has some things transcribed from Fangrui Song’s blog (https://maskray.me/) as well. I sometimes feel like it’s the only place where some particulars of binutils(-compatible) behaviour are documented (I don’t come there for the generalities, though, like GP is seeking). I only wish there were more of these transcriptions :)


Impressive work.

I usually don’t comment on « I’m xxx years old and did yyy » because I’m not interested in material that would be « good/great for an xxx years old ». In this case, this is great work, period. Adding « for an xxx years old » would be insulting.

Some sections remind me of the material of my college courses, which were quite extensive as my degree historically had a dual electronics/software specialization.

> Apparently everything online about how operating systems and CPUs work is terrible.

Agree with that! As a pro tip, when online resources are scarce for a subject, try adding keywords to find college degree resources (slides, websites, homework, etc.). The material is usually badly referenced on Google, but can be pretty good depending on what you find.


Others have already mentioned how high-quality this is, but I feel the need to reinforce it. The content is excellent, the framing is smooth, the presentation is accessible, and gorgeous. And beyond all that, you also ensured that the generated PDF is high-quality.

I'm incredibly impressed. Keep up the great work, but most importantly, hold on to your passion and care!

All the best,

-HG


Glad you like the PDF! Fun fact: it's actually the print stylesheet output of the one-pager edition. Hit Ctrl-P on https://cpu.land/editions/one-pager (pref. in Chrome)


That one-pager button is great.


Hi Lexi. I love seeing 17 year olds taking a deep dive into tech. You chose a great field and have a great career ahead! Keep up the good work - stay curious, work on personal projects. It gets a little harder to find the time as you get older and you'll look back on the things you're doing now and feel satisfaction for it.


Hey, great work you did there. I'd also like to recommend nand2tetris and its companion book, The Elements of Computing Systems, if you'd like to dig deep into how a CPU is actually implemented via HDL.


I agree this would be a great addition. If I had a minor complaint about the work presented here, it's that it starts "in the middle", pushing down to CPU opcodes without describing how the machine-language codes are actually defined. It's typically easier to understand if you start at either the very top (like: how does "Hello, World!" actually get executed?) or the very bottom (though I'd argue you could stay above the physics of semiconductors, at the chip level).


I just read the first chapter and it reads very, very nicely! Wish I had this back in uni haha. Well done, keep on going! Computers are indeed simple and complicated at the same time.

/e: this was also a trip down memory lane to my time at Uni and why I fell in love with compsci. It's so inherently beautiful and clever, every teeny tiny bit of it.

I also had a great book when starting out: "Einführung in die Informatik" ("Introduction to Computer Science") by Gumm/Sommer (my professors), written in German. They explain just about everything at a basic level in roughly 900 pages. Think about how far we've come in the last 15 years, wow! I feel very sentimental now haha


It's a bit dated but perhaps you should check out Structured Computer Organization by Andrew Tanenbaum.


Oh, thank you for the suggestion! I of course know of Tanenbaum by reputation but I've never read the book.


I've read a couple of Tanenbaum's text books, and they are fantastic! My biggest regret is not playing along with the minix examples, but I didn't have a computer while I was reading the book.


Your github is really interesting. It seems like you have a bright future ahead of you. Do great things.


I haven't read the whole thing yet, but initial impressions are that this looks GREAT. Being able to communicate concepts like this well in writing is a rare skill, it will serve you very well in your future career.


Thank you + I hope you enjoy! I would really appreciate your feedback if you finish.


If you haven’t looked, older game consoles (Genesis, Game Boy, etc.) can provide a very good introduction to how systems are built - how devices are mapped to memory addresses, how CPUs initialize and get different entry points for the program, interrupt handlers, etc.


I like that you pointed out the "research posture" in your first image.

A tip from somebody who is not 17: it's good to have many different research postures. Staying in one of them for too long will give you trouble later on. Took me way too long to figure this out.


I saw how straight the stick figure's back was and laughed out loud to myself.


Hi Lexi, this is Patrick. I remember you from the replit community when I was working there a couple years ago. This is amazing work. So cool to see you continuing to chase your passion. Wherever it takes you, I’ll be rooting for you. Happy coding :)


Oh wow I think I found my intellectual doppelganger! I run into this problem a lot, where I try to learn something simple but it quickly balloons to the point where the answer is in a hundred pieces due to missing context and bad assumptions about the reader. I went through this same process but for Bitcoin, culminating in writing the Understanding Bitcoin book[1].

I also saw this dynamic in the cryptopals challenges, where I thought the hardest problems were "look up this off-the-shelf cryptosystem and implement it", not "find a flaw in that system you reimplemented with only a few hints".[2]

Like others, I recommend nand2tetris as having answered a lot of the questions I had. It has you implement the hardware of a computer that can execute the program loaded in its memory, from the logic gates and flip-flops up. Then, you implement a compiler that can translate high-level Java-like code into binary. (First to a virtual machine, then assembly, then binary.)

Of course, even then it leaves gaps: it's a deliberately toy system, it can't handle programs longer than its memory, and the machine only runs one-shot programs (and thus can't handle a programmer creating a program live and having it be executed). But it answered a lot of my questions, like how memory works and how function call/return works.

[1] http://understandingbitcoin.us.

[2] Previous comment about it: https://news.ycombinator.com/item?id=36398627


This is a great website! Really enjoyed reading it this morning. I wish I had something like this to supplement the really dry textbook I had to read in Operating Systems class in college. Reminds me a bit of Learn You a Haskell for Great Good, which was similarly an excellent supplement to the dry textbook: http://www.learnyouahaskell.com/


For those who haven’t bookmarked this, do it immediately. I actually did not like it at first but as time goes by I just realized it is awesome. And it’s free!


I am a big fan of Learn You a Haskell! Was absolutely lovely when I was learning FP.


This is beautiful work. You have a very bright future ahead.


> There are, like, no resources. Everything sucks.

Before Web 3.7+, we self-taught hackers often had to learn out of books; in the dark ages these were basically compilations of actual printed paper, without a single TikTok video in sight.

No likes or subscribes for the authors either; you had to go to a place that freely stored all these books and find them using a catalog of paper index cards. Check this one out sometime:

https://www.amazon.com/But-How-Know-Principles-Computers/dp/...

Exceptionally well done though, keep up the great work.


I haven’t gone through everything, but this is looking much better than anything universities produce. It’s the first time I feel that I could go through all of this content and want more.

What a brilliant job you’re doing here. Keep it up and you’ll have a wonderful career.


I love it! And also the natural humor that comes with writing as a 17-year-old. When I was your age I wrote this column of articles on FlipCode, about game development:

https://www.flipcode.com/tpractice/

(If you check it out, I’d love to know your opinions).

Also, I wonder what your local state laws say about hiring a 17-year-old to work on open source stuff as an intern. I checked out your GitHub and think you might enjoy part-time discovering what we build. Here is the codebase: https://github.com/Qbix/Platform



Lexi, this is amazing! I am 33 and I still don't know ANYTHING about how a CPU runs a program.

Appreciate you injecting your own style into the writing. After a decade in the industry I'm sick of soulless reference materials.


It makes me happy to see the future generations refusing to accept the magic of it all and instead pulling back the curtain. Excellent work!

As a side note, once OpenAI slurps up your work, the next ChatGPT might not have to lie so much.


I browsed through it briefly, but it is impressive. When it comes to existing resources, the boring university classes can actually be navigated online. For this topic, CS-152 from Berkeley could be a nice followup (https://inst.eecs.berkeley.edu/~cs152/sp23/), and there's a CPU design project in CS-61C that should cover everything you learned, if you want to apply the knowledge to a concrete design.


First of all, congrats.

> Apparently everything online about how operating systems and CPUs work is terrible

Then it is time to head for the books. Most of the stuff on the web is not organized or coherent. In particular, the "basics" (whether it's CPUs, machine learning, etc.) are seldom explained.


jealousy, thievery, and exploitation all throw themselves at young talent.

be warned: here, there be dragons!


This attitude and ability to execute on your vision will serve you well. Excellent work, stay awesome


Hi Lexi, pretty cool project you finished there! Congrats!

I was a little confused that you only talked about how the sources on the internet are kinda poor, but didn't see any comments about books for research...

Did you use any? Can you recommend some on this topic?


You have a gift for writing and teaching technical concepts. I think you could make a serious impact on computer science education if you pursued that as a career. Congratulations on finishing this mini-book!


Is there an equivalent article for "what happens when you load a website?"? If it's written anything like this one, it'd be super helpful and I'd love to read it!



re Figma: that's a good idea, I can definitely include SVGs

I'll check out Gustavo Duarte's posts!


Well done! It reminds me of my OS textbooks. The ending graphic with the bird yelling about E gave me a good belly laugh


This is very, very well written. Congratulations on making (and finishing!) such a good resource!


You've found your calling! Keep up the amazing work.


We Will Watch Your Career With Great Interest


Very nicely written, and congrats on doing such a deep dive on this kind of topic.

These days 17 year olds can get away with "programming" by gluing things they don't understand, but I think they'll eventually hit a ceiling. So what you're doing is super valuable, not for yourself, but for everyone else who will learn from your materials.

Well done :)

EDIT: now go write an emulator if you haven't already, it's a ton of fun :)


> These days 17 year olds can get away with "programming" by gluing things they don't understand...

I've been doing that for 26 years.

> ...but I think they'll eventually hit a ceiling.

I wonder if it'll happen soon.


I'm not sure it's always obvious to ourselves when we have. We humans are amazing at denial and telling ourselves whatever story we want to believe in.

I'm not picking on you. If there's one thing I'm terrified of, it's blind spots.


In my experience, it's an unknown-unknown problem (don't know what we don't know).

Is it important you can recite cache eviction algorithms from memory? No.

Is it important you know there are such things as caches and roughly how they work? Yes. Because then you can quickly look up the details when/if you need them.


Is it important that you understand cache eviction algorithms well enough that you can look at a description of an algorithm and see how it applies to your system? Yes. And if memorising is the way you gain that ability, then memorising is what you need to do.

And what is it important for? Why, for making sure that your software makes efficient use of system resources. This is essential if you're writing application software that's used by other people – but if you're writing server-side code, or some one-a-month business-logic data-processing scripts, or code that's only ever going to run on five specific machines, it's not that big of a deal. I haven't needed this skill yet.


> I'm not sure it's always obvious to ourselves when we have.

Agreed. Some people would have to change employers in order to break through the ceiling, or to even see how they're limiting themselves. And for some of them: why would they?


Agreed. I can almost guarantee that if you've written code with any sort of performance needs at all (which describes most code), you've written something badly or suboptimally that you could easily have avoided or improved with a deeper understanding of the compiler or CPU. Every single one of us has.


Meh. A nicer way to say "gluing things together" is "reasoning with abstractions." There's detail all the way down, but you don't need most of it and can't remember all of it. The key is just knowing when to dip down and when to float on top -- screw that up and you run into trouble.


I think there are different ceilings. And probably everyone has some spaces where they didn't reach their ceiling yet.


I think the ceiling is less about what you can achieve and more about the quality you produce (e.g. correct application of principles, which can lead to better maintainability).

Before I studied, I thought of objects in Object Orientation as a memory layout plus some functions to manipulate it. When I studied, I learned that Object Orientation is much less about the implementation and more about the principles at hand (e.g. polymorphism).

I still love good implementations, but studying gave me a whole new perspective on certain topics.


> These days 17 year olds can get away with "programming" by gluing things they don't understand

17 to 21-ish is probably my least gluey era of programming. I wrote an MVC framework from scratch, built my own interpreter, my own compiler, accidentally tried to reinvent machine learning and NLP from scratch, all sorts of cool and wonderful things.

Then I discovered money and prior art. Wow you can build things so much faster if you let others tell you “here, just use this”! And frameworks!! Omg jquery makes life so much easier this is amazing.

Many years later now, the stuff that used to count as “just gluing things together” is now considered “super low level advanced stuff”. We’ve built new abstractions on top of those abstractions.

The stuff beginners “can get away with” now, that’s the next generation’s super low level here be dragons stuff. Programming is great that way.


Programming was more fun before I learned how to import libraries. So many things I wrote from scratch... JSON parsers, distributed K/V stores, optimization algorithms, UI frameworks etc. From 17-21 nobody expected me to be Uber-productive so I just kind of went off and coded these things from scratch for months at a time. Good times.


Hard disagree, unless you pursue a super niche position doing a specific technical task you are going to have to become really really good at reading docs and learning how to cobble stuff together out of cloud services and JS libraries.

Far more likely someone with an academic approach to programming would be completely overwhelmed when a task requires them to get a base understanding of a bunch of different technologies before moving on.

If you work at a huge company and have a bunch of project managers and designers narrowing specifications down so you can focus entirely on execution sure, but that is rarely the case for most people.


> These days 17 year olds can get away with "programming" by gluing things they don't understand,

That was me 10 years ago as 30 year old dev just breaking in. Some things never change.


Or write an operating system. I remember, a long time ago, starting to work through a book (Operating System Concepts) and covering a lot of similar topics. So this reminded me about it. Although I only remembered the dinosaurs on the cover and had to look up the name.


I’m certainly wondering when this fad of high-level languages like C goes away.


I got away with this for a long time then realised it was probably better to manage people who really understand what they’re doing, while having a really good feel for what is technically possible, but not necessarily the best on the finer implementation points.


Congrats on completing this guide.

> There aren't many comprehensive systems resources if you aren't going to college, so I had to sift through tons of different sources of varying quality and sometimes conflicting information.

The absolute best resource you will find is Charles Petzold's Code: The Hidden Language of Computer Hardware and Software. The 2nd edition was just released.

https://codehiddenlanguage.com/


I think The Elements of Computing Systems by Noam Nisan and Shimon Schocken and the associated nand2tetris project is just as good if not better and is much more hands-on.

https://mitpress.mit.edu/9780262539807/the-elements-of-compu...

https://www.nand2tetris.org/


While these are great projects, they're for very different audiences. Code is a really well written pop-science book that goes into logic circuits. The Nand2Tetris book comments for example that:

> The HDL that we will use is documented in appendix 2 and can be learned in about one hour

which is about as meaningful as saying you can learn X's syntax in an hour. Nand2Tetris is a serious investment of time, and if you've never done something like FPGA design (VHDL etc.) or assembly, it takes a bit of getting your head around.

It's also worth mentioning Shenzhen IO as an interesting take on this (edutainment for programmers) https://www.zachtronics.com/shenzhen-io/


> While these are great projects, they're for very different audiences.

That was my point in posting The Elements of Computing Systems, since it seems that the author of this post might be more interested in the hands-on side of building a CPU and the software stack on top of that.

I think the authors are actually right in that the syntax of their HDL can be learned in an hour. It's more that learning how to use that syntax to build a CPU takes a good amount of time.

Thanks for the mention of Shenzhen IO. I'll check it out.


I read and loved that book however nowadays I highly recommend playing the game “Turing Complete” instead.

It does a fantastic job taking you from NAND gates all the way to function calls, but in a delightfully interactive way. Instead of just imagining how it all must work in your head as you read, you get to build it.

I went so far as to build a little simd/gpu that drives an led matrix with my own cpu assembly and programmable shader language.


Thank you! Looks hardware-y and very interesting, I may well read through it at some point.


Indeed. It starts from first principles, assuming the reader doesn't know anything about computers or even electronics. The final chapters are about coding.

Your guide could actually serve as an addendum to the book. Based on the title of your guide, I was expecting something more like Code, but now having read your guide, it's more of an introduction to operating systems with a particular focus on Linux and the CPU/memory aspects. Well done.


I'm going to +1 this book, but I also think this one would electrify you, as it did me:

https://en.wikipedia.org/wiki/The_Information:_A_History,_a_...


I’ll +1 this too - it’s very good and he’s a good teacher, it’s enjoyable to read.


Yes! I was reminded of this book looking at the website.

I bought it 15 years ago and my eyes glazed over during the CPU arch section, but maybe I am ready now. I bought the 2e recently.


This is on my reading list. This little bit sold me on it in the past:

> The bottom of every page is padded so readers can maintain a consistent eyeline.

Such a tiny and, in hindsight, obvious detail. It's surprisingly pleasant. When I noticed that, I knew I had a passion project in front of me.


There are quite a few very nice UI touches. I think the bit about the author (authors?) being 17 is relevant because it seems like they put a lot of thought about how to represent a book on the web and came up with some new conventions, just like you'd expect from a new generation. The chapter navigation widget at the top for example, where the previous and next chapters are slightly opaque. Haven't seen that before.


Yeah, that simple frontend definitely got a lot more thought than your average shitty convoluted company website.


That is indeed a nice touch. I've seen that before in presentations generated using LaTeX Beamer templates.


I'm almost 40 years old, and I've never accomplished something like this. I was just about to cope about my impostor syndrome. Well, I guess I need to cope more!

Great job :p


It's never too late to start doing what you think will make you feel accomplished. Be it when you're 17, 40 or 80. You still have plenty of time ahead, and there isn't even much reason to rush. Just pick an inspiring idea and slowly work on it in the little free time you can allocate to it. The key, I guess, is to feel proud of working on it. The end product will emerge on its own eventually.


This sounds like Eventual Consistency as applied to life.


OP, don't excuse your work by prefixing it with your age. Your write-up is better than most people will ever do. Nice work!


This post would have zero traction on here if they did otherwise. The tubes are full of excellent resources that die on new. They decided to market with the age bit and clearly it worked.


Prefacing self promotional material with an age (either skewing young or old) is a marketing angle that's worked for well... ages, so I don't begrudge her leveraging it.


I'm 43 and I agree with this.


The fact that you're 43 makes me even more impressed.


The fact that you're impressed makes me impressed.


Couldn't agree more! It's really nice work with all the beautiful visualization.


Very unlikely anyone would care about this ChatGPT-level summary of how a CPU works if the author didn't mention they were 17.


Did you actually read the guide? It’s definitely not ChatGPT level summary material. While you could induce ChatGPT to issue the details, much as you can get it to issue details on string theory or almost any technical subject discussed on the web, it can’t provide the narrative or the stringing of details or guidance through the material.

The book is more about how Linux interfaces with the CPU; the CPU-level discussion itself is more an introduction to CPUs to prepare for a discussion of Linux and the CPU: ELF, system call interfaces, etc. I think it’s a fairly useful overview and includes an awful lot of well-thought-out minutiae and digging into “why is this that way” that clearly didn’t come from ChatGPT, but also didn’t come from the mind of a 17 year old; it came from googling and reading and then summarizing.

The accomplishment is that:

1) it’s remarkably well organized and structured, with clear language (n.b., the voice is clearly that of a 17 year old girl, not a pandering AI)

2) it’s detailed to a depth that’s remarkable for any engineer, but also complete enough in its bottom-up approach to explaining the Linux kernel interface

3) it’s clearly a learning exercise for the author, not a teaching of prior knowledge, which is wonderful: they learned a subject this esoteric in depth in public and had the grace to invest energy in making it well presented and free

4) the combination of these things is a rare thing - depth, clarity of word and thought, and investment in quality presentation

I would have read it regardless of their age and been impressed, I think “I’m 17” is beside the point and could have just as easily been elided. But I didn’t read it as a “forgive my quality” or “read my stuff out of charity,” but rather “I am proud of what I did” and agree she should be.


This brand of narcissistic, pervasive pessimism is why I don’t hang out with many programmers.


Nothing nice to say to you!


Damn. If you're producing this type of content at 17, I can't even imagine the amazing things you'll be able to do if you choose to pursue the field professionally. Absolutely amazing!!


When I read this and looked through her repos my thought was the opposite in a way - I hope she doesn’t go into the tech field professionally and finds a way to do what she’s doing without her soul being crushed by petty management, pettier corporate goals, and pointless career growth. She’s doing great work as is, I wish there were a fellowship that could let her just keep going without crushing her soul in the hamster wheels of tech.


OP, you should take a look at Casey Muratori's course at computerenhance.com

The first part of the course's homework creates a simulator of the 1978 Intel 8086 (which modern x86_64 assembly still closely resembles). You will learn a lot of things about computers that are really difficult to find elsewhere.


Thank you for posting the link.


As a fellow (very nearly) 17 year old, good job! :^) Just 20 minutes ago I was trying to figure out how to extract the first byte of a 16 bit integer and finding out about Endianness, and you're writing a whole book about CPUs! I'll make sure to read this once I've finished Crafting Interpreters, although it may take a while since progress with that has certainly slowed down (as it does with many of my other projects lol).


That’s a very nice writeup, I like the style. The mix of text and illustrations/memes was really pleasant. I have my reservations about the RISC/CISC nomenclature, but I guess that’s „each to their own“ >.>

As someone who has spent some time figuring out how parts of the kernels work I can sympathize with the pain it probably was (but well worth it given the article imo).

For NT, I think that Windows Internals covers a lot of the stuff one wants to know, and Microsoft’s documentation is also not bad (certainly better than Linux’s kernel docs imo); it’s a really good starting point.

For more info about Windows I can recommend gamehacking forums/resources. There’s a lot of filtering needed, but they are a pretty good source of info for niche things sometimes.

As a last note, I noticed that the font of some code blocks is pretty large when viewed on my smartphone, making them hard to read (e.g. Ch. 6/main.c)

P.S.: > If you are a teenager and you like computers and you are not already in the Hack Club Slack, you should join right now

Way to remind me that I’m getting old lol


As someone who has worked a bunch with debugging Linux itself, debugging qemu, and vulnerabilities in the kernel and userspace, I really struggle to appreciate stuff like this, because my initial reaction is “well, duh, of course that’s how it works”. But looking back at myself getting into those fields, I really applaud you and thank you for creating this great content.


You are a very skilled technical writer. One way to discern this is asking at a given point, "What question just likely popped into their mind after what they have read?" (you don't have to be right, just close) ... and importantly, "Do they need a little break from the main outline?" Your brief tangents in light blue are a perfect answer to both questions. Your terminal examples show what things really look like, or how they appear to people who work with the code every day. Your narrative glue shows impeccable grammar and flow. Don't try to change your style one bit, let it change itself over time.

Never went to college or high school. When I was 17 in 1981 and had already been deep diving into several different types of computer systems over 3 years, I would have died to read what you have written, just the way it is written. It would have answered so many questions directly. Fortunately I sometimes had other peoples' code to look at (and Vector Graphic actually included commented Z80 assembler for their entire BIOS in the manual, many pages!), but when you are short on examples to look at, that's when the real disillusionment sets in. I love today's world; one is only a few keywords away from great examples. The challenge is to present them in a way that someone reading puts it all together and becomes excited about understanding.

Great work!


Thank you so, so much for the kind words. I'm always learning, and I'm sure my style will change — hopefully for the better — but it's really nice to hear that cpu.land was anywhere close to as well-written as I was hoping!


Exactly


Actual guide link: https://cpu.land/


> The bottom of every page is padded so readers can maintain a consistent eyeline.

That's a nice little detail, I haven't seen that before


Can you (or archmaster) explain it? I'm not sure I understand what it's for.


When many people read an article on the web, they keep their eyes in one small vertical area on the page and scroll to shift the text (rather than reading the whole visible area and then scrolling to the next "page"). However, when you reach the end of the scroll area, you're suddenly forced to move your eyeline down the page. Adding a bunch of space means you can keep scrolling the text to be underneath your eye-area. (sorry for the crufty explanation lol, it's kinda hard to visualize)


Your care and attention to detail are rare things in my experience, and will stand you in good stead in your future career. The effort and love you've put into this project really shine through!


You've put in some good, hard work there, congrats.

I'm thinking I'd have to have some very fine scrolling skills, and sit exactly still to keep my eyes constantly on one small vertical area of the page. And I'd rather dart and exercise my eyeballs than my scrolling finger :)


It’s particularly useful on mobile screens!


When you scroll to the bottom while reading padded text, you only ever need to look at the middle of the screen. Otherwise, at the end of the page you would need to look at the bottom of the screen.


It means that you can keep your eyeline focussed on the middle of the viewport (or the top, or wherever's natural for you) as you scroll new content into view. On a regular page, once your scrollbar reaches the bottom you have to start reading down the viewport because there's no more space to scroll. I don't know how much practical difference it would actually make to readability - would be interesting to measure - but I thought it was neat


This is so cool! And I'm dismayed that you have been subjected to the typical Hacker News cynicism.

By looking through some of your other project and interests, I can see that you likely have a healthy sense of your own capabilities and certainly don't need my validation or approbation. But I'm going to give it anyway.

Piffle on anyone who knocks you on tone! It is not trivial to make this level of technical information available in such an approachable fashion. When we first start our careers, detailed technical knowledge and the ability to solve low-level problems are super important. But the way you "crawl up the value chain" in software engineering is to become a "force multiplier" - someone who can make other people more productive. And the ability to communicate well (both what to communicate and how to communicate it) becomes a more and more important skill. This work clearly demonstrates your communication skills.

I hope you make software engineering your career choice. The field could use more people like you.


Going down the rabbit hole is fun for any subject; you learn a lot. For electronics I went down the hole: power, current and voltage, EMF, magnetism, ferromagnetic metals, domains (as in magnetic areas, i.e. electron alignment, not networking), virtual electrons, physics, quantum mechanics...

Similar for computers you see the high level stuff but then you can go right down to the wire level where bits are voltages +5/-5/0V and clock timing.

Although being too curious can be bad thing. You have to know when to stop or where to start. I think we've all heard stories of people troubleshooting who are very intelligent. They're tearing into a problem down to the hardware level or going over code line by line when all they needed to do was reboot.


> when all they needed to do was reboot.

That's not fixing the problem: that's ignoring it. Which might be the right decision if you need the computer to work now, but fixing it would mean it never happens again.


Are there any pointers you could share on what would be a good way to go down the electronics rabbit hole? I'm great with software, but still often notice a glaring lack of knowledge when it comes to hardware topics.


Fundamentals of Electronics is the textbook I used, or at least one I liked. One thing I'd strongly recommend is to brush up on your math if it's been a while, calculus included. Electronics is very mathy!


[dupe]

The other link for this is cpu.land -- stick to your 'official' urls! -- shared and discussed a few weeks ago

https://news.ycombinator.com/item?id=36823605


The guide is stuck in some false then versus now thinking.

A preemptible kernel isn't "modern". Cooperative multitasking isn't "old".

These are just choices in the design space that can be relevant at any time.

Fully preempted real-time operating systems existed many decades ago.

Meanwhile, there seems to be a renewed interest today in coroutines and fibers and such, and they are showing up in programming languages. Those mechanisms are forms of cooperative multitasking.

If you need an embedded system to do a few simple things, why would you threaten its stability with preemptive, interrupt-driven task switching?


This is really great. There are many developers with decades of experience that don't understand these things.


Count me in.


> The envp argument contains another null-terminated list of environment variables used as context for the application. They’re… conventionally KEY=VALUE pairs. Conventionally.

TIL! It seems a lot of (newer) programming languages make this assumption.

> …I love computers.

Me too, kid. Me too.


Hey. Nice work! I really enjoyed reading this. This is the kind of sensible, from-the-beginning writeup that I wish I'd had way back in the day when I first got started. It's a nice synthesis of a large amount of complicated information, and builds a good mental model for someone who isn't looking for a PhD-level understanding. I remember not truly understanding how, when I would load a program on my Commodore 64, "yeah but how does it actually know to run? All I did was read some data from disk and then it just, like, went." This would have provided exactly that kind of understanding.

Anyway, good on you. Your article was a real pick-me-up, it makes me happy to see this out there, from young folks. I enjoyed the hell out of it, and I hope you enjoyed creating it.


I am really happy to hear that, because that was exactly my goal! Thank you.


This is great work :) thanks for sharing it with us. The unexpected Linkin Park bit at the end was nice, but now I'm wondering if Linkin Park is considered 'oldies'


Linkin' Park is where you went when compiling a really large C++ app (before mold)


Definitely oldies now! Their hits are from two decades ago. If you were born in the 80s, Linkin Park is as old now as the Beatles were then.


The gap between Nirvana and now is significantly larger than the gap between Nirvana and Zeppelin, and the worst part is some kid might come along and not understand how crushing that is.


I mean, of course? AC/DC has got to be the "moldy oldies" of today, which leaves Elvis or The Champs nothing but genetic parasites to the degree that they are still recognizable. Wagner? Vivaldi? Ockeghem? Fuhgeddaboudit.

To be clear, that's the bizarre/absurdist arbitrary line I'm tracing to Linkin Park


This is a really cool project. It kind of touches on the history of computers as we know them now; a wealth of information can be found in classic books such as “The Art of Computer Programming” by Knuth. This is more related to how things work now, which has changed a lot over the years, but the basics are still there.


Great effort, but really still needs a bunch of work. In some places the sentences just don't make sense. I think in terms of CPU, a good starting point is the Von Neumann architecture which most computers are designed around.

The current description of CPUs mixes things up with OS concepts. You may want to look into things like memory management units (MMUs), which build on the basic Von Neumann architecture to provide "virtual memory" that an OS can use for process isolation. Also, the more capable CPUs have what are often called "modes" (protected mode in x86; ARM has a similar thing) to isolate running programs from each other and to create rings for things like kernels and device drivers.


There's a whole section on the MMU, virtual memory, paging, process isolation in there already?


I haven't read all of the article yet, but CPU modes are very clearly mentioned?


Would just like to comment that the presentation is top notch, really appreciate the level of detail put into it.


This site doesn't explain how CPUs run programs. It only explains the instructions that CPUs will run and some abstractions on top of that, but not how CPUs actually execute those instructions.


Nice one. You'll go far. Especially if you ignore the cranky resentful people on here


Great job! I'm an expert but you were right: I learned some things from Chapter 3.

A couple of very minor points:

"The first mass-produced CPU was the Intel 4004, designed in the late 60s by an Italian physicist and engineer named Federico Faggin."

The first microprocessor (CPU on a single chip) was Faggin's Intel 4004, but mass-produced CPUs existed before that. Earlier CPUs were built from multiple chips, and before that multiple individual transistors, and before that multiple vacuum tubes, and before that multiple relays (although it's fair to say that relay computers were never mass-produced).

"The CPU stores an instruction pointer which points to the location in RAM where it’s going to fetch the next instruction."

This is also called the Program Counter or PC outside the Intel universe. This is confusing as "PC" also stands for "Personal Computer" but people who learned computing in the days before Intel became popular still call it the PC register.


I am glad you enjoyed and learned something, thank you!

Thank you for the nitpick on the 4004! I think I will actually make a minor correction about that.

I know about the Program Counter terminology, and explicitly chose not to use it to be more architecture-independent... but maybe it was a mistake not mentioning it at all, considering it's such absurdly prevalent terminology.


What's your advice on how to teach programming to my kid? Curious to hear what another young person thinks. Kid is 11, has all the basics, knows how to type, knows how to open a terminal.

What things did you think were important to learn, what kinds of things motivated you most? What did you do when you got stuck?


I taught myself everything — my parents are not technical and sorta disliked computers (which I'm sure helped motivate me lol).

So what helped me most was just experimenting a lot, banging my head against a wall repeatedly. When I got stuck I just had to keep trying. Knowing I could Google problems was really helpful, but I also had to learn how to Google. I was motivated by the amazing feeling of finishing something I built, and also the fun of learning and then applying that knowledge.

It's hard to remember my tiny self, so I wish I could give better advice. I most of all wish that I had a mentor figure, someone I could ask questions, or who could show me where to look, or even just be a role model. Getting into open source and talking to people online was amazing, although I didn't start that until I was ~12.

It took me a really long time to get where I am. There are probably more efficient ways than banging my head against things. But it does work! So just encourage them to keep going, and MOST IMPORTANTLY, FOLLOW THEIR INTERESTS!


Not author, but I've always wanted to approach programming from an algorithms-first perspective with younger kids. Not called algorithms, of course.

If they can create/combine algorithms to solve a problem... that's most of programming.

I'd start with the "robot" problem: have them write a set of steps to complete a simple task, and then have them (or better, someone else) go through the steps precisely (no cheating and assuming they meant something they didn't write!). Then iterate and add/remove steps until the task is actually doable. (Disclaimer: idea cribbed from someone else)

That gets them to grok the "everything needs to be in a program, and a program is only everything that's in it" idea.

The traveling salesman problem (recast in whatever form would be most interesting to the kids) and graph theory problems are also especially visual and explorable.


Interesting! I've always been most interested in conveying information to people — human-computer interactions, interface and app design, educational writing like this article. Before I got into programming I was OBSESSED with this amazing circuit-building thing called Snap Circuits as a kid (highly recommend, definitely get a starter set for your kid if you haven't already), but even with that I just wanted to build fun systems — intercoms, doorbells, security systems, robots. From that I did more advanced electronics stuff with Arduino, and that's how I got introduced to real programming.


Snap Circuits looks awesome! LEGO + breadboarding

To date myself, for me it was the computer game The Incredible Machine, which was a Rube Goldberg physics puzzle game... in 1993 on DOS. ;)

Critically, the failure-iteration loop was tight, which really impressed "if at first you don't succeed, try try again" on my younger self.

https://m.youtube.com/watch?v=pTbSMKGQ_rU&t=27s


IME the key is to give them a problem they are excited about, and have them solve it. Much easier said than done :-)


Hi! Great job! I really enjoyed reading this mini-book and I'm recommending it to my friends in college. It mentions a lot of topics that we studied during an "Operating Systems" course.

We don't have HackClub here where I live, thus I never heard of it, but what they do sounds awesome and inspires me to promote it or even start it here (especially since there's no HackClub in Ukraine).

Thank you for sharing this experience, it was one of the best reads I had in a while!


Fantastic work by the author. The best part:

> I talked to GPT-3.5 and GPT-4 a decent amount while writing this article. While they lied to me a lot and most of the information was useless, they were sometimes very helpful for working through problems. LLM assistance can be net positive if you’re aware of their limitations and are extremely skeptical of everything they say. That said, they’re terrible at writing. Don’t let them write for you.

Good. Thank you for disclosing this.

Great to know that even teenagers like the author know the limits of LLMs and know where and when to use them and haven't fallen into the hype and mania in blindly trusting them, unlike the millions of so-called new wave of 'AI startups' out there.

Once again, fantastic work and keep it up.


I was about halfway through chapter 3 and thought to myself, "I haven't listened to In the End by Linkin Park in a while, let's spin that up". My reaction when I scrolled down ten seconds later... XD


The best resource I've found so far that helped me understand how computers / CPUs work, is Ben Eater's work. Both his 6502 computer, and 8-bit CPU series are amazing.

Looking forward to reading your guide. Thanks for sharing!


"... run programs ..." can be considered at a number of (abstract) levels, until the lowermost level is reached, which is machine language. (Actually, the preceding statement is not correct, since there is often, but not always, CPU microcode below machine language. Machine language is assembly language translated into ones and zeros that is loaded into the CPU. CPU microcode is a language used to control the internal CPU logic to execute the machine language.)
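To illustrate the "assembly translated into ones and zeros" step, here's a toy assembler sketch in Python. The mnemonics and opcode numbers are made up for illustration; real assemblers for real ISAs handle labels, addressing modes, and variable-length encodings:

```python
# A toy "assembler": assembly is human-readable mnemonics, machine
# language is the same program as raw bytes the CPU fetches.
# (Made-up opcodes, not a real ISA.)
OPCODES = {"HALT": 0x00, "LOAD": 0x01, "ADD": 0x02}

def assemble(source):
    machine_code = []
    for line in source.strip().splitlines():
        mnemonic, *args = line.split()
        machine_code.append(OPCODES[mnemonic])            # opcode byte
        machine_code.append(int(args[0]) if args else 0)  # operand byte
    return bytes(machine_code)

program = """
LOAD 5
ADD 3
HALT
"""
print(assemble(program).hex())  # 010502030000
```

Microcode then sits one level below even those bytes, sequencing the internal control signals that make each opcode happen, which is why the "lowermost level" claim needed the caveat.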


Wow this is amazing. The writing is actually really good too, I love how personable it is, in contrast with a dry boring textbook.

Love it, and wish I had this drive at your age.


This is awesome with such good presentation. Well done!

I would also recommend the linux programming interface book for those looking for a very in depth introduction to this topic.


What a lovely guide and the best part is that it is a genuine guide.

Many articles posted on the internet are meant for one thing: guiding a user to buy a certain service or product. But after reading the whole guide I only had the feeling of someone wanting to share knowledge.

The hackclub also seems to be a very interesting organisation and I hope more and more teenagers will join you guys and contribute to society as you have contributed by making this guide.


Very nice work, and I am going to save this copy for my 6-year-old daughter! Keep it up with your creative skills and deep learning!


Then there's the OTHER side, Flipflops, MUXes, ALUs, register files, combinational logic, yadda yadda, then you have your CPU.


This is fantastic. Great job! I like the easier-to-comprehend language compared to what you typically find when reading about this stuff (Wikipedia is one of the worst...).

You'd probably like Nand2Tetris. To be frank, I never finished it (it's so long!), but building a CPU from scratch was insightful (at the time - I don't remember much now, sadly).


I love this! Maybe dive a little deeper? Like, what happens from power on to login prompt. Like, power good > get instruction from predetermined address > execute (BIOS code) > init hardware > change to protected mode > load OS, etc.

Different cpus are different, of course, but they all go through a similar start up sequence. Again, nicely done.


"After executing an instruction, the pointer moves forward to immediately after the instruction in RAM so that it now points to the next instruction."

What is a pointer? How does it "move" inside the computer? This is low level stuff, but still probably somewhat mysterious to the layman.

Never mind the obscurity of the word "syntax".


That's true - some level of technical knowledge is assumed, but I'm fine with that. Every article needs an audience. I can't write something that's perfect for everyone.


I see it's already mentioned upthread and you've added it to your reading list - but I just want to add yet another recommendation here - Code by Charles Petzold is so, so good in its early chapters when it gets to the program counter and the fetch/execute cycle. I'm sure you'd really enjoy it!


Halfway through part one and I'm hooked! Thanks for writing this. It's something I didn't know I needed.


Very nice guide! I love your writing style, and your research skills are clearly top-notch. Great stuff!

You might enjoy reading some articles by Julia Evans (https://jvns.ca/), who has a similar informal, conversational, exploratory style.

Stay curious!


I am very glad you liked it! Julia Evans is one of my biggest inspirations ever. I have a lot of her zines. SHE ACTUALLY READ MY ARTICLE EARLIER AND I JUST ABOUT EXPLODED https://twitter.com/b0rk/status/1689331862487932928


WHOOOO! Congratulations, that's awesome!


Amazing article. I have not seen such a comprehensive guide yet. Although when I saw the title, I was expecting it to be focused more on CPU inner workings: superscalar pipelines, frontend, backend, instruction fusing, memory banks, instruction cache, branch prediction, instruction reordering, etc.


I read through the first couple parts, this is exceptionally well written, clear, and AFAICT, accurate. Nicely done.


Recognize OP is the same author for water.css. Thank you for this good writeup on how things work internally.


This is awesome, and so is your website. Programming, writing, flying, music, that's wild. Congrats!


Note, in response to the title, that I don't mean "awesome for a 17yo", I mean awesome, period.


Looks like a cool guide. Something seems off about this, though. Can't quite put my finger on it.


I thought it’d be lower level as it’s still very abstract and more of an operating systems guide than a guide on how CPUs physically run programs. Like how a string of current representing 1s and 0s allows the CPU to do certain things, etc.

But well organized nonetheless!


Kid, first of all, great job! Second of all, I am jealous you understand this at such a young age. I was pounding 40 ounces of Old English in the hood at 17, but I should have been doing something like this. Granted, the internet was just a baby when I was 17.


I like that Gen-Z. Back to the basics! Listening to 80s music! Go younglings, go!


I wish I was taught these things in my basic degree. Thank you for the writeup


When I was 11 I coded in assembly and machine language, what do I win?


Depends, did you do anything more than "Hello world"?


It's just someone impersonating a kid to get GitHub stars. I am baffled HackerNews bought it.


Seriously impressive, I'm 27 and I didn't know half the stuff you put in this guide.


This is really, really awesome. Excellent work. You’ve a bright career in front of you, I reckon.


Wow, found your github from this and getting insane imposter syndrome haha, very impressive work


Nicely done, congrats. A really good reference, and the style and flow is very nice.


Wow what incredible timing, I was just looking for something like this yesterday!


wow, 17! excellent work. You're going places for sure if you keep this up. I would guess even most working devs don't have any real concept of how things work at the CPU level, let alone during high school.


This is legit, nice work! I'm going to link this to some friends


I've only skimmed so far, but this looks really great. Good work!


Very nicely written. Great work! I'm passing it around. Thanks


Awesome! All I could do at 17 was chase skirt and get drunk.


You've found your calling! Keep up the amazing work.


Yo this is fantastic! Keep doing work like this :-)


Very cool.

What a novel way to show your skills to employers.


I'm 43 and wanted to say nice work.


How much help did you get from GPT-4?


I covered this:

> I talked to GPT-3.5 and GPT-4 a decent amount while writing this article. While they lied to me a lot and most of the information was useless, they were sometimes very helpful for working through problems. LLM assistance can be net positive if you’re aware of their limitations and are extremely skeptical of everything they say. That said, they’re terrible at writing. Don’t let them write for you.

https://cpu.land/epilogue#acknowledgements

To elaborate, I had perhaps 4-6 "conversations" with various GPTs. They consisted of me asking some question or expressing confusion about something I was having trouble researching in case the LLM could either pick up on my confusion and be helpful, or give me a better source to look through than Google. The latter approach never worked, it always made up bullshit, but the former did once or twice — before the conversation deviated into lies, at least, the models helped me get my thoughts straight.

At their best they felt like talking through a problem with someone smarter than me. At their worst they were a waste of time and actively misleading. They were usually at their worst. I did not use language models as primary sources for anything; where they helped me clarify my thoughts, that simply helped me know what to research normally, and the only other time I used them was to find a file in the Linux kernel that contained some code I was looking for but didn't know verbatim.

Otherwise, the article is entirely originally researched and certainly originally written.


Where did you hear Linkin Park?


There's a teens-react-to-LP video on YT. It's refreshing. It's how previous generations discovered older music. Also the one on Blink-182... it's like, "my Mom loves that song"... so a lot of times that's how.


I have 10,610 songs in my music library, I listen to a lot of different music! Linkin Park, however, is quite popular — of course, "In the End" itself was widely shared as a meme when I was younger, but I've also just heard it playing out loud in the world.


Mrmrmmm, the youngling, an engineer with skills in writing, is not just that. An effective PIRATE, and a connoisseur of fine music, she is also. Great things, destined to do, she is.


computers are _so cool_ :)


Nicely done!


Great work!


:x86:

Apart from the drawing boards at Intel when the 80286 was designed, can anyone find any evidence, or does anyone have first-hand experience, that rings 1 and 2 were ever used for any specific commercial purpose? (I haven't heard or seen of it.) 286 protected mode was generally a flop (outside of OS/2), while 386 protected mode was significantly better. The LDTR (-> LDT) and hardware task switching (reloading TR) aren't typically used. GDTR (-> GDT), TR (-> TSS) and IDTR (-> IDT) setup are essential.

The TSS includes where to find the 6-7 stacks, and a permissions mask for port I/O.

The double fault handler (in the IDT -> INT 8) is typically what leads to a colorful screen of death. When the double fault handler fails (or the IDT was overwritten), a triple fault happens and the CPU halts, or the virtual machine does its own impression of a colorful screen of death without (usually) actually killing the host machine.

An invalid opcode handler permits handling CPU-unknown opcodes in the OS such as using emulation or other hardware.

Note that generally most PCs prior to UEFI/EFI booted in real mode with 1 core running, so writing a parallel and concurrent OS typically involves setting up protected mode structures, switching to protected mode, and then talking to the APIC for every other core.
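For a concrete taste of that protected-mode setup: each GDT entry is 8 bytes, with the base and limit scattered across non-contiguous bit fields for 286-era compatibility. Here's a sketch of packing one descriptor in Python (field layout as I understand it from the Intel SDM; verify against Volume 3A before using it in a real kernel):

```python
# Pack one 8-byte GDT segment descriptor. The base and limit are
# split across non-contiguous bit fields (a 286-era legacy).
def gdt_descriptor(base, limit, access, flags):
    desc = (limit & 0xFFFF)              # limit bits 0-15
    desc |= (base & 0xFFFFFF) << 16      # base bits 0-23
    desc |= (access & 0xFF) << 40        # access byte (type, DPL, P)
    desc |= ((limit >> 16) & 0xF) << 48  # limit bits 16-19
    desc |= (flags & 0xF) << 52          # flags: AVL, L, D/B, G
    desc |= ((base >> 24) & 0xFF) << 56  # base bits 24-31
    return desc.to_bytes(8, "little")

# The classic flat 32-bit ring-0 code segment: base 0, limit 0xFFFFF,
# access 0x9A (present, ring 0, code, readable), flags 0xC
# (4 KiB granularity, 32-bit default operand size).
flat_code = gdt_descriptor(0, 0xFFFFF, 0x9A, 0xC)
print(flat_code.hex())  # ffff0000009acf00
```

That `0x00CF9A000000FFFF` pattern is the value you'll see hardcoded in countless hobby-OS boot stubs right before the `lgdt` and the far jump into protected mode.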

Resources:

https://sandpile.org

http://ref.x86asm.net/geek64.html

https://cdrdv2.intel.com/v1/dl/getContent/671200

https://www.intel.com/content/www/us/en/docs/intrinsics-guid...

https://www.amd.com/en/support/tech-docs/amd64-architecture-...

SVGA/VGA/EGA

https://wiki.osdev.org/Expanded_Main_Page

http://www.osdever.net/FreeVGA/home.htm

https://archive.org/details/gpbb20/

https://archive.org/details/programmersguidetotheegavgaandsu...

System management

https://wiki.osdev.org/Symmetric_Multiprocessing

https://pdos.csail.mit.edu/6.828/2018/readings/i386/s09_08.h... <- CPU fault handlers

https://ctyme.com/rbrown.htm

https://wiki.osdev.org/UEFI

USB

https://beyondlogic.org/usbnutshell/usb1.shtml

Emulators

https://www.qemu.org (qemu-system-x86_64 or qemu-system-i386 for very retro)

You'll usually want to create an ELF32x64 binary kernel image that starts with a bit of real mode and ia32, then mostly switches to 64-bit registers (IA-32e [amd64/x86_64]).


Meta point: I'm curious of people's thoughts when OPs in general post "I'm N years old..." or "I'm a blind programmer who..." before posting something unrelated to their life situation. Personally I think this falls into the "girls do not exist on the internet" rule of thumb. No reason to state that you're a girl (or 12 years old or whatever) other than to prime people to look upon your writing with lower standards. I think it is fine to mention your situation in passing or as a footnote if it is interesting, but unsure it really adds anything to lead with it (personally I think it detracts).

That's just my opinion. Anyone else have thoughts?

FWIW this seems pretty well written from just skimming the intro and a bit of chapter 3.

Edit: sorry I did not mean for this to be the top comment and push OP down so far. The writing here is really good and folks should definitely click through and at least give it a skim if they're interested. I was just curious what people's thoughts were on this topic.


I think kids tend to do it in traditionally adult spaces just as a buffer from the extreme criticism that can exist online. Saying "someone wrote a CPU guide" has a lot less impact than saying "a teenage girl wrote a CPU guide". I don't think the individual behind the content is all that important, and we have to recognize that we only ever see the kids that tell us they're kids; I could be an 8-year-old prodigy for all anyone knows.

Praise is nice, especially when you're young. Stating you're young has to dramatically increase the praise-to-criticism ratio in adult spaces like this. No one would care on reddit, but we're all old men here lol so the kids stand out. It can be overdone if the person makes the content about themselves instead of the content, or intentionally starts controversy. But this is fine.

Since she states she won a hackathon she's probably a very gifted and motivated young person who is still surrounded with adults telling her she's great, that doesn't mean they're wrong. And sooner or later the adults begin drastically raising the bar for praising you so it's cool for her to get her last couple years out of it.


I think this approach is pretty alien to me. When I was a kid, I hid the fact I was young on the internet because I wanted to be taken seriously and not be dismissed or treated with kid gloves due to my age. I didn't like that I could show a website I built to someone in real life and have them say 'oh wow, good job kid'. The internet is brutal, but usually pretty honest.


Encouraging youth is a good thing. But I suspect that the youth most in need of encouragement are the sort of kids who wouldn't even have the courage to post something to HN. Maybe I'm wrong. But I think that's why these things rub me the wrong way.

These things always strike me as merely fishing for praise, while trying to stifle legitimate criticism, which is unfair to everyone else, old and young alike.


Agreed. Honestly, I thought it was always about humble brag. For example: Look at me I'm a 5 yr old cat living in a $2M loft in Soho while I go to school in MIT riding my private helicopter. I'm such a success at cat.


Kind of a tangent off your point but you’re referencing the "girls do not exist on the internet" phrase like it was used by people to say don’t bring your identity onto an anonymous forum.

Was that how it was used? Back in my forum and chan days, when that phrase was common, I only saw it used to explicitly try to force women off the internet. It was usually quickly followed up with "tits or gtfo", for instance, and I never saw an equivalent push on anyone mentioning they were a man.


I've always understood "tits or gtfo" to suggest that if you are mentioning you are a girl, then do something related to it (show breasts) or else shut up about being a girl when it's not relevant.


That's on the charitable side. It reads to me more like "a woman's acceptance in this community is predicated on her providing sexual favours". But your interpretation still reveals an issue: men are allowed to express themselves in small (and often large) ways that are "not relevant", so why this (pervasive) attitude when women do it?


I can't recall a single instance where the woman in question delivered and then wasn't met with immediate derision and misogyny. It was a pretty explicit anti-woman statement.


I agree with your take. It also reminds me of those comments on youtube videos of classic rock songs (or any music older than 5 years at this point) that go along the lines of "I'm X years old, and I like this REAL music".

It always seems like the intent is for internet strangers to pat you on the back and tell you how special/precocious you are.


I know what you mean, but I see it a little differently.

1. I was a precocious kid, and as much as it might rub people the wrong way, there’s nothing wrong with being proud of yourself for that and bringing attention to it. The downside is that you can (at least some of us) have one hell of a shock once you’re out of the wunderkind age range and you’ve made that a core part of your identity.

2. Different standards are warranted with youth. That was a big reason why I prefaced everything with my age growing up. Programmers are notoriously nitpicky and curmudgeonly, but they do tend to have a special place in their heart for youthful curiosity and enthusiasm, and are more willing to give constructive feedback to someone in that phase than a mature adult who won’t just RTFM. I know I feel that way.


Mixed feelings, but overall, I think it does two things:

- promotes more civil discussion. Commentors are less mean, less dismissive, and when posting criticism it's usually more constructive

- gets the article more attention/traction than normal as a subset of the audience is impressed by the accomplishment/effort given the backstory

Which both benefit the author, and that's fine. We talk about hooks and marketing all the time here; this is harmless but effective, which IMO is the best kind.


Can't wait for people to start abusing this strat by lying about their age.


Yeah, for me the breakout interesting thing about this post was that it was written by a 17 year old.


Gently: Could we please not? :)

This is a young person who’s new to the community. That’s where communities come from!

Without other comment, this meta thread is pushing the author’s personal statement below the fold. As the grown-ass adults in the room, we can have our judgments, but we should try to make a little space, too.


I disagree with your premise of censoring as a "way to be welcoming".

There's a valid discussion to be had about the title and the need to preface things with age.

As a grown-ass adult, you're not making space, you're demanding a ranking that favours your ideas and, even more, the silence of some you disagree with.

The funniest thing is that you didn't even address the guide itself. The style or the content, you just came to do "justice" or something.

Content is good enough. Style is a bit "full of wonder and amazement" and has its own particular flair, but that's just personal style, regardless of age. You could even argue that the age is necessary context to account for the audience being other high schoolers, but that's only if you have the discussion you're asking people not to have (along with whoever agrees with your censoring mindset and downvoted OP to the point of effectively hiding it from view).


Thanks for your feedback! I’m not sure I’m following, but it’s always interesting to hear perspectives from people with different life experiences. Take care :)


Fully agree with this, I have a very similar response when I see these.

That said, the tone of my reply to OP would have been different if hadn't known their age. Or more likely I would have thought to myself "so you understand CPUs, cool" and moved on.


Doesn't the idiom "girls do not exist on the internet" itself point to the "standard" that, in these circles, men think of the people they are interacting with as men, as the "default", and the not-default is women? And we come back to the invisibilization of people who are not a young white guy in tech.

I'd wish to get as far away as possible from such an attitude, and thus I really enjoy it when I see people outside of this "norm" make themselves known and attract more heterogeneous people who can enrich the community.


I don’t think it’s intended as such, but the particular phrasing as expressed could be misinterpreted as such. Therefore it should be expressed differently; don’t choose to differentiate based on attributes that shouldn’t make a difference for your message.


I don't think it's any worse that appealing to authority ("patio11 said that ....") or referencing that you were in YC. In those cases as well, quality should stand on its own. Even so, I think context helps, and it's not always the same as having lower standards.


Not commenting on whether it's good/bad, but I think it's completely different to your examples. Both of those say "here is context of my past achievements; this is an indication that what follows may be of high absolute quality"

As opposed to the "I am N years old", which is saying "here is context, which doesn't by itself indicate achievement, which may be an indication that what follows is of high relative quality"


I don't mind. I will form some opinion after I see the work, and yes, my standards will be different depending on the life circumstances of the OP, especially when it comes to some disability which makes the work more difficult, like being blind. In such a case, mentioning it could be encouraging to other people facing similar hurdles.


I just wanted to give another interpretation of the phrasing used in the post. I took it more like: this should be considered more impressive because it is well written AND written by someone so young. And I suppose it's fair to feel like that as the author, 'cause it's a pretty good post.


> other than to prime people to look upon your writing with lower standards.

I have to admit, a fault of mine is that I sometimes do the opposite. When I read stories about someone younger than me achieving stuff, it's easy to look for faults to diminish the accomplishment.

"High school student made X? Pft, I found a small flaw when skimming the story therefore the whole thing is probably bad and I don't have to feel bad about not achieving the same at that age."


> other than to prime people to look upon your writing with lower standards

Yes, that's the point. They're 17, so my expectations are lower.

I am impressed a 17 year old wrote this website.


It's always great when I see someone's work, find it interesting, look the person up, and it turns out they're not the stereotypical demographic (e.g. a 20-to-40-year-old dude). But if I see a demographic factor paraded upfront, I feel like it is meant to compensate for lacking work (if not, then what is the purpose?), and I automatically counter-compensate, so it tends to lower my first impression.


This is very impressive at 17, as is her GitHub activity, so the age definitely qualifies as notable. We were fascinated with Mark Zuckerberg or Bill Gates at their ages, so this isn't anything new. Doing anything remarkable at a young age is impressive and worthy of being noted.

This is a young person making strides in a hard field, quite different from the majority of peers their age, especially today with short attention spans, TikTok, gaming, and other distractions.

Another point is many high schools tend to coddle their students even for very little accomplishments. I was surprised by the attention one 16 yr old student got for making an interesting yet simple science explainer about the sleep cycle. She won the award in the Breakthrough challenge competition, got a 250K scholarship, and given feedback to the tune that she would cure cancer.

There's also a huge competitive gap between US High schools today on average, than many other countries. It would be great for the US system to challenge HS students again and reward real accomplishment.


If you want people judging you, mention that, if the goal is for others to judge the work, don't.


On the other hand it is very positive because it inspires others in whatever “I am X” situation to get involved. It’s not uncommon for other fields like sports etc to have similar influence: “someone else like me is doing this so I can do it too!”


Everything is marketing.

The age thing is a bit of a red flag now given that too many parents in this industry try to give their kid an advantage by presenting their work as the work of their child. Seed the internet with all of the amazing accomplishments your kid "achieved" at a young age. I now look on the "I'm NN years old and I..." claims with skepticism, to the point where it's a net negative. It makes the project look worse.

Not in this case, though. It's probably a nice guide. I already know how CPUs work so I didn't read it, but it's entirely reasonable that a teen could do the work and assemble the details of how a CPU works.


If the HN community was kinder, this kind of caveat would not be necessary


Also, 17? At that point young adults have reached cognitive maturity[1], so perhaps that's a good time to drop the qualifiers.

1. https://pubmed.ncbi.nlm.nih.gov/30762417/


Internet is a different place now. People bring their personalities and identities in. Don't think there is anything wrong with it. Also nothing wrong with being more patient with very young people and more impressed with their achievements IMO.


I think some of the negativity towards identity is that it’s usually not something that can be merited as your doing and not usually relevant.

I understand this criticism, but think it’s a lost cause to point it out.


Maybe in this case it’s a way to avoid too harsh criticism. It’s like, “hey, if I’m missing something I don’t have a degree so don’t take this as an authoritative piece”.


If it serves like a content warning then sure, but it's dubious; a lot of outright junk appears here without anything like that. That's what discussion threads are for :)

And about avoiding negative feedback: well, I'm 36 and I don't have a degree... What can I do to avoid harsh criticism? I clearly can't use my age, right? (Right now I try to ignore it if it's not constructive, or try not to become depressed and use it if it is, but overall I don't think this industry is so harsh criticism-wise.)


A lovely piece of advice I came across from Larry King in his book How to Talk to Anyone, Anytime, Anywhere:

If you're about to give a public speech and you don't have much experience (and are thus anxious), come out and state as the first thing (something like) "Hey, I don't talk publicly and this is scary, but I'll try my best".

This immediately can make you less anxious as any mistakes you make can be understood (rather than criticized) by the audience.

When commenting outside my area of expertise I often mention my lack of qualifications so as to invite gentle corrections to my mistakes (and to make others not assume I speak from great knowledge).


I don't support racism, sexism, ageism etc. Therefore I would prefer it if people didn't try to use those attributes to draw attention.


What do you mean "rule of thumb"? That's just an old meme making fun of how most Internet users were male. It's not about how you shouldn't mention gender.

If gender is never mentioned, maleness is assumed all the time, and then male users will not realize the abilities of their female peers.


I think something valuable is lost if everyone must present irrelevant identity information to combat false assumptions some may hold. At least for the crowd who want to primarily be associated with attributes that can be merited to their choosing.


Rule of thumb comes from ancient measurement systems (it's basically an inch).


The only thing is I hate titles like this—the title is bragging a bit too much.

Maybe a title more like:

“How I’m helping my high school peers learn about CPUs”.

That way it implies you’re young, smart, and care about others—all of which I imagine are true :)


I didn't really like the title of the HN post either, but I don't think it needs much change. Just drop the "I'm 17 and" part and do "I wrote this guide on how CPUs run progra..."

The "I'm 17" part just felt irrelevant to me in a technical work. I.e. it's either technically correct or it's not. However, I can see a strong argument about why it should be in the title. If you care about meta details of the work, it's certainly different and interesting to be produced by someone so young. I suppose part of this is my own life experience talking, and very likely means I'm projecting. When I was a teenager I hated divulging that to others on the internet because they treated me differently when they found out how old I was.


Well—it could go either way, but I do think it is notable that the author is young and helping peers. Based on the original post, it seems important to them. But the exact number 17 isn't important either way!

I might click through and read an article to see what the youth of today are creating, but not so much care about an adult writing something on CPUs haha


It's not clear whether you're referring to the HN post title or the article title itself ("Putting the You in CPU"). In either case I think your proposed edit makes it much worse.

The HN post title is accurate and describes what they did. It makes a choice to center the writer rather than the guide, which has certain consequences about how readers will approach it, but is perfectly appropriate. Of course, your proposed edit makes the same choice, and is arguably more of a brag than the original.

The article title is cute, which is not what you would want for a dry technical manual but appropriate in this case, where the style is supposed to be fun and entertaining as well as informative.


The HN title I meant!


Except the title wouldn't be entirely accurate - there's plenty of "Senior Engineers" who are really just framework assembly liners who could learn something from this writing :-)


Hacker News automatically removes the word "How" from the beginning of submissions (and then Title Cases them for good measure) - so I believe submitting that title would instead result in this:

    I'm Helping My High School Peers Learn About CPUs
I might be wrong though, it might only do this with "How to ...".


Good to hear I'm not the only one who finds that form of title unpalatable. IMO one of the most annoying headline trends of the clickbait era.


'How to' is one of the title forms that isn't modified, currently.


I submitted https://simonwillison.net/2023/Aug/6/annotated-presentations... the other day - title "How I make annotated presentations" - and it was automatically re-titled to "I Make Annotated Presentations" - but then renamed back again, I presume by a moderator.

https://news.ycombinator.com/item?id=37024398


What happened to the Show HN: on this?


I took it off because reading material isn't supposed to be Show HN.

That's not any judgment about the content, which as far as I can tell is great.

https://news.ycombinator.com/showhn.html


[flagged]


The author uses the pronouns "she/her".


Yes. The author is, like me, a male trans woman.


Why do you think there are so many of you fellows in this industry, compared to the general population? It's a noticeable skew that has always baffled me.


We tend to be introverted (for reasons), and if you can code you're more likely to have the socioeconomic status to be able to transition, or rather the same conditions are correlated with both abilities. It's also a non-public-facing industry in general, which makes it less likely we'll be discriminated against in hiring or employment. So part of it is predisposition and part of it is simply greater visibility.


How do you know this, though?


Facts. What I wonder is if everyone else wants to believe or really just doesn't care. For some reason, I really hate being lied to.


It honestly isn’t important, so I can see why people don’t care. But it is interesting, at least to me. :)


I find it interesting because of just how over the top feminine it is. Basically, the most feminine people on the planet are men too.


Wow, so because you're 17 we should click on this article? I was also 17 at some point.


"Please don't pick the most provocative thing in an article or post to complain about in the thread. Find something interesting to respond to instead."

"Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something."

https://news.ycombinator.com/newsguidelines.html


True, sorry


Ah but did you make anything interesting when you were 17? Most people were 17 at some point, but most of them didn’t make anything particularly impressive at that age.



