"He started programming when he was 6" (mjohn5ton.com)
61 points by cloudmike on Sept 1, 2011 | 61 comments



I don't think most people are bragging about when they started programming.

For me, it's just a fact I throw out to emphasize that it's my life, not just a hobby or a job. It's seriously part of who I am, right to the core.

For the record, it was 4th grade for me... About 10 years old. Had it been younger, I'm sure I'd still have glommed onto it just the same... That was just the first exposure I got. Hint: Expose your kids to things young. It could make a big difference in their lives.


I agree with this. The author seems to think that saying you started programming early is some sort of statement about virtuosity, but at least for me, it was more about curiosity.


I can see how you got that. My argument was somewhat unclear, but the point I was trying to make was that celebrating the age at which people started programming makes it seem less accessible to those who don't program. And I think that's counterproductive to the goal of encouraging more people to program at all ages (which is a goal you may or may not agree with).

The discussions below about sports are interesting, though. Lots of kids seem to play basketball even though (or arguably because) the stars are put on pedestals. But not everyone plays basketball.


Totally agree with the 'seem less accessible' part.

Starting to program at the age of 6 is nice and good (if it's out of curiosity), but for me, I'd say: wait for some time, do the things you're supposed to do at that age, and when the time comes to be different, you'll know and follow it then. There is no point in bragging about when a person started to code. He might miss out on a lot of things as a child, which might make him a different person and constrict his 'social' network. Spending so much time in the programming world as a child is not desirable; after all, we are all programming towards making things work the way humans would, and we miss out on the 'being human' part by programming during the years when we are still growing as humans.


In all honesty, I only ever mention that I've been programming since the age of 6 when I'm trying to impress someone -- someone non-technical, at that, and very rarely. Technical people are not likely to be impressed, as they know exactly what that really means. Although sometimes it's a useful weapon in the breaking-gender-stereotype wars, too. "She was how old?"


I'm a big believer in the 10,000 hours thing, but people always forget the 'deliberate practice' part. The reason I'm still a bad driver after easily putting in 10,000 hours is that I've never tried to get better.

I started programming much later in life but my skills improved quickly because I made (and still make) an effort to put in 1-3 hours a night of practice, doing exercises, trying something new, etc. Writing 10k hours of code does not make one an expert (otherwise there wouldn't be so many awful programmers out there).

The real advantage for those who start young is that later in life they don't have the start-up cost of lots of fumbling and learning just to get started. Additionally, if they go the CS route, a lot of the intro classes will be easier, allowing them to get ahead faster if they so choose.

The same is true of musicians: think back to high school and there were tons of kids who played guitar, but the ones who went home and practiced scales all night, rather than just playing their favorite songs, tended to be the ones who went on to become serious musicians.


Absolutely right. I don't remember what age I started programming. I'm 34 now and I started by typing in BASIC programs from Byte magazine when I was in elementary school.

I was an awful programmer until I started doing it professionally. It was only then that I deliberately worked at improving at it.

Now I'm in college but if I had gone after high school my only advantage would have been a familiarity with the tools and syntax.


"Programming is so hard that in order to be good at it, you should've started when you were young."

Strawman.

Putting in 10,000 hours on _anything_ is so hard that you should've started when you were young. That way when everything else in life (schooling, emotional & physical maturity, legalities, social networking, etc.) converges you're ready to take full advantage of the opportunity at a point where most would-be competitors are just getting their bearings.

I started programming in 4th grade. For medical reasons, I bypassed the usual school sports route and got in 2-6 hours of programming every day. I sold my first program before leaving high school, having already gotten my 10,000 hours in. My first two jobs were as a programmer - before college. Not bragging; making the point that there's no way I could have been that far ahead of fellow students without having started at age 10. It's not that it was hard, it's that I couldn't have gotten to that level of mastery of _anything_ by graduation had I not started early.

Kicking numbers around, I notice that 4 years of college amounts to cramming that 10,000 hours in on the subject of your choice once you realize you in fact need to master a subject to make a living.


I started "programming" when I was somewhere around 6, when my stepfather made me type 4000 line BASIC programs into his Commodore 64 out of a hobbyist magazine. He taught me very simple debugging (syntax error on line 1230? 1225 GOTO 1235 and retype that line on 1235).

I barely understood what the code was doing, but it ingrained in me from a very young age that programming was something I was capable of doing. I never thought that programming was some mystical art beyond the capabilities of mere mortals; it was just another toy to play with.

Later on in 1st and 2nd grade, I used to impress my friends and teachers by writing very simple choose your own adventure games in BASIC during computer class. This taught me that not only could I do it, it was something that was WORTH doing, and later in 5th-6th grade my group always had the best Logo programs and Lego Logo creations. When I got my first modem (300 baud), AT commands didn't faze me one bit and I discovered the magic world of BBSes, text-based games on GEnie, and eventually the Internet. Teaching myself HTML in 8th grade was a snap, and my friends and I used to always try to one-up each other with our webpages (I was so jealous when my friend Dylan got on Cool Site of the Day with his paper airplane page).

I honestly don't think that I'm smarter than your average hacker or that I have some innate aptitude for programming. I would say that learning even simple programming at a young age is extremely valuable because it instills a hacker mindset in you. Like the article says, if you can read, type, and think, you can program. Starting from a young age, you never question that. The biggest obstacle to learning how to program when you're older is convincing yourself that you can actually do it.


I started programming when I was about 8, but I'd never admit it because that means I've been programming for over 30 years and I'm still this awful at it.


Indeed. "I've been doing X for Y years" doesn't tell me anything about how good someone is at X, but it does set expectations about how much their level may change in the near future.

If you've been writing C++ for a year, there's a good chance you'll get much better at it within the next year. If you've been doing it for 20 years, chances are your improvement over the next year will be marginal.


I started when I was ~6 and I'm probably worse!


I think the key word here is "intensity".

For example, it is typical in martial arts to hear people bragging that they did Kung-Fu for 5 years or Karate since age 10. But if the training was nothing more than a 30-minute warmup once or twice a week at a downtown studio with a "World of Karate" sign, it is actually worse than no training at all. People with that kind of training history genuinely overestimate their ability and put themselves at unnecessary health risk.

Someone who followed peer-reviewed journals on software and design and implemented complex systems over a 5-year span is simply going to be better than someone who did HTML/JavaScript websites for 20 years, no matter what their age difference is or when they started programming.


I'm curious here: would HNers please share their stories or knowledge of people who are good programmers, and who started 'late' in their lives? (Say - 18 and older, as an arbitrary metric).

Of course, what a good programmer means would probably vary from person to person - I'd appreciate it if you mention in passing what you mean by it.


I didn't do any programming until my senior year of high school, in a terrible intro to C++ class. I learned very little; I produced some small programs that solved brain teasers, but I had very little understanding of what I was doing. I had almost no understanding of how the program worked, no concept of the abstractions I was using, and no understanding of how a computer system worked. I didn't truly start programming until my freshman year of college. (And 11 years later, I actually taught a section of that college course.)

I consider myself a good programmer. I have an understanding of the whole system stack (processor, kernel, the kind of code the compiler produces to support the abstractions I use, and understanding of the semantics of the languages I use), and I can reason about and implement non-trivial systems with understandable solutions.

My papers and some code are available here: http://people.cs.vt.edu/~scschnei/


In the companies I have worked at, there were always a bunch of 45+ programmers who were quite good at it. Most of them seem to have started only in their early 20s. (Anecdotal of course)


I learned to use a computer at 18, when I started engineering at NCSU. I started programming at 19-20. I knew nothing, other than I loved it. Haven't looked back since. That was 18 years ago.

I was a core contributor to Lotus Notes and Domino and to MySQL, I am the creator of CouchDB, I'm well respected as a programmer and architect, and I'm the founder of a startup that's doing very well. I'm still learning and still love it. I plan to do this for another 20 years.


My current team is fairly old - most of us are in our 40s - so most of us didn't have access to home computers when we were really little, and programming wasn't taught in most schools.

I'd say all of them are good programmers by my definition but we're doing embedded development in C so there's not much going on with web stuff, databases, dynamically typed or functional languages.


I started programming with Arduino and Processing in my last year of college, then found out I could get paid to write awful PHP, decided I didn't like doing that, learned Ruby, and now write MVC JavaScript for a living. I've been out of college for 2 years.

That being said I do have vague recollections of MicroWorlds in fourth grade.


I was 19, my second semester in college, when I started programming. Incidentally, I've heard from at least 3 other successful women programmers that they started in college as well. (I'm defining "successful" as "speaks at conferences, generally well known and respected in their fields").


I was taught BASIC when I was around 8 and "helped" my dad with homework when he was in college for a CS degree, but I never did anything with it. I started taking CS courses as a continuing student at William and Mary in 2005 and found I really liked it and was, in fact, good at it. My parents had told me for years that I would be great with computers, but I "knew better" and fought them until I wised up.

6 years later, and I've got my M.S., good work experience, job offers when I'm not even looking and enough difficult programs under my belt that I know I measure up well to the stack of people with my level of experience.


One of the best programmers I've ever worked with didn't start until a year or two after college; he had degrees in theology and philosophy, and was working as a graphic artist at the time.


Sidebar: you'd have to have access to a computer at 6 to start programming at 6. This was not a given even for a lot of Gen Y. Not to mention the parental (or other) mentoring and guidance that leads to a kid pursuing programming as a hobby at 6. While this comes easily in middle- and upper-class families, not every capable kid is going to grow up in that kind of environment. IMO, this is where access to technology becomes an equal-opportunity issue, as the paths to these high-tech jobs and to entrepreneurship now often pass through these formative years in childhood.


Back in my day, the BBC Micro ecosystem meant that your parents had very little to do with it -- you had access to a good programmable computer at school. I was lucky enough to have a ZX Spectrum at home, which my (decidedly not middle/upper-class) parents bought to keep me occupied with games, and which I figured out how to program; but there was no mentoring, guidance or stacked decks of computer science PhD parents in my history, and neither in many of the histories of similar inquisitive kids who had a manual and a basic home computer.

I'm actively thinking about ways the technology of today can be used to bring back this golden age of opportunity; thoughts are welcome.


This. I first did programming when I was 11, using BASIC on our first computer; it was not an encouraged hobby. I only programmed regularly once I got a TI-83. But this is just to emphasize your point about children's access to technology and parental encouragement.

But since this is so much of an ego-stroking thread, I'm going to characterize the latter as "early entry into mobile".


Well, the analogy "I started talking at 2" is silly, because everyone starts talking around that time (assuming normal circumstances), it's built in, while starting to learn other skills at a very young age would sound impressive. Consider someone who says "I started to play the piano when I was 4", wouldn't you be impressed?

I think we all agree that the earlier you start something, the better you get at it; e.g., they say that by age eight it's already too late to start learning the violin if you want to become extremely good at it. Learning a language is like this, too. Is programming, or for that matter math, one of these skills? I think early exposure definitely helps, but it's by no means a determining factor for later success.


I have two responses: (both in agreement)

1) We don't see such comparisons in some fields, which I find curious - tennis, for instance, is one where all its stars started playing at a young age. Nobody really comments on the age these players started playing anymore, since it's taken for granted that to be good you have to start really young.

2) G. H. Hardy makes a somewhat related observation in A Mathematician's Apology:

To take a simple illustration at a comparatively humble level, the average age of election to the Royal Society is lowest in mathematics. We can naturally find much more striking illustrations. We may consider, for example, the career of a man who was certainly one of the world's three greatest mathematicians. Newton gave up mathematics at fifty, and had lost his enthusiasm long before; he had recognized no doubt by the time he was forty that his greatest creative days were over.

http://amathematiciansapology.pandamian.com/4/#p[IhbHmb],h[I...

In fact, I would think that chess and math and perhaps programming(?) are more similar, and starting young seems almost to be a constant amongst its greats. So I'm really curious if anyone here on HN can point to a counter-example.


I think you have to start very young in most sports now, not just tennis. I know swimming is that way, and Michael Phelps is a bit of an exception because he started training at the late age of 7!


A corollary to this is "I've been doing this for 20 (or 25, or 30, whatever) years..." whenever someone is trying to establish themselves as an authority.

My standard response is a knee-jerk: "You can be bad at something for 20 years, it's not something you should point out, though."

It's fine if you are having a discussion and some context is helpful, such as, "I've been selling residential real estate in the area for about 20 years" as opposed to "Why would I need to protect against SQL injection? I've been working with databases for 20 years and it's never been a problem."


Most of the people I know started coding before 14 years of age. I think it's difficult for older people to learn simply because older people don't have the time to commit to coding, since there are other obligations, like work, etc.

Wish they made a website like that for C++.


That's selection bias. I'm guessing you don't know many people who learned to code something real after finding that VBA was a few miles short of where they needed to go.

Work in a non-computer industry and you'll see several people who learn to code at 25+ or even 35+. They often don't take huge amounts of self-esteem out of it (it's just another thing they do at work) and don't go hog wild for "geek culture", but they program nonetheless. They often think people who do go far into that stuff will be contemptuous of them, so they may be hiding it from you. I know I have a reputation for being nice to everyone but liars, and I've had lots of questions from random professionals about their personal coding.

There is little that is special about this thing we do in the grand scope of things that humans do.

BTW, professional assistant/secretarial types love them some VBA/bash scripting once they figure out what it is (a way to do less repetitive work). I've seen 3 take up automation of Excel or Word.


My grandpa started coding when he was 80 and he quickly got the hang of it. He's dead now, though, but I don't think there's any connection.


If you chimed in on this thread just so you can say what age you started programming, then chances are, this behavior is just exactly what the author is talking about.

What annoys me the most is when these forms of expression are injected as a way to gain legitimacy, because quite frankly, it's just narcissistic and shows insecurity.

It's in the same league as saying 'oh, look, I believe my penis is actually bigger than yours.' -- 'Oh hmm okay, cool, good for you, you are so special.'

SHOW ME THE FUCKING CODE!


Well I've had the pleasure of working with someone who started programming when they were 6, is proud that they did so, and seems to believe that all you need is a belief that it can be done. Said individual was very good at delivering cut-and-paste work and "hacking" it so the page loaded from a straightforward database. But the first sign of a business requirement beyond this resulted in delay after delay, excuse after excuse and promise after promise. Eventually they were let go. The guy could talk the talk all day, and get the first week or even month of tasks done. But beyond that, a total failure, and on increasingly critical tasks.

So, no, someone saying that they've been programming since they were 6 is not a useful indicator.


I don't know if there is any correlation between the age someone started coding and the quality of the code they produce, but you can be sure that someone who programmed as a kid enjoys the practice[1], which is orthogonal and also important.

[1] or at least used to enjoy it


The age you started programming doesn't matter.

Totally agree. I don't even remember when I started programming. My parents say I was 3 or 4 - before I can remember, in any case (supposedly on a VIC-20, though the first computer I remember was the BBC Micro!). And while it may have helped me develop an interest in programming later on, I'm pretty sure it hasn't helped my aptitude... hard practice and effort, even as an adult, are the only things that help there.

There are plenty of people who started coding as teenagers or even adults who could code me into a corner in seconds. Starting at a young age may foster the initial interest but it's not going to help the long-term skill IMHO.


This is spot on.

The whole "I've been programming since X" attitude is BS and does nothing but continue stereotypes, I suspect stuff like this is a big part of why there's such a lack of diversity in CS.

As someone who "started programming at 14" and a current CS undergrad, I know enough to know that I know next to nothing. I know plenty of people who'd never programmed before college who are way better at it than me.

That said, I would really love to see programming taught early on, since I'm sure a lot of people who'd really love it get put off by this sort of attitude and just never get exposed to CS (every chance I get, I point people towards my school's excellent intro classes).


The age you started programming doesn't matter.

----------------------------------

Sure it does ... if someone has been programming pretty regularly from age 6, then by the time they're 20 they have 14 years of solid experience on someone who starts at that age, plus you pick things up much quicker when you're younger. It's the reason a lot of sports professionals (and even musicians) start out pretty young.

Starting at a later age doesn't mean you can't be very good; it's just that, all things being equal, the person who started earlier has an advantage over you. Let's not act like that doesn't matter ;)


Good point. I was interested in computers all my life.

I went to a computer class when I was about 14 and was first in the class (they taught how to use MS Paint and how to play Prince of Persia and Solitaire; I already knew how to use MS Office). Then the summer class was over, and from then on I only used computers sparingly, at browsing cafes.

After seven long years, I got my own PC. Then I learned a lot of things, everything on my own, with the help of generous people on the interwebz. Three years after that, I started my first company, with my first computer. When I started, I was proficient with HTML/CSS and fairly okay with PHP. I've built real websites and worked for real clients, all because of 3 years of my hobby time.

/End_Brag

My point is, if you like programming or hacking on tech stuff, then the amount of time doesn't matter.

The only envy I have of coding kids is that they are free, in the sense that they don't have the pressure to earn money that we adults have, so they can enjoy the fun part of programming more.


I disagree with this on so many levels.

"My biggest problem with celebrating the young age at which some people started programming is that it sends the wrong message. It says: 'Programming is so hard that in order to be good at it, you should've started when you were young.'"

In what way is this the "wrong message?" I think it's perfectly fair. Yes, programming (well) is difficult, and yes, starting at a young age will have more of a transformative effect on the way you think.

As the son of a computer programmer and a math professor, I was exposed to large amounts of math at a young age, and learned a little calculus before finishing elementary school. High school didn't have anything more advanced than calculus to teach me, so my peers (the math majors among them, at least) have more or less caught up to me in how much math they've been exposed to. But - in my opinion, at least, and maybe I'm just a narcissist - I'm fundamentally more of a quantitative thinker than they are.

"Were you programming non-trivial apps soon after you wrote your first line of code? Did you immediately proceed to program lots of apps, even complex ones, all by yourself, and collect payment for your work?"

Most people who start programming young make games, because that's what kids think computers are for. As a kid, I made scores of computer games. Games are complex and, when done well, independently developing games can be extremely stimulating. Before I turned 10 I was tackling things like I/O, creating intuitive user interfaces, sound, 2D graphics, and rudimentary AI. In each instance I had few resources and no training so I had to come up with ad hoc solutions myself. That I went through this process, and that I did it young, has been a huge benefit to me throughout the rest of my life.

And, side note: payment makes absolutely zero difference to your ability to do something well.

Disclaimer: I started when I was 7, so I may be biased. Not to brag or anything.

Edit: if you disagree, a comment is more helpful than a downvote.


I think it's only the "wrong message" to people who aren't young anymore and don't program yet, which is most people. It makes programming seem less accessible.

To young people, I agree: start now.

I realize this seems circular, and there's probably a better way to make my point. My motivation is all the non-programmers I meet who think programming is scary. I don't believe it is (hence the mention of codecademy.com at the end of the post).

As for payment, you're right: making something people want is a better measurement of value, regardless of whether others pay for it, and even that's not a complete measurement of your ability. But Mozart made money early, which is why I mentioned it.


Counterpoint: I have been programming since I was 6. I skipped a grade in school and was still in more advanced maths courses than anyone in my grade, though not as far ahead as you seem to have been.

The only bit of this I usually ever mention: the bit where I've been programming since I was 6. And I don't say that as a form of bragging or a form of indicating my skill; it is merely the first sentence in response to the question "how did you get into programming?"

The main reason for this, though, is that I'm embarrassed by the fact. I feel like, if I've been programming for so long, I should be better at it than I am. There seems to be an expectation that, just because I taught myself how to program using little more than the source to BASIC games, I should be a super-rockstar or something - and I'm not.

Point is, I don't mention it because the fact I've been programming so long doesn't really mean shit.

[EDIT: I also know a number of people who never touched code until college and got really damn good before they graduated, let alone after that.]


I can see why you would argue that you need early exposure to become a programming prodigy, but I honestly don't see why you would say that all hope of becoming a "good" programmer is lost by the time you're in your early/mid/late teens (or at any other age, for that matter). I always saw programming as a meritocratic field where motivation and determination get you a long way, as long as your code goes the distance. As a CS college student, I see a fair number of people who have been coding for years being blown out of the water by people who hardly programmed before college. They might not be prodigies, sure, but saying that they'll never become serious programmers is a pretty narrow-minded statement.


The age you started programming doesn't matter.

False. But it is only one indicator, and people lie.

I believe that there is natural aptitude, there is experience, and finally there is motivation.

The age you started programming is most certainly an indicator for experience, and also I believe motivation. But it is only an indicator, not an accurate measure. If someone started programming at age 6, but didn't do anything with it till college, then they have less experience than a social outcast who came home from school every day and programmed to the detriment of homework. It would also suggest a lack of motivation.

Logically, the best programmer maximizes all three, but how does one measure them and how common is such a person?


I often say that as a starting point to establish my background, but I follow it up with: in my teens I wrote shareware games for pocket money, I did comp-sci for 3 years in high school, and after school I did a 3-year course in IT. And now I have been doing commercial software development for nearly 18 years.

The advantage of starting early, of course, was that I looked like a boy genius when I first started working in the industry, but actually I just had more experience than my peers, and I have always been able to stay ahead of the curve since then :-)

Scarily, this all means that I have been coding almost daily for 30 years, which is conservatively more than 35,000 hours...


Haha... a little ominous but also kind of encouraging. I started at 27 and am now 29. My path has been: biotech grad in an industrial lab, kindergarten teacher in Japan, handling manufacturing and accounting for a Japanese toy camera maker, PHP hackishness, and now a joyful obsession with Ruby and JavaScript. By all accounts I should give up now, but it's just too much fun. Oh, and when I was 6 I mainly liked Lego, futzing about with a chem set, digging trenches, searching for frogs at the river, and exploring forests with 2 dogs. Not exactly career-enhancing activities, but I'd say it was time well spent.


The best programmers I've hired were those who love programming. That's probably the case for any skill. The critical question I ask is whether the developer has personal programming projects. It's usually an indication that the person breathes/loves the stuff.

Starting at an early age is great but doesn't mean much. Continuing doing it with dedication and just for fun into adult life is awesome. If you're still sticking with it professionally and still liking it, chances are you're a great developer. But of course, that kind of path works from whatever age you start getting into it.


I absolutely agree with this article. I had never done any programming before college (6 years ago). On the first day of my freshman year I began learning HTML. It was completely new to me at the time. After graduating, I noticed that coworkers 1.5 times my age, who had been programming since middle school, were frequently coming to me for help.

The reason, I believe, is that our industry is constantly evolving. Once you've learned the basic concepts of programming, the only thing left to do is to keep up with new technologies as they come.


> How silly would it be if we bragged this way about other things? "I've been talking since I was 2!"

Actually not silly at all if the language is something your job requires, like a second language.

Experience matters. A lot.


Another true gem is "The team has 120 years of combined experience between them". Never could quite figure out what a person who wrote something like this meant to actually say.


I started programming (in C64 BASIC) at 3, but I didn't get really serious about learning to program well until my twenties. I think the latter experience (and intensive self-study program) was a lot more important.


Hmmm, this is probably a good way to get in touch with hackers on HN who actually started programming at the age of x, don't you think? Are you hiring at the moment?


Did my first programming at age 10 or 11, but I can't say I wrote any decent code until I was 20 or something.


What total BS. The truth is that if you ask good programmers, they almost all started before they were 15, most before they were 10. Almost all contributors to Debian are the same way. The vast majority DON'T talk about it, it's assumed.

It's pretty damned rare for somebody to learn late in life (after 18). That's the hard ugly truth of the matter.


Late in life is after 18? Where in the world did you get this number from? Are you saying that if you're 20 or older you can't learn anything else?

Yes, it's true that a lot of programmers started early, but please - you are making the correlation == causation fallacy here. There's NO proof that starting earlier means you're better. I've met some really good programmers who started later in life because they thought they wanted to do something else, and when they found programming, it clicked, and they went crazy with it.

Time spent doesn't mean you're better after that time. Time spent trying to get better DOES mean you'll get better over time, and that has nothing to do with how early you start.

This also isn't only for programming but any profession.


I don't see where the post you replied to implied causation.


What I read is he said "as many good programmers started programming young, it must then be true that starting young makes you a good programmer."


What are you talking about? I've met plenty of excellent programmers who started when they were 18/19. I'm sure it's possible to learn later as well.

Programming well is hard, but not that hard.

Also, Debian contributors aren't a really good group to take samples out of...


Have you ever asked bad programmers when they started?


I actually did once. Worst C++ coder I ever worked with got a 5 on the AP exam in high school.


Curiosity and prodigy go hand in hand!



