I think programmers tend to underestimate the difficulties involved with becoming a good programmer because once you're good, you only see the even steeper learning curve ahead of you.
Some of the smartest people I know work in other domains: biology, chemistry, and even physics. They are sometimes baffled by tasks that seem trivial to me, and I'm under no illusion that I'm more intelligent than they are. I simply specialized and focused only on programming, while they program to accomplish other tasks in their domain of expertise.
Can this last forever? Of course not, nothing lasts forever. But wondering why the wealthiest corporations in the world pay their workers high salaries is perhaps like wondering why water is wet. Software has a low marginal cost, and the rest is basic incentives for the corporations.
> I think programmers tend to underestimate the difficulties involved with becoming a good programmer because once you're good, you only see the even steeper learning curve ahead of you.
Good programmers I know also overestimate the skill needed to earn a high salary in this job. You don't have to go far up the learning curve; these days, you just teach yourself a bit of JS and go for a webdev job, making shit code and still earning more than most people in a given country.
> But wondering why the wealthiest corporations in the world pay their workers high salaries is perhaps like wondering why water is wet. Software has a low marginal cost, and the rest is basic incentives for the corporations.
Nah, that's like wondering why this ice block sitting on a hot plate is still solid. The answer: it just got put there, and it'll melt in a moment. So too will high salaries end, as most of the low-hanging fruit gets eaten by software made by a mass-produced cohort of programmers.
> Nah, that's like wondering why this ice block sitting on a hot plate is still solid. The answer: it just got put there, and it'll melt in a moment. So too will high salaries end, as most of the low-hanging fruit gets eaten by software made by a mass-produced cohort of programmers.
Our industry has its share of cycles, but this, in my view, is largely wishful thinking on the part of people. Nothing wrong with optimism but...
Every 5-10 years there's a "technical shift" that forces everyone to reevaluate how they build software, or more importantly what they build, and the race starts all over again. The ice block is removed from the hot plate and replaced by a bigger, colder block of ice. And when these technical shifts aren't taking place, the bar for what constitutes "good software" inches upward.
If your standards for acceptable software were frozen in time in 1985, then using modern hardware and software toolchains, you could accomplish in one day what used to take a small team an entire month. But if I delivered, say, what passed for a "good enough" database in 1985, it would resemble someone's 200-level CS final project rather than a commercially viable piece of software.
I have not noticed these technical shifts per se. What I have noticed is that mature engineers move on and do other things, and new ones reinvent the wheel with some new fancy language or term, which then becomes the new way of doing things, and the cycle repeats. Sometimes there is a great deal of value when a new level of abstraction happens, but I wouldn't call this a shift; it's just progression.
Many of the underlying problems and solutions have existed for decades. The database systems you mention are a good example of this.
When I talk about shifts, I'm referring to things like the proliferation of smartphones and tablets, which increased the net demand for software and brought with them entirely new specializations of knowledge.
While there are some key concepts to things like databases, the fact remains that your 1985 database would not be considered sufficient in today's world: it would have too many limitations, lack features we now take for granted, would not scale to modern data requirements, etc. Supporting all that "modern" functionality is non-trivial and requires a huge amount of effort. You can't just say "Well, we figured out space- and computationally-efficient hashing, so relational databases are well on their way to being feature-complete."
There's a reason we haven't stuck with 1.0 on our platforms, and it's not just security or a desire for a bigger version number: New demands required new functionality and new ways of building things.
"When I talk about shifts, I'm referring to things like the proliferation of smartphones and tablets, which increase the net-demand for software and along with an entirely new specializations of knowledge."
iOS was basically AppKit, so anyone already developing for the Mac knew most of what they needed to know to develop for iPhone.
Pretty much every programming innovation is incremental, and doesn't require throwing out all of your previous knowledge and starting over.
> iOS was basically AppKit, so anyone already developing for the Mac knew most of what they needed to know to develop for iPhone.
Maybe. But AppKit was not the Mac Toolbox.
When my career began, being good at memory management was a skill to be proud of. I would say now, being good at concurrency is a skill to be proud of.
I don't really have to worry about memory management any longer, but I also didn't have to worry about threading when I started my career.
As I see the younger generation entering the programming field I wonder in what ways the craft will be different when they've had a few decades under their belt.
Will parameter tuning and dataset curation for machine learning be the coveted skill? Who knows.
MongoDB didn't become a public company through innovations in fundamental distributed database technology or even through good engineering. They became a public company because once Javascript became adequate for building client software, there was a strong incentive to build MVPs using the same data structures from client to server to DB, and once you build an MVP that gets adoption there's a strong incentive not to switch databases.
That's the sort of shift in the environment that the grandparent is talking about. Fundamental CS tech was arguably better in the 1970s and 1980s, because it moved more slowly and you had time to get the details right. That doesn't matter if you're building say a mobile Ethereum wallet in 2018, because you're building for the user expectations of today, they don't care about data integrity or security as long as it doesn't fail during the period where they're deciding which tech to use, and software that solves the problem (poorly) now is better than software that doesn't exist.
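To make the MongoDB point above concrete, here's a minimal sketch of the "same data shape from client to DB" incentive, assuming a local MongoDB instance and the pymongo driver; the collection and field names are hypothetical:

    # Minimal sketch of the "same data shape from client to DB" incentive.
    # Assumes a MongoDB instance on localhost and the pymongo package;
    # the endpoint and field names are made up for illustration.
    from pymongo import MongoClient

    def save_signup(payload: dict) -> None:
        """Store the client's JSON payload exactly as received."""
        client = MongoClient("mongodb://localhost:27017")
        db = client["app"]
        # No ORM, no schema migration, no relational mapping: the dict the
        # frontend POSTed is the document the database stores.
        db["signups"].insert_one(payload)

    save_signup({"email": "a@example.com", "plan": "free", "referrer": None})

A relational store would demand a table definition and a mapping layer before that MVP worked at all, which is exactly the friction the MongoDB pitch removed.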
> Fundamental CS tech was arguably better in the 1970s and 1980s, because it moved more slowly and you had time to get the details right. That doesn't matter if you're building say a mobile Ethereum wallet in 2018, because you're building for the user expectations of today, they don't care about data integrity or security as long as it doesn't fail during the period where they're deciding which tech to use, and software that solves the problem (poorly) now is better than software that doesn't exist.
I believe you are a victim of survivorship bias.
There was plenty of shitty software in the 70s and 80s. The difference between then and now is that four decades haven't yet passed, so we can't see which software of 2018 stands the test of time.
I agree; back then there was not a mindset of move fast and break things. It was foundational research, and a lot of the results and learnings from then are still applicable today.
In the 1980s there was a lot of "foundational research" (poorly re-inventing the wheel) for microcomputer people who did not know about the work done on large computers in the 1960s. Move fast and break things was also very much a thing for microcomputer manufacturers and most microcomputer software vendors. Look at how many releases software packages went through, and at what rate.
I think you two are talking past each other. His viewpoint, to me at least, is opposite to yours, or at least parallel. He never said that today there's no foundational research.
And the mainframes that run our banks, transportation systems, healthcare, public safety.. etc etc. Use the right tool for the job, price it against what the market will bear. Pacemakers and insulin pumps driven by npm updates - shudder -
It's before my time, but I'm pretty sure people had at least an intuitive understanding of what partitioning does to a datastore before Eric Brewer wrote it down.
I'm not sure why you get downvoted, I'll upvote you.
The "modern" database systems are now going back to the exact design principles that the books you refer to solved long time ago. There is tons of research, dissertations,.. that focuses on this from decades ago.
Its just now that the new systems realize that these problems actually exist.
If you don't know the history of a field and what came out of it, you repeat the same mistakes. This seems to apply to software engineering too.
Yeah, distributed database systems from the late 70s and early 80s actually had certain transactional guarantees that some of these "modern big distributed systems" you refer to still don't have.
In maturing parts of the software industry you'll often see a desire to stay with the times in order to maintain a competitive edge, re-inventing the wheel often looks like full/partial re-writes of a system for minor marginal gains.
A great example of this is the evolution of FB/Google/Amazon. Portions of their core tech have been completely re-written over the years for marginal gain, but there is a large premium to being the best in tech.
In other parts of the industry every new cycle enables some new area of tech, and those marginal gains become the minimum bar for entry. e.g. Deep Learning and Computer Vision, distributed systems and cloud computing/SaaS.
You overestimate the quality and reliability and expandability of those old systems. The 1998 10 Blue Links tech couldn't support the functionality and scale of Google today.
> your standards for acceptable software were frozen in time in 1985
Haha, in 1985 the Amiga had a multitasking GUI desktop in 512 KB at 7 MHz. Now we have millions of times more computational power and struggle to deliver an experience as crisp and satisfying.
I wish people still made software like it was 1985, that actually used the full power of the hardware for something useful...
Yeah but Exec/Kickstart/Workbench was missing some basic niceties that are table stakes today:
- No process or kernel security (processes could rewrite kernel code)
- Processes could effectively disable the scheduler
- Supported only a single, fixed address space: both a massive limitation and a performance hack that made the system bearable to use
- Single-user
- No security model
There are embedded applications these days where not having these features is a deal-breaker. Let me assure you: if you re-implemented the original Amiga OS feature set, it too would be screaming fast. The tricky part is keeping it fast once you start adding protection and additional functionality on top of it.
And largely what happened when you tried to implement more complicated applications on top of these primitive systems is that they would crash the entire system, constantly.
That, and the fact that the Amiga was a clean-room design. The IBM PC was already old, and even x86-64 won over Itanium. Backwards compatibility has a cost but also gains. The Amiga wasn't even compatible with the C64.
Sure but "pc" at the time of amiga was already pc compatibles and at backward compatible with xt and IBM PC. Besides amiga was very optimized for 2D graphics. 3D story has been less rosey. But I would seriously welcome a new amiga. That wouldn't mean a specced up amigaos running pc. That would mean a radical new mobile architecture. Or Vr machine. Some fresh air On a radical, alien architecture. Belt CPU ? Gpu/CPU à la xeon phy ? Ram only? Dunno. Something crazy.
> more complicated applications on top of these primitive systems is that they would crash the entire system, constantly
That's chicken-and-egg. Why do modern apps with actually quite simple functionality need all this vast power, the GHz and GB? Because it's there. Why does software crash? Because the OS lets it, with nothing more than inconvenience to the user.
Amigas were actually quite usable; they were stable enough for complex software to be developed and real work done. Same for STs.
Your whole point is extremely backwards! Software in 1985 was more "complete" than today. Even CD-ROMs shipped with magazines had to support more Windows variations than "good software" today supports browsers! Not to mention that every corporation with a software team also had usability and QA teams. Software quality and resiliency were much better than today. Then in the 90s it took a dive, because going online made "first to market wins" the rule, and it has been downhill from there.
So your software standards from '85 would be overkill today, not lacking.
Agreed, and the games industry may be the clearest example of this. In 1985 the developers had to actually ship a finished game, there wasn't the opportunity to release a half-finished product and update it later. Compare to Fallout 76, which was very clearly unfinished at launch.
Games in 1985 shipped with bugs baked in that people were just stuck with. Civilization had a broken Gandhi, and still shipped 4 different bugfix versions.
No game (or any large software project) is ever bug-free, but the standards were higher than they are now. Fallout 76 was literally unplayable on launch: a fairly early mainline quest was broken, so it wasn't possible to reach the endgame. Was Civ unbeatable on release? Sure, Gandhi was bugged, but that was entertaining enough that they kept the bug in the sequels.
There were plenty of bad and buggy games in the 80's, it even crashed the market [1]. We just don't remember them.
Also, modern non-indie games are orders of magnitude larger and more intricate than 80's games, and have about the same distribution of quality/bugs. They're now made by teams of 10-1000 rather than 1-10, though.
Yes, but in your previous comment, you said it didn't exist. It may as well not have! Almost nobody used Windows 1.0...
Still, in 1985, even DOS apps had to target varied environments: mono, CGA, EGA, Tandy graphics, different memory configurations, 8088 or 286, printer drivers...
Preach, brother/sister! My career started in the early '80s and I say this all the time in comments here on HN. You had less to work with but less was expected of you.
I think your analysis is fair, but the Glassdoor data is a bit off. At my work, I filled out salary data on Glassdoor and it never showed up. Maybe they thought it too high?
End user tasks might be able to be automated into dumb programming jobs, but companies have tried for decades to down-source their programming and it always falls flat on its face.
Even if you have library support to hand hold your budget coders, even if you use a lot of them, even if you give them all the time in the world, they will produce more complicated, less coherent, less stable, buggier, and harder to modify, improve, or iterate results than better coders who understand the problem better.
That means that no matter how little you pay up front you end up paying more in the long run throwing more man hours and money at fixing the mess made. A good mess is easier to maintain and improve and costs less over time. A mediocre / bad mess takes substantial efforts to maintain and iterate on.
It's also probably an impossible problem in this domain to remove the ability for any coder to write bad code, if for no other reason than that in any programming environment you can never stop someone from iterating by index over a binary-searchable tree for a specific element, or from turning a float into a string to truncate the decimal part and then reinterpreting the result as an int. But if you don't give them the tools - the integer types, the data structures, access to the bytes in some form or another - you aren't really programming. Someone else did the programming and you are just composing the result. A lot of businesses, like I said, can be sated with that, but it's still not programming unless you are in a Turing-complete environment, and anyone in such an environment can footgun themselves.
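As a sketch of those two footguns in Python (with a sorted list standing in for the binary-searchable tree), note that both versions "work", which is exactly the problem:

    import bisect

    sorted_vals = list(range(1_000_000))

    # Footgun 1: a linear scan over data that is already sorted.
    # O(n) where O(log n) was sitting right there.
    def find_slow(target):
        for i, v in enumerate(sorted_vals):
            if v == target:
                return i
        return -1

    def find_fast(target):
        i = bisect.bisect_left(sorted_vals, target)
        return i if i < len(sorted_vals) and sorted_vals[i] == target else -1

    # Footgun 2: truncating a float via a string round-trip.
    def truncate_slow(x: float) -> int:
        return int(str(x).split(".")[0])  # breaks on '1e-07' and friends

    def truncate_fast(x: float) -> int:
        return int(x)  # int() already truncates toward zero

    assert find_slow(999_999) == find_fast(999_999)
    assert truncate_slow(3.7) == truncate_fast(3.7) == 3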
> and it always falls flat on its face.
How so? What about outsourcing in general? Where I work wanted to save money, and now we have 4x as many engineers in India. Most of them suck, and we have to shepherd and correct their mistakes, but business is viewing it as a success. They are creating features at a higher rate than we were before. Every other large company also has teams in 2nd world countries because they're cheap. At some point this labor will be both cheaper and qualitatively comparable to US talent.
I think that misses the point. SQL does not prevent a junior dev from selecting many more rows than he needs to and then proceed to iterate over them naively.
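Concretely, something like this, sketched with Python's built-in sqlite3 (the table is made up):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, total REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(i, "EU" if i % 2 else "US", i * 1.5) for i in range(10_000)])

    # The naive version: drag every row across the wire, filter in app code.
    def eu_revenue_naive():
        rows = conn.execute("SELECT * FROM orders").fetchall()
        return sum(total for _, region, total in rows if region == "EU")

    # Let the database do the filtering and the arithmetic.
    def eu_revenue_sane():
        (result,) = conn.execute(
            "SELECT SUM(total) FROM orders WHERE region = 'EU'").fetchone()
        return result

    assert eu_revenue_naive() == eu_revenue_sane()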
I know it's a troll comment, but it really illustrates my point. Rust feels great when you know all the paradigms and complexity involved in writing Rust code, because for a seasoned veteran it does everything sensibly and right. But for your average person, trying to get into Rust is a nightmare of fighting the borrow checker over all your misintended mistakes that would at best be silent undefined behavior in other languages. Top it all off with how expressive Rust is relative to most C-like languages (remember, we had an entire generation who thought function objects were alien because they were taught Java) and it's fantastic when you know what you are doing, and a totally lost cause when you don't.
Ironically, knowing that I can get a good salary with relatively shit skills makes me want to up my game. Because it seems like a situation that is inherently unstable.
Eventually it has to change (imo), either through companies becoming more scrupulous in their hiring, or through a massive flood of new devs.
If you take a look at most schools, CS is now one of the most popular majors. When I graduated ~3 years ago, the class was ~100 per grade. I went back to recruit and the class size was ~600 per grade. Talking to profs I know all over the country, this seems to be thematic. The supply curve is about to shift.
You will get a massive flood of wannabe new devs. But economic theory tells us that high salaries attract more talent, and even though companies will have more people to choose from, they will also become more selective, so the riff-raff still won't get a job. Only the good ones will get the plum roles; of course a few bad ones fall through the cracks, but overall devs are smart people.
I remember back in the early aughts after the .com implosion and the resultant uptake of outsourcing. There was a pervasive attitude that the high-paying software jobs were not going to come back in any meaningful way. Yeah, well...
There will be a day, but when is hard to say. Thinking it's right around the corner is akin to the belief we're on the cusp of true AI. We're more pessimistic today than we were in the mid 80s. And non-programmers were programming with hypercard, filemaker pro, VBA, etc, back in the 90s.
There are of course former well paying jobs such as old-school front end devs (html/css/sprinkle of js) that are largely commoditized, but that's a given considering the low barrier to entry.
Except the high salaries won't end, given the level of productivity found within the technology industry. Tech companies as they stand today are some of the most profitable companies out there, and it's this profit that enables them to compete for talent and pay people more.
The same high productivity per employee is found in virtually every other high-paying industry, most of which have not seen pay at the higher end fall over time.
> these days, you just teach yourself a bit of JS and go for a webdev job, making shit code and still earning more than most people in a given country.
With these jobs in particular, what I see is that the definition of seniority has shifted to 'knowing the latest tech'.
So a junior dev who's just got to grips with React has become a React Developer, and they are now relatively senior in that field. The experience isn't transferable to other parts of the software stack though, it's too heavily tied up in the browser. So they end up as a super-specialised frontend dev.
It'll pay pretty well until the tech becomes obsolete, unless that kind of person enjoys maintaining legacy code.
Universities are already producing those like there's no tomorrow, and we now have bootcamps that contribute some more. The only reason there isn't a glut of programmers visible is because the industry is growing even faster. But I can't imagine that growth lasting much longer.
Doctors are already seeing commoditization as healthcare systems consolidate and adopt standardized workflows (aided by EHR implementations). And nurse practitioners and physicians assistants are doing much of the work once done exclusively by doctors, but with less training and lower pay.
It wouldn't surprise me if index funds have caused a decline in earnings for investment bankers.
I think the comparison to biology/chemistry/physics is interesting. Perhaps even more than software, there's a huge spread between the value of low and high performers - the best scientists make new discoveries that can be worth billions.
On the other hand, if you think the software industry has a hard time figuring out (at hiring time) who the high performers are... science is driven by serendipity. Nobody can predict who will find the billion dollar discovery. Not even past performance is a reliable indicator.
So it makes sense to me that the salary spread in science is relatively even. If they could reliably figure out who to dump money on, they would. On the other hand, the FAANG companies clearly believe their hiring practices can select out the high performers... and perhaps they are right? If they're paying 3-4X what everyone else does, they expect to get at least 3-4X the value.
I've worked with good people at every company I've been at... but the nice thing about being at a top company is I never have to deal with totally incompetent or helpless people. Nothing frustrates me more than having my job responsibilities include training someone with no initiative.
The selection process seems to do a good job of keeping out the lowest tier at least, although we openly acknowledge that we miss a lot of good people as well.
> Nothing frustrates me more than having my job responsibilities include training someone with no initiative.
Years ago, I worked someplace where a colleague was tasked with working with another developer on project X. After about 15 minutes it was clear the other developer ... wasn't? A web project, and this person had been employed as a "web developer" for at least several months. Questions like "how does this information in this browser get back to the server?" came up.
Colleague goes to manager and says "I can hit the project deadline, or I can make sure other_dev learns the basics enough to be able to contribute and understand projectX, but I can't do both by the deadline. Can we move the deadline back a few weeks?"
No, and no. Train other_dev and hit deadline.
Deadline was hit, other_dev moved to another project afterwards, and was pretty much as ineffective as before, but colleague was then saddled with this reputation of being a 'bad mentor' because the next team learned other_dev didn't know how things worked. Why the hiring manager wasn't tarnished... who knows?
He was given 2 tasks and he only delivered one result.
I know this sounds... not ideal... but it is what it is.
His manager probably has to operate at the same level of expectation: given 3 tasks by his manager (or director), either you finish all 3 or you're less dependable.
Isn't the hiring manager tasked with "hire someone with basic competence"? And they failed? But their reputation/credentials don't get called in to question?
Science doesn’t reinvent itself every couple of years either: new discoveries build upon a foundation of old discoveries. Software is more like the fashion industry.
I think that by hiring everyone at 3-4x, they expect not to have to face another Google or Facebook as a competitor. Then employees just rotate among the big techs. The big techs then decide where they want to compete with each other, e.g. Netflix and Amazon Video/Prime... so if stocks decline, I'd expect a rise in bonus or base pay, or more stock.
One way that they may be getting 3-4x the value is in the long term. Although I've only worked as a programmer, I'd expect that the impact of having one negatively valued programmer is far larger than having one negatively valued chemist or physicist. Legacy code can often be years old and far outlive the jobs of the people who wrote it.
Oddly enough, I think learning to program is easy, but only for a few people. And those are the people who are motivated to learn it as an end unto itself.
I was motivated because my older brother, and my mom, had already learned how to program, and they were quite excited about it. After getting past a few familiar conceptual hurdles, it became very easy for me to learn programming myself.
People who are only motivated by the money, or under pressure from others, have a harder time, because their curiosity and drive aren't activated. There's some sort of valve that lets the knowledge into your brain, that has to be opened.
For the most part, the people I know who seem to be motivated by money itself are not so desirous of getting rich per se (many are already rich), but are actually interested and curious about money in the way that I was curious about programming.
I don't program for a living today, but my ability to program is definitely a force multiplier for my work. It has either improved my earnings, or improved the continuity and longevity of my career.
"""I don't program for a living today, but my ability to program is definitely a force multiplier for my work. It has either improved my earnings, or improved the continuity and longevity of my career."""
May I ask what domain you are working in? Can you give some examples of how you've slipped some programming knowledge into other job tasks? I love to hear people's anecdotal problem/solution approaches. Was the programming side of it actually slipping in some VBA/Chrome extension/JavaScript, or was it more of just an 'analytical' approach taken to a business decision?
My background is in math and physics. While studying those subjects in college, I learned programming on my own. Today, I develop technology for fancy measurement and control equipment. When I say I don't program for a living, I mean that it's not my job title, and my managers may actually be unaware of the role of programming in my work.
I use programming extensively as a problem solving tool, for things like data analysis, modeling, automation of experiments, and prototyping. Almost all modern equipment is electronic and computerized. To be capable of rolling out an MVP on my own, I program.
You will rarely see my computer without a Jupyter notebook on the desktop. ;-)
In addition to working in a computerized field, program code is just a super powerful way to express ideas. And the disciplines of good programming practices (yes, learn them) provide ways to organize the innards of complex things, so they actually have a fighting chance of working and being right. Plus, it's fun.
People who work as full time programmers may make more money than me, but I'm not sure that I can do their jobs. When thinking of any profession, a person should not only look at the cool, fun stuff, or the money, but what the actual daily grind looks like, because that's what you have to survive.
> I think programmers tend to underestimate the difficulties involved with becoming a good programmer because once you're good, you only see the even steeper learning curve ahead of you.
But that is no definition of a bubble. A bubble, at a very basic level, means there is a lot of capital flowing within; it has little to do with how difficult your job is.
This is... debatable.
I was around in 1985 and if I had proposed using "an eventually consistent schemaless DB" to tackle a bank or a manufacturing project I would have been laughed out of the room.
And not because it sounded too good to be true, mind you...
> I was around in 1985 and if I had proposed using "an eventually consistent schemaless DB" to tackle a bank or a manufacturing project I would have been laughed out of the room.
Medical record systems were running on MUMPS (schemaless) and being eventually consistent (records were keyed in from paper forms) long before 1985.
Does anyone do that, at least on the software side? Obviously there are consistency issues with e.g. non-instantaneous bank teller actions, but those are human inconsistencies, not software inconsistencies.
What I mean is that some of the powerful components we can leverage today solve problems that did not exist in 1985, so having these available then would not really help.
I mean: what would you use Node.js for - in 1985 - even assuming you had access to a system to develop and test stuff made with it?
> I mean: what would you use Node.js for - in 1985
In 1985, your hypothetical bank would be running VMS, which had asynchronous IO system calls as the default. There was no need to "invent" something like Node.js.
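For anyone who hasn't used either, the underlying pattern is the same whether it's a VMS async system service or a Node.js callback: start the IO, do other work while it's pending. A minimal sketch in Python's asyncio, with sleeps standing in for real IO:

    import asyncio

    async def fetch(name: str, seconds: float) -> str:
        # asyncio.sleep stands in for a real non-blocking IO call
        # (disk read, network request, etc.).
        await asyncio.sleep(seconds)
        return f"{name} done"

    async def main():
        # Start both "IO operations" and yield control while they're pending;
        # total wall time is ~0.2s, not 0.3s.
        results = await asyncio.gather(fetch("ledger", 0.2), fetch("audit", 0.1))
        print(results)

    asyncio.run(main())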
I agree, but the real bubble that has popped is that non-techies have become more selective in judging the skills of programmers. Gone are the days where being able to stand up an FTP server and display a mysql query result in a web page means you can write your own ticket into any tech job in the world.
Most programmers are not that well compensated. I never made more than 120k a year tops, and I'm almost 50 years old. Not everyone is young and pretty and attractive to the Googles. The number of employers who compensate like this is tiny; you can count them on one or two hands. 98% of programmers never make more than 100k. With nearly 30 years of experience I build things that would bill out at $160,000 a month for an outside team to produce, but I live in an 80-square-foot roach-infested apartment and can't afford a car.
If your work is that valuable then perhaps you’re selling yourself short? In my 40s now and have definitely encountered age discrimination but found remote work once I updated my skills. I hope you get out of that dump!
Absolutely right, and something we see in data science too. Coding is a domain where we work cross-architecture: I can't be just a useful Python programmer; I can only be a useful Python programmer in marketing automation or some business associated with it.
The fat pay package keeps programmers from realizing this.
As a data scientist who considers himself moderately good with the pydata library set, I realized the other day that if you took those 5 libraries away from me, I'm not sure I'd have that much to offer in terms of my programming abilities. I don't necessarily feel bad about that, although it did give me pause.
I've known people who built lucrative careers on expertise in specialised technologies, and others whose entire careers disappeared because they were too specialised and the technology got deprecated. You might be fine, but I'd definitely recommend diversifying your skill set a bit. Being a generalist has served me well.
I would say every capable programmer should be a "generalist" in a sense and be able to transfer their skills with relatively little effort. If they can't, they are not really that good at the abstract concepts. Still this doesn't prevent one from focusing more and specializing more on one field/language that they love and be as familiar with it as possible.
Thanks for the info, but as I said above, anything that relies on installing Graphviz is extremely hokey and, as you can see in many posts on SO, often doesn't work.
I came in to the pydata library after extended time in Matlab/Octave. Learning there first was terrible for programming basics (e.g. polymorphism) but was excellent for ensuring I knew what a given algorithm does/should do.
I highly recommend spending some time in C/C++, Go, or Julia to pydata-first folks that ask.
As a data scientist myself, I learned to write code that utilized arrays and matrices from the most basic libraries, from cleaning to analysis to machine learning (numpy for ML, I suppose). Is this the most correct manner to write code? I don't know, but learning to do DS without specific Python libraries has improved my coding ability. Aside from reading in different data formats, I believe I could do a fair portion of my work in C.
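In that spirit, here's a small sketch of the kind of plain-Python primitives I mean, no pydata stack required:

    # Plain-Python stand-ins for a few numpy primitives, in the spirit of
    # learning what the library calls actually do.

    def mean(xs):
        return sum(xs) / len(xs)

    def dot(xs, ys):
        assert len(xs) == len(ys)
        return sum(x * y for x, y in zip(xs, ys))

    def matvec(m, v):
        # m is a list of rows; multiply an (n x k) matrix by a length-k vector.
        return [dot(row, v) for row in m]

    def standardize(xs):
        # Center and scale a column, the usual pre-ML cleaning step.
        mu = mean(xs)
        sd = (sum((x - mu) ** 2 for x in xs) / len(xs)) ** 0.5
        return [(x - mu) / sd for x in xs]

    print(matvec([[1, 2], [3, 4]], [10, 20]))   # [50, 110]
    print(standardize([2.0, 4.0, 6.0]))         # [-1.224..., 0.0, 1.224...]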
It is like exercise... you are shaping your brain to the problem. The US military actually tests for this in what's called the Army GT score of the ASVAB test.
When I took the ASVAB in high school I scored a 107. That score is too low to become a warrant officer so I had to retake it a couple of years ago for my officer packet and I scored a 129 out of a maximum 130. That puts me in the top 0.1% of testers. I am not smarter or more intelligent than when I was in high school. I do write software though. Every couple of years I look back on my software and algorithms realizing how I continue to improve and see the solutions more clearly.
The best teacher has been helping colleagues. I have been programming a lot better (fewer errors, sometimes even without running the application when I'm pretty sure), because I think and analyze more up front than I used to.
Some things come back, but it's rarely related to me (e.g. last-moment spec changes).
I do have to watch out: I notice that basing my code on someone else's is OK, but they always have faulty code in hard-to-test areas. So "make testing easier on things that are hard to test" is my next motto.
Also, helping others is a pretty huge timesink :(
PS: Being in the zone does wonders lately.
PS2: There was another thread about VideoLAN yesterday, and nobody had heard the entire story about HTTPS. I gave the VLC developers the benefit of the doubt, not knowing everything about their infrastructure. A lot of the comments here on HN disagreed with me (though some silently upvoted). Today I saw a blog post about why: it was infrastructure-based. I can't understand why I was practically the only one with a different view on the subject in this community, where developers come together.
FYI: the comments are in my history, mostly on the VideoLAN topic; it's all recent.
I would say that it's the same for the other trades as well, though. For example, people might imagine that learning a new human language is trivial thanks to the misleading advertising of Duolingo or Babbel, but to really learn a language the effort needed is tremendous, and it requires constant, repeated practice over a long span of time.
The question the author posed was why programmers are paid that much even when some other paths could seem "harder", which seems valid. Sure, not all careers are supposed to be "harder" than programming, but they're not as easy as one would imagine either.
Though yeah at least for now I don't see the situation abating much. The demand is still going strong. Once the proverbial "flood" of the market happens from new grads, things might get worse. But still if you know what you're doing, you know all the right concepts and skills, you should be able to stay on top of the game. There has always been a saying that the irony of the CS degree is that many people who graduated with the degree can't program, while many who can program didn't need to do a degree at all. I doubt the influx of students trying to study CS would change this situation much. Coding bootcamps have been around for a decade yet they don't seem to change the market equilibrium that much.
> they program to accomplish other tasks in their domain of expertise
My kids have mentioned that they might be interested in a degree in computer science, and I've encouraged them to combine that with a second area of specialization. Programmers are everywhere, but a programmer who also knows chemistry or biology or economics or art history or just about anything stands out.
> Programmers are everywhere, but a programmer who also knows chemistry or biology or economics or art history or just about anything stands out.
Really? I have a physics degree with some experience in rocket science, but my most valuable skillset (measured by how much pay I can fetch for it) is plain old software engineering. I don't think I'd be able to leverage my area of specialization to exceed or even match what I can get from FB/LI/G as a generic software engineer.
You don't think your broader education and experience makes you a better software developer (that encompasses programming, writing, and working with other people)?
That's a good point. Not going through the standard CS track and straight into software engineering has probably given me broader experience and skills (both soft and hard) that I wouldn't have developed as well otherwise.
Maybe. I studied CS and bioinformatics. You end up competing with both pure CS folks and also the bio folks that are bioinformaticians. Still I generally agree that some domain expertise is helpful.
That is hard because you have to convince many other people to agree. It's easier for me to just undercut other people by taking less pay. I can, for example, go remote and live in a low-cost-of-living area.
On the side of the employees, unions have proven themselves to be good means to improve the situation for employees. Here in Germany they definitely have helped in many industries.
My comment was more about the companies though which may form cartels to drive down employee wages. Companies forming cartels is illegal, while unions are legal in many places.
Yeah, but even at that point FAANG compensation (or, well, Apple, MS, and Google at that point; Netflix wasn't a factor yet, and Facebook broke the cartel) was significantly more than what people in Europe were making.
And with anti-collusion labor protections, engineers stood to make even more.
In the US, union workers make between 10% and 30% more than their non-union peers [1].
You're comparing pay across two different economies and only looking at unionization as a variable. It's like wondering why engineer rates in Omaha, Nebraska aren't on par with those in New York, and concluding that it has something to do with differing fire codes.
> And with anti-collusion labor protections, engineers stood to make even more.
Correct, but these protections exist (and existed at the time) independent from a union.
> In the US, union workers make between 10% to 30% more than their non-union peers
There are very few high-skill jobs which are commonly unionized. In a market where supply is greater than demand, unions have absolutely been shown to improve worker outcomes [1]. I'm not aware of any evidence for markets where demand outstrips supply (like that for skilled software engineers). It's not immediately clear that union protections would be beneficial there.
> You're comparing pay across two different economies and only looking at unionization as a variable.
No, I'm simply pointing out that your flippant response to esoterica doesn't actually address the question. If unions are better for workers, why is it that a non-union area *with a cartel depressing wages* was still substantially better for workers than a unionized area with no such issue?
Saying "oh the market is different" ignores the question of why the market is different.
[1]: Indeed, that's kind of exactly what happened with this cartel. Facebook wanted to hire skilled engineers, and was willing to pay more, so broke the cartel. That kind of thing won't happen when workers are generally equivalent, but SWEs aren't.
> There's very few high-skill jobs which are commonly unionized.
Sure there are. Doctors and actors, to name just a couple. In both cases the "union" actively works to create barriers to entry.
The AMA colludes with medical schools to set artificially-low student body quotas. If you've ever wondered why teaching "XYZ for pre-meds" is such a miserable experience, this is why. You have to earn straight A's to get into med school because there are so many more qualified candidates than openings (but it's not clear to me how, say, art history or algebra-based physics makes you a better doctor).
SAG (the Screen Actors Guild) requires actors to have already performed in a SAG production as a condition of membership. And they also strictly limit the number of non-SAG performers on SAG productions. That chicken-and-egg problem was very intentional.
If you've ever taken a macroeconomics course, you know what effect these actions have on prices.
> I'm not aware of any evidence for markets where demand outstrips supply (like that for skilled software engineers). It's not immediately clear that union protections would be beneficial.
See above. Unions can create a market where demand outstrips supply.
> If unions are better for workers, why is it that a non-union area !!with a cartel depressing wages!! was still substantially better for workers than a unioned area with no such issue? Saying "oh the market is different" ignores the question of why the market is different.
So tell me why professional associations exist, then. Why do doctors form a union to increase wages, if as you say, they would be better off without it?
> Sure there are. Doctors and actors, to name just a couple. In both cases the "union" actively works to create barriers to entry.
Neither the SAG nor the AMA is a union in the traditional sense. In many ways, the AMA actively works against worker quality of life (consider the horrible conditions for med students and residents, and the high suicide rate among MDs) to artificially reduce supply.
> Why do doctors form a union to increase wages, if as you say, they would be better off without it?
The AMA is mainly a lobbying organization, not a union. Since a significant percentage of doctors are in private practices or small practices, they don't have representation with the government. So sure, the AMA does collectively bargain with the US Government. But by that same token, since 53% of MDs are self employed, the AMA can't do "normal" union things like set wages, because there's no one to bargain with except the doctors themselves.
And interestingly, the AMA actually admitted that its intentional supply-reduction is hurting the medical industry as a whole. To answer your question, "because they thought it would be better". But in hindsight, they probably weren't.
I've said that they are good means, not the best means. And I guess the reason why they are paid so little is the higher profit margins of FAANG companies as well as probably the alternative in SV that you can found a startup and make much much more if you're good (and lucky).
Unions are basically legalized price fixing. What happens is that the union negotiates a "fair" price, and then all companies decide to pay no more than said "fair" price. See for example (original is in Swedish):
The problem is that the numbers that get published by unions in Sweden are taken as law by employers. You don't really know what unions are like if you haven't heard your employer say "We can't give you a bigger raise due to our collective agreement". And since basically all other employers follow the same guidelines, you can't get competing offers for significantly more. There is a reason salaries are very flat in Sweden.
Another way to see it: collective bargaining goes both ways, i.e. both workers and employers come to a joint agreement. So if we created a FAANG engineers' union with a joint pay scale, that would basically be equivalent to the no-poaching agreement often derided in discussions like this.
Not all union models have sector bargaining, and it certainly doesn't work for professional unions. And I am saying that European unions don't really get the needs of M&P members and need to change.
As I said, good programmers are underpaid. They should figure out how much they are making for their companies and ask for more. The market can often afford to pay more, if you just negotiate better. You can also unionize to get paid closer to what you are worth to your employers rather than what they can get you for.
In every other aspect of computers, the industry has finally embraced usability as a desirable goal, and not just for end-users.
On my first computer, you had to read a 100-page user manual and learn exactly what commands to type. In my first programming language, you had to manually allocate (and worse, deallocate) memory. With my first database, we used to have to go type VACUUM regularly. None of these is true today.
Yet even though some of the highest paid people in the world are members of unions and have agents to do their negotiating, programmers seem to have latched onto this idea that if you're not making top dollar or have your ideal working conditions, you should "just negotiate better".
Why stop there? Tell programmers they should "just program better", too.
> You can also unionize
Have you ever organized? I don't think you realize how difficult this is, especially without strong support from an existing union. There's a reason unions heap rewards on people who do it.
Existing unions also have great labor lawyers. A common response to even thinking about unionization is getting fired. (That was in the news recently because it happened 4 weeks ago here in Seattle.) Labor laws aren't what they once were, and there's usually no consequence to the company for firing organizers.
> On my first computer, you had to read a 100-page user manual and learn exactly what commands to type.
Flipside: I can still write software for my first computer without looking anything up, over 30 years after reading those 100 pages. I still know the memory layout, opcodes, assembly, etc. by heart, and it is still the best way to program that particular computer (which still works in my man cave) today. Yes, today it is all simpler, but I find the 100-page example a plus, not a negative. Maybe you were referring to something else, but my 100+ page manual was usage and programming at the same time (beyond the basics, using was programming), as that was the only way to use the system.
People who make enough to have agents negotiating their salary (famous actors, professional athletes, other celebrity types) are usually looking at an order of magnitude higher compensation than even the best software developers get. At the lower end of the spectrum (lesser-known actors, musicians, etc.) agents are known for enriching themselves as much as helping their clients. They are just sort of accepted parasites on the way compensation is handled.
Depends on what the outcome is. If it makes the site 50% more performant on 25% less hardware, pretty easy to swag it. Same if the outcome makes developers on the team able to ship new functionality 20% faster with 33% fewer bugs.
Issue 1: It's very difficult to tell if your contribution produced the 50% improvement in performance, because there were 10 other devs pushing in features and bug fixes. This is the attribution problem.
Issue 2: This happens over time. It's very unlikely that your 50% improvement happens every year or month, because this is compounding at large rates, and think for yourself: it grows quickly. A 1.5x improvement over 6 cycles (months or years) is more than 11x; see the quick check below. This is essentially the time problem.
Issue 3: even if you deliver the results you did, in a large company there's a large bureaucracy and no one person has the ability to increase your salary by that much. This is the control problem.
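A quick check on the compounding arithmetic in issue 2:

    # How a repeated 1.5x improvement compounds across cycles.
    rate = 1.5
    for cycles in range(1, 7):
        print(f"{cycles} cycles: {rate ** cycles:.1f}x")
    # 6 cycles -> 11.4x, which is why nobody sustains 50% per cycle for long.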
The problem with this argument is that programmers don’t work alone in a vacuum. How do you account for the support staff? The recruiter that hired you? The cleaning lady? The DevOps people? And so on.
It's actually fairly non-trivial to be able to say with even a modicum of certainty how much value a given developer brings to their company.
This is precisely my point! Thank you for getting it and explaining it.
I currently write software used by millions of people. Partly because I’m a backend engineer, I have no real idea how much more the company is making due to my direct efforts. Since they keep paying me, I’m assuming it’s a decent multiple of my carrying cost, but I have no way to measure it.
It's how markets work when wealth is distributed incredibly unevenly, and a weak social safety net makes it intimidating to work for yourself instead.
In other words, markets work that way because that's how the bosses and capitalists want it to work, and have so far been successful at thwarting attempts to use the government to change things.
The simplest and most accurate statement is that it’s an emergent result of the principles of capitalism combined with human nature and not a plot to keep us down.
I'm all for programmers getting paid more. However, by your logic, if the company is losing money, should programmers contribute from their own pockets to keep the company afloat?
Starting your own company (not self employed contractor) gives a really good perspective on what it means to be owner and employee.
How is this his logic at all? The logic is more similar to a sales person. When a company starts losing money, it doesn't try to claw back commissions from its top sales agents to keep afloat. It might lower %s / do layoffs / something else, but money paid is money gone.
The logic is that programmers tend to produce far more value than they capture -- so that gets captured elsewhere, a lot of it typically by management. Except the value can be hard to quantify when the company is old and so is the software, how much of the value is employee #3701 making fixing a bug that's making the product not work for one customer in one instance vs employees #107, #85, and #150 who in their past team's life created the original version of that system to begin with to make the new customer even consider using it? There's no point to moaning about how much you "should" get paid. Just ask for more if you feel underpaid, but be aware that because of competition and because people usually want to hear what more you'll do to justify it, you're not going to always get it.
> Privatizing profits and socializing losses refers to the practice of treating firms' earnings as the rightful property of their shareholders, while treating losses as a responsibility that society as a whole must shoulder, for example through taxpayer-funded subsidies or bailouts.
And in tough times are you taking a drastic pay cut, or jumping ship? Sounds like you want a huge share of the good and none of the bad. Also not sure why you value programming so far above all of the other activities it takes to make a successful product.
Define "making". Does the program/service sells itself? If so, then your argument may stand. Nine out of 10 times though it doesn't. There are other people involved in making the sales and they also require a piece of the pie.
The usual ratio for top flight talent in the finance industry is 10:1. A trader makes JPMorganStanleyofAmerica $5m, they get $500k compensation. That seems reasonable because there’s a direct line between talent and profit. Even so someone had to build the business, capitalise it, create the business opportunities and relationships, train and support the trader and assume all of the risk.
How much of the business risk of the enterprise is your top-flight programmer assuming? Are her decisions the only ones that make any difference to the increase in profitability that results from her work? How direct is the line between a back-room engineer, no matter how good, and profit?
The only case where 50% or near it makes sense is for a founder owner who is also the lead talent. Maybe. Because then they are also creating the business opportunity and assuming a big chunk of the risk.
> Can this last forever? Of course not, nothing lasts forever. But wondering why the wealthiest corporations in the world pay their workers high salaries is perhaps like wondering why water is wet. Software has a low marginal cost, and the rest is basic incentives for the corporations.
This has been contemplated for many years, and the price has gone almost nowhere but up. In the early-to-mid 2000s, it was thought that most work would go to much cheaper India and put highly compensated onshore workers out of jobs. It was tried to some extent, and it failed. Then it was thought that lots of people would fill the demand by going to university for CS and other related fields and saturate the market. That didn't happen either. It has been known for a long time that CS is one of the most valuable majors as far as salary goes, and still the percentage of bachelor's degrees awarded has stayed between 2% and 4%. Not saying that this is the only way to get into the field, but it is the most traditional way and indicative of the supply coming in.
Consider that some of the most valuable companies in the world did not exist 25 years ago, and now they do, making real, non-bubble money, with software developers as their biggest assets. This isn't the dot-com boom. The attrition rates in CS programs are very high, and even then, not everyone who receives a formal education ends up being a decent developer. Factory workers made good wages in the mid-20th century because companies made lots of money in an industrial boom. We now have a technological boom, and unlike factory work, the barrier to entry is much higher. So I don't see why it can't continue. Sure, the $300-400k salaries are high, but they are at top firms that are selective and competing for a finite pool of talented workers in very expensive areas.
CS has been through a couple of salary busts in the US previously though, no? One in the '80s when all the then-new CS programs started to release graduates into the market, then another after the dot-com bubble; or have I been misinformed?
The dot com bubble caused a salary bust because so many companies that were employing lots of people went out of business. This put a glut of programmers on the market with nowhere to work, which predictably pushed salary levels down.
This is nothing like the current environment, where there is a consistent demand for programming talent which outstrips the supply, and the industry as a whole is far more stable.
Of course, a global economic slowdown could put a damper on salary growth, but it will not be a salary "bust" like what happened after the dot com bubble.
I'm certainly not claiming that the future will necessarily resemble the past. But GP was making an argument from the past, so in that context we should certainly at least try to get the past right as much as possible. (And there was (I am told http://philip.greenspun.com/research/tr1408/lessons-learned.... ) the '80s, where the salary slump was driven by increased supply rather than a collapse in demand.)
If the current tech bubble bursts, it's going to absolutely wreck Amazon, FB, and Google: the three companies that actually gain most of the benefit from all these tech companies blowing their funding on ads and AWS spend.
Those are the companies driving up salaries. Once that rollercoaster ride stops, I imagine there are going to be mass layoffs and salary readjustments for years afterwards.
What is the current tech bubble, exactly? Those companies you mentioned have absolutely massive revenues and profits, and plenty of non-tech people and companies are responsible for those revenues and profits. This was not the case in the earlier bubble (lots of overvalued companies with little or no revenues).
What seems more likely is that the little startups will get crushed as VC funding dries up.
I think that's true, but I also suspect that one of the main drivers for the incredible acceleration of big company salaries has been to keep talent at that company rather than seeing the best engineers leave to go take VC money and play the startup lottery. If VC funding really does dry up, the pressure to keep salaries this high may be considerably lower.
I graduated from a no-name college shortly after the dotcom bubble burst and it was still easy for me to find a job paying $60k/year, which was plenty to live off as a single person in the SF bay area at the time.
Okay, but that doesn't seem to contradict the conventional wisdom that the hiring maket was significantly better there (from the developer's POV) a year or two earlier.
How many software engineers are truly making over $300k as rank-and-file? Let's be generous and say there are about a dozen companies that routinely pay engineers that well. Each will have, on average, maybe 10k engineers? So that's around 120,000 out of an estimated ~20 million developers (from a quick Google search). That's around 0.6% earning pay in a "bubble" situation, the rest being senior folks or executives who would command high pay anywhere else. So if you're a trained software engineer, you have less than a 1% chance of being in the situation described.
A quick Google search finds that law school students have around a 14% chance of making BigLaw (the legal equivalent). The odds of getting into medical school are between 2% and 5% on average. So no, I don't think we're in a bubble; the majority in the situation described would simply exist as the elite compensation class elsewhere as well, maybe with even better odds.
Let's put it another way: we are looking at the top 1% of software engineers. Is it surprising that they have incomes in the 1% range?
For comparison, at Amazon, Senior and above engineers account for ~20% of the total, and those are the ones pulling $300k+ regularly. So only the top 20% of one of the top companies are getting such compensation.
And to follow the article, this won't last forever. Whenever the next stock market crash comes, almost half of that compensation (the equity-based portion) will all but vanish. But maybe in the next bull market we will see a similar situation (remember people in the 90s making $250k?).
> we are looking at the top 1% of software engineers
Working for a big company that pays well does not mean you're at the top 1% of software engineers. It means you're willing to do what it takes to secure that job and maintain it, including moving somewhere many don't want to live.
People working on new systems and tech are generally better programmers than people hacking away at enterprise spaghetti at a big corp. Big-corp jobs like that pay better for less work, though.
"willing to do what it takes to secure that job and maintain it"
Some (most?) of that is innate ability and intelligence. Sure, there are well-paying jobs that are unpleasant. But for many others the company literally gets to choose 1 out of 100 candidates.
No one has "innate ability" for programming. It's a human construct you have to learn.
Getting a job at these companies largely requires studying up on CS101 trivia and CS basics to pass stupid tests. It's practicing how to pass their interview process. It's not "innate ability and intelligence."
You should probably narrow that since you're almost exclusively talking about the US market. There are a few pockets outside the US where you can earn that, it's just quite rare comparatively.
The US has anywhere from one to four million software developers, depending on your source. The BLS lists 1.2 million US software developers, with a $103,560 median pay (excludes benefits) in 2017.
You have closer to a 5% to 10% chance of earning $300,000 in total compensation as a software developer in the US, at some point in your career. Frequently high incomes don't last, there's a relatively high turnover because peak earning power only lasts so long, layoffs happen, specialization changes, job changes, et al.
The giant caveat to this, as everyone here knows, is you have a <1% chance of earning that outside of a small group of markets (ie it's very much not evenly distributed; if you're in New Orleans or El Paso you have almost a zero shot at it; if you're in SF or NY you have a legitimate shot at it).
I live in New Orleans (grad school). I have plenty of friends from college who work at FAANGs. The local market here consists largely of DXC, CGI and IBM paying “data scientists” $90k and junior devs $40k and laughing all the way to the bank.
FAANG companies have engineers staying just 2 years because they made enough money. Your statistics don't account for time.
Sure, it's not "easy" to get into those companies, but it isn't an outlier to get into them either.
The simple reality is this:
If you are an engineer at a publicly traded tech company, it is customary to get RSUs and Refresher RSUs. These have compounding effects as their vesting schedules start occurring in parallel. By the end of your second year you will have two series of shares unlocking, and this is in conjunction with your salary increases and bonuses.
You should expect and negotiate your RSU grants to be proportional to your salary. Competing offers from other publicly traded tech companies ensures this.
If the share price has also increased, which is practically the only thing it has done over the last decade, this is enough for a lot of people to quit.
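To make the stacking concrete, here's a minimal sketch (Python; the grant sizes and the simple even four-year vest are invented for illustration, not any particular company's schedule):

    # Hypothetical numbers: one initial grant, then a refresher every year.
    # Each grant vests evenly over 4 years (real schedules often add cliffs
    # and back-loading, so treat this purely as an illustration).
    def vested_per_year(grants, horizon=6):
        """grants: list of (start_year, total_grant_value) tuples."""
        vesting = [0.0] * horizon
        for start, total in grants:
            for year in range(start, min(start + 4, horizon)):
                vesting[year] += total / 4
        return vesting

    grants = [(0, 200_000)] + [(year, 80_000) for year in range(1, 5)]
    for year, amount in enumerate(vested_per_year(grants)):
        print(f"year {year}: ${amount:,.0f} of stock vesting")
    # year 0: $50,000 ... year 3: $110,000 -- overlapping grants stack up.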
The article did not talk about share price increasing.
Yes, a lot of people are content with their earnings from 2 years at a FAANG. They aren't thinking it's enough to live off forever, but it's enough for:
A house somewhere else and never paying rent again
or
A down payment on a home in a high-priced area where demand doesn't seem to stop, which is a low-interest leveraged investment with cashflow opportunities
or
"Taking a break" which means years of luxury at every music festival and socialite event while optionally networking and pursuing other fulfilling money-making activities, with almost no opportunity cost of having 'gaps' on your resume
or
Riding the "ex-Googler" line on your resume to secure advisory roles, raise capital more easily, or get C-level titles at someone else's startup
or
Joining another FAANG company at a higher premium
and/or
Leaving the company so you can write covered calls on all those shares you earned, for passive income
There is a bubble in salaries in the Bay Area, but only because of the real estate bubble that has been artificially created there. It will eventually burst as more companies seek to hire in cheaper areas.
Top companies pay similar salaries in Seattle and New York. New York real estate isn't as bubble-like as SV, and Seattle's is possibly bubble-like but still about half the price of SV.
There is nothing artificial about the real estate "bubble" in the Bay Area. Prices are set by supply and demand, both in salaries and real estate.
Real estate prices are high, because the Bay Area is a very desirable place to live, there are people who can and will pay the high price, and the supply of housing is low.
Salaries are high because big companies are fighting for talent (demand), and the supply of talent is low.
The thing that makes it a bubble is that the supply is artificially constrained by local zoning laws. Eventually people will be fed up with the situation and they'll vote for a government that'll allow more development, at which point housing prices will plummet.
This is one of those bits of common wisdom that falls into the "it's more complicated than that" territory. Zoning laws are a contributing factor, but they are far from the only factor -- and it's a stretch to say that they're the deciding factor. Zoning is an issue. Rent control is an issue. Basic supply and demand is a huge issue that arguably outstrips regulatory roadblocks. Regional geography is also much more of an issue than people sometimes think.
Housing prices here are likely to stabilize, but they're not going to start seriously declining unless the job market starts declining, too.
All those other reasons you mentioned are contingent on zoning. Allow high-rise apartments anywhere in the Bay Area and just watch how fast the buildings go up and the rents come down.
> Eventually people will be fed up with the situation and they'll vote for a government that'll allow more development, at which point housing prices will plummet.
Clearly you don't live in CA. Homeowners vote more than renters, and they will never intentionally vote for policies that will put them underwater on their own mortgages.
I think Simon Peyton Jones has the right idea (paraphrasing): programming is one of the most difficult, complex engineering efforts undertaken by humans. At the scale we do it one can't help but be amazed that it works at all, let alone that it works so well.
I think the reason these compensations get that high is that it takes 5-10 years to become good enough to lead a team that manages something so complex, and to do it well enough that the results are reliable and consistent. I think the difference from doctors and lawyers is that we're not licensed to practice; we're not a capital-P profession. We still have to attend conferences and stay relevant, but the expense and requirements for doing so fall on us or the companies we work for: there's no professional obligation to do so.
I don't think we're in a programming bubble if the author means we're in a compensation bubble and that programming is over-valued.
I think the real bubble is complexity. We're seeing a deluge of security breaches, the cost of software running robots in the public sphere on unregulated and very lean practices, and a lot of what we do is harming the public. By harm I don't necessarily mean only harm to human life, but harm to property, insurance, people's identities, politics, etc., and we're not accountable yet.
If anything I think we need to up our game as an industry and reach out for new tools and training that will tame some of the complexity I'm talking about... and in order to do that I expect compensation to remain the same or continue to spread further out and become the norm. Being able to synthesize a security authorization protocol from a proof is no simple feat... but it will become quite useful I suspect.
As someone who hires programmers: no. It is difficult to hire good people. The only way I can successfully hire low-cost programmers is by building a system/machine for creating and releasing software that does not require a high level of intelligence/creativity. The only people who can make high margins in programming-heavy industries are those who can do this or who have some sort of defensible moat.
As a programmer myself, yes. There is no way I can continue to make this much. I'm a dumbass. (I say this having worked on, in the last year, a compiler, trading algorithms, and 3D object analysis)
I am still basically an entry-level developer even though I've been doing it for 30 years (20 for money).
The way I see this playing out is something like behavior-driven development (BDD): the business folks describe the functionality they desire, and programmers write up the backend logic. Then, as AI progresses toward AGI, a higher and higher percentage of that backend code will be generated by machine learning.
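For anyone unfamiliar with BDD, a minimal sketch of that division of labor (plain Python with an invented scenario and function names; real teams would typically use a BDD framework such as Cucumber or behave):

    # The business-facing spec, in given/when/then form (invented example):
    #   Given a customer with a $100 balance
    #   When they withdraw $30
    #   Then their balance is $70

    # The backend logic a programmer writes to satisfy it:
    def withdraw(balance: float, amount: float) -> float:
        if amount > balance:
            raise ValueError("insufficient funds")
        return balance - amount

    # A "step definition" binding the prose to the code:
    def test_withdrawal():
        balance = 100.0                  # Given
        balance = withdraw(balance, 30)  # When
        assert balance == 70.0           # Then

    test_withdrawal()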
So over the next 10 years, I expect to see more specialization, probably whole careers revolving around managing containers like Docker. There will be cookie cutter solutions for most algorithms. So the money will be in refactoring the inevitable deluge of bad code that keeps profitable businesses running.
But in 5 years we'll start to see automated solutions that record all of the inputs, outputs and logs of these containers and reverse engineer the internal logic into something that looks more like a spreadsheet or lisp. At that point people will be hand-tuning the various edge cases and failure modes.
In about 10 years, AI will be powerful enough to pattern match the millions of examples in open source and StackOverflow and extrapolate solutions for these edge cases. At that point, most programmers today will be out of a job unless they can rise to a higher level of abstraction and design the workflows better than the business folks.
Or, we can throw all of this at any of the myriad problems facing society and finally solve them for once. Which calls into question the need for money, or hierarchy, or even authority, which could very well trigger a dystopian backlash to suppress us all. But I digress.
How is this maintainable? What do you use to describe the inputs and the outputs (if it resembles a programming language, then we're basically back to programming, aren't we)? Is the AI supposed to design the interfaces as well as the plumbing?
Let's say, a bug appears.
If the internals are produced by machine learning, chances are it's basically un-freakin-fixable from the high mountains of the spreadsheet/lisp interface. So someone has to dive in, and do it by hand. I doubt the business folk will do it, they won't know where to look!
The result, seems to me, is a metric-ton of machine generated code that now someone has to rewrite. Better hire a team to do it...
I really love your storytelling even if I'm not sure I believe one whit! You should try your hand at writing books/blog posts/short stories if you haven't already.
Hah thanks, ya I don't even know what's real and what's not anymore. Someday we'll live in a society where grandma married a guy she met on the internet and the grandkids have to fill in their pre-learning questionnaire with all the stuff they've already learned on the internet so that the teacher can move on to the really important stuff that prepares them for getting their degree from Silicon Valley online university, where they'll major in pre-K robot childhood education. The year is 2029.
> But in 5 years we'll start to see automated solutions that record all of the inputs, outputs and logs of these containers and reverse engineer the internal logic into something that looks more like a spreadsheet or lisp.
(Unfortunately, Lisp neither makes you smarter, nor a better programmer, which seems to be a very profound, ego-wounding disappointment for a lot of people who try to dabble in Lisp programming).
Now programming-by-spreadsheets, on the other hand, is a real thing, that is almost as old as Lisp, and is called "decision tables." It was a fad that peaked in the mid-1970s. There were several software packages that would translate decision tables to COBOL code, and other packages that would interpret the tables directly. I think decision tables are still interesting for several reasons: they are a good way to do requirements analysis for complex rules; the problem of compiling a decision table to an optimum sequence of conditional statements is interesting to think about and has some interesting algorithmic solutions; and lookup table dispatching can be a good way to simplify and/or speed up certain kinds of code.
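As a toy illustration of that lookup-table dispatch style (Python, with invented rules):

    # A two-condition decision table mapped straight to data (invented rules):
    # (is_member, order_total > 100) -> discount rate.
    DISCOUNT_TABLE = {
        (True,  True):  0.15,
        (True,  False): 0.10,
        (False, True):  0.05,
        (False, False): 0.00,
    }

    def discount(is_member: bool, order_total: float) -> float:
        # One table lookup replaces a nest of conditionals; adding a rule
        # means adding a row rather than restructuring the code.
        return order_total * DISCOUNT_TABLE[(is_member, order_total > 100)]

    print(discount(True, 150.0))   # 22.5
    print(discount(False, 80.0))   # 0.0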
What is not interesting at all is the use case of decision tables for "business rules." A few of the 1970s software packages survive in one form or another, and I have not heard anything good about them. And the problem is very simple: the "business folks" generally do not know what they actually want. They have some vague ideas that turn out to be either inconsistent, or underspecified in terms of the "inputs" or the "outputs," or have "outputs" that on second thought they did not really want; and they (the "business folks") never think about the interactions that several "business processes" of the same "business rule" might have if they take place at the same time, much less the interactions of different "business rules," etc.
AI cannot solve the problem of people not knowing what they want or are talking about. Machine learning on wrong outcomes and faulty assumptions is only going to dumb systems and people down (IMO this is already obvious from the widespread use of recommendation systems).
Sort of. The only way I know where someone can hire lowly paid programmers today is by building a system that does not benefit from more highly skilled programmers. Since I do just this for a living (sorry), it's only a matter of time before I am no longer needed.
About a decade ago, I created a piece of software that made a department of 10 people redundant. The company actually tried to make use of them, but they were so happy with the improved productivity that they mostly kept the dead weight on.
Last year, I did this at a financial company and eliminated an entire team. They did not keep the dead weight.
I may not be so lucky to avoid a future mercenary like me. Hopefully that explains it! I rely 100% on my creativity and out-of-the-box problem solving ability to solve real problems (note: not imaginary interview riddles). So far, the living I've made doing this is good.
- Just because a value is high and may very well go down, doesn't mean it's a bubble. FAANG are making real money from those workers, not just inflating an asset and selling it to other investors.
- Just because it doesn't involve long hours, doesn't mean it's not hard. A lot of my college colleagues really struggled, and many more didn't even get in. Don't discount natural ability - in the land of the blind, the one-eyed man is king, even if seeing is effortless for him.
I dunno about a bubble, implying it will pop in some catastrophic way.
But will market forces correct the above average salary? I think so.
More young students than ever are learning to code, which is naturally going to increase the labor pool. The supply of software engineers is going to go up in the next 10-20 years (as will demand, though I still think supply will outpace it). This seems like it would mostly affect new hires, as some 15-year veteran is going to have valuable experience that (most) companies will always be willing to pay for.
It feels like finance to me. People who got there early made a killing. Then salaries, while still pretty high, fell considerably as everyone rushed there to get rich.
As a counterpoint to the author's comment on doctors (maybe not lawyers; there are still plenty of law students in the pipeline): it does appear that the number of people going after medical degrees is decreasing, so I would predict their salaries will jump considerably in the next 20 years.
And last, and a totally aside point, I have exactly one friend who skipped college and over the last 12 years worked his way up the electricians union and now runs his own small business doing residential electrical work. He's making more than most of our social circle. He doesn't know many people his age doing this type of work either, so as the old guard retires, he's going to charge whatever he wants.
The number of people who study CS or can learn on their own is surprisingly limited. You won't get a job just by knowing how to write some if statements.
By contrast, there are loads of history majors in finance. There are plenty of ways to act like you understand it. Plenty of bullshitters. Also, the role of luck makes some of them seem smarter than they actually are.
Engineers who can't code will be uncovered sooner or later. It's hard to know beforehand at an interview, but it's a lot easier to discover with that person present for a few weeks.
> More young students than ever are learning to code
More young students than ever are being taught the utmost basics. But is it true that more people than ever are pursuing it in the sense of seeking professional mastery over the craft? A proxy for this might be population-relative CS program enrollment and graduation rates. It would also be interesting to know how this is scaling compared to the overall volume of programming labor demanded, which is surely growing as well.
UW is #6 nationally for software engineering and #60 overall... rankings are imprecise, but that might have something to do with it. Top students from all over the country compete for a small number of spots at top 10 schools.
I remember when I was there, the CS classes were for NERDS, and now here we are, everyone wants in.
When was that? I went to school in 2002 and CS at the time was already well past the “just for nerds” phase - I would guess that happened some time in the late 90s.
Lawyers have been around for a while, there doesn't seem to be a shortage of them, and the average salary is still relatively high. However the distribution of starting salaries has trended towards being bimodal (https://www.nalp.org/salarydistrib).
I could imagine the same thing happening for programmers over time, if it hasn't already.
I believe some states will let you simply take the bar exam and start practicing.
Of course, one would probably get started by clerking or working into a paralegal position, both of which would require independent study and coursework.
Sometimes people just need a cheap lawyer to take a look at a simple will or divorce papers, etc.
No one will let you take the bar exam and start practicing. California will accept four years clerking under the direction of a lawyer and passing the Baby Bar for eligibility to sit the Bar Exam. There are other states with similar programmes but I think they all demand at least one year of law school.
> California State Bar Law Office Study Program
The California State Bar Law Office Study Program allows California residents to become California attorneys without graduating from college or law school, assuming they meet basic pre-legal educational requirements.[19] (If the candidate has no college degree, he or she may take and pass the College Level Examination Program (CLEP).) The Bar candidate must study under a judge or lawyer for four years and must also pass the Baby Bar within three administrations after first becoming eligible to take the examination. They are then eligible to take the California Bar Examination.
I believe Washington is the only state that allows this anymore. Even then, you still need to be an understudy before you are allowed to take the Bar in Washington.
I'd counsel caution in relying on those charts because, inter alia, they're self-reported salaries and the very nature of the report excludes unemployed bar members.
Lawyers are specialists in negotiation, mediation, organizational procedures, persuasion, analysis of edge-cases of language and rule sets, evaluating the motives and thoughts of other humans and predicting their actions in ambiguous contexts, and quickly consuming and producing textual information under high pressure.
That seems like a pretty impressive toolbox that has broad applications across great swaths of human endeavor.
As it stands, many of my lawyer friends have contributed far more to humanity's greater good than I think any of my fellow techies have.
Having a legal background opens up a lot of opportunities in government-related areas. These simply aren't as easily available to other professions. If you'd like to hold a position of power, doing law is probably the best bet.
Total med school enrollment is constrained by the number of residencies and supposedly the AMA. Meanwhile, the numbers of grads for other health professionals (Nurse Practitioners and Physician Assistants) that can fill some of the same roles as doctors has skyrocketed.
First, salaried rank and file software developers on average certainly don't make 300-400k per year. Those working at Silicon Valley FAANG companies still account for only a minute percentage of software developers in the US and certainly in the world.
How that is supposed to constitute a bubble is beyond me.
Then, payment for work isn't (or shouldn't be) about (perceived) equality, and it isn't about compensation for suffering or even mere inconvenience either. It's about the value created by the work, which is why the work being hideous or the hours being insanely long shouldn't be contributing factors that justify a high salary. Now, this of course is an idealistic notion. Often work is still valued in terms of time wasted instead of value created. Moreover, in the case of some not-so-well-paid but still important jobs, the salary attached to them is often only tenuously related to the value created.
All that said, while probably few professions will be able to match the leverage and hence the value created through software development in the near future, perhaps the more reasonable hours and better work environments in software development should serve as a model for other work environments and industries rather than as an indicator that something's amiss.
Pay is about the intersection of supply and demand, which is influenced by, but not purely a function of, the value created by the work. Insane work hours prop up wages by decreasing supply, since lots of people don't want to work 90 hours a week.
Insane work hours also reduce demand for workers, though. If it's the norm in an industry for employees to work 80 or 90 hours, an employer might hire fewer people than if folks only worked 40 hours.
I think there are a number of caveats to this article:
1. The amount that a company can sustainably pay you depends on how much value your efforts have in its industry. Software has significantly higher margins than medicine, law, or almost anything else. As long as those margins are sustained, it is possible to pay high salaries. (As Buffett says: I'd rather work for a mediocre company in a great industry than a great company in a mediocre industry.)
2. But companies don't want to pay high salaries, so they will look for substitutes. Substitutes could be technology (RDS instead of DBAs) or increased supply of quality programmers. So far, it seems like these big companies haven't been able to find good enough substitutes to force down wages.
3. If you are looking at FAANG salaries, you are looking at the top of the income spectrum. The top of the income spectrum for lawyers and doctors is quite high.
4. Market economies reward value (outcomes) not merit (hard work). Hard work is correlated with outcomes, but it is not always a perfect correlation. So, looking at programming and saying it is less 'hard' than law or medicine doesn't say much, the question is how much value the person can generate, and how much of it the company can capture.
#4 is a lesson I'm still struggling with in spite of nearly a decade in "leadership" roles. I've been discouraged from coding beyond tiny, situationally dependent snippets to teach a more junior programmer something. My value is no longer in producing things directly; my value is in being the guy on the conference call where the customer says, "If Consultant32454 says it will work, it will work." I understand this is valuable, but not having a "thing" I built with my own hands at the end of the day is difficult for me. I wonder if this is a result of a lower-class upbringing or something... plenty of people in the management/executive class seem able to accept credit/responsibility for things they managed into existence rather than produced into existence. It's hard to accept being a force multiplier rather than a force, if that makes sense.
Also, do not forget that only a small percentage of developers are making those numbers. And that is quite normal, since those companies are among the most valuable companies in the world!
Compare that percentage and you would see the same pattern in every other industry. If you do something uncommonly well, you make more than everyone else in the industry. The same applies to all the other skills mentioned in the article. If you pick the top earners in law, medicine, etc., you will easily find higher numbers.
Everything is code nowadays: infrastructure as code, circuits as code, etc. Maybe the code we write will change over time, but the practice of rigorously describing things in a formal language will never go away as long as people are doing intellectual work. What is intellectual work anyhow, and is it not isomorphic in some respect to writing code? Isn't it about construction? Maybe the functional community, where they talk about equivalences between proofs and programs and amazing things of that sort, has a better grasp on what the future of programming and its rightful place in human endeavors is.
>> Rank and file programmers at the top tier tech companies now make $300 to $400 thousand per year.
That is absurdly false.
To be generous, let's pretend he's only talking about FAANG programmers, rather than startups. According to Glassdoor, average compensation for "Software Engineer" positions:
Facebook: 121K Base + 15K Addl = 136K
Apple: 122K Base + 10K Addl = 132K
Amazon: 103K Base + 20K Addl = 123K
Netflix: 121K Base + 20K Addl = 141K
Google: 124K Base + 20K Addl = 144K
These numbers are slightly inflated (by $1k to $20k/yr) because I counted base and additional pay independently, while Glassdoor calculates an average total combined comp, which is always lower; I'm taking the larger numbers to give the author the benefit of the doubt.
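One plausible mechanism for that gap, sketched with made-up numbers (I don't know Glassdoor's actual methodology): if some respondents leave the "additional pay" field blank, the two per-field averages come from different pools, and their sum overstates the average of real totals.

    # Made-up salary reports: (base, additional); None = field left blank.
    reports = [(120_000, 20_000), (110_000, None),
               (130_000, 30_000), (100_000, None)]

    base_avg = sum(b for b, _ in reports) / len(reports)        # 115,000
    addl = [a for _, a in reports if a is not None]
    addl_avg = sum(addl) / len(addl)                            # 25,000
    print(base_avg + addl_avg)                                  # 140000.0

    # Average of actual totals, counting blanks as zero extra pay:
    print(sum(b + (a or 0) for b, a in reports) / len(reports)) # 127500.0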
Since Google is the highest, let's drill down on them a bit:
With 1yr of experience you're looking at 121K+16K (or 125K total claimed, not doing my independent math thing), with 15+ years experience you're looking at 140K+24K (or 161K total claimed).
Again according to Glassdoor, to break 300K at Google on the average you need to be a Senior Staff Software Engineer or higher, which is exceptionally rare -- bit o' Googling suggests they're around 1% of the workforce, give or take.
In other words, the author is literally calling "The 1%" the Rank and File of programmers. Which is an exaggeration, I think.
NB: What Glassdoor doesn't account for is past stock performance; since SV traditionally pays a large chunk of its compensation in restricted stock units, and since the stock in each of these companies has increased by 50% to 150% in the past 4 years, by the time those stocks vest your "additional" compensation may have grown quite a bit due to market performance. But that's not a compensation bubble, it's just the stock market.
Personal experience from GOOG 3 years ago: Staff SWE, 220K base, ~200K stock grant per year (if you ship something really impressive, numbers go way up from there). This is with "exceeds" perf reviews which is a baseline to stay in good graces. That is, in short, why startups have trouble hiring Staff SWEs from Google. From what I heard from my FB peers, comp is better there by about 15-20%, but work is more boring, and there's more politics. Why did I leave, you might ask? From a realization that you can't make all the money in the world, and intense desire to get off the treadmill. And make no mistake, Google puts you on a serious treadmill when you join.
Glassdoor is incredibly outdated and inaccurate. Levels.fyi has much more accurate data. It is still true that only experienced engineers at a handful of top companies and decacorns tend to earn these high levels of compensation, but Glassdoor makes it seem rarer than it is.
The Netflix data for example is just flat out wrong. It only hires experienced engineers, and pays all cash salaries in excess of 300k to all of them.
You want it to be one way... but it's the other way.
-Marlo, from The Wire
As other commenters have pointed out, the data you're citing is completely wrong and/or woefully out of date. There are always four parts of compensation that you need to look at:
-Base
-Target bonus (depending on your level could be 15%-20%+)
-Initial stock grant
-Refresher stock grant (offered each year to protect you from a cliff)
An incoming SWE with 1yr experience at Google would scoff at a total compensation of $125k (this would normally be the bare minimum base salary). Same goes for Facebook. And while Netflix isn't known to hire junior engineers, it is well known in the bay area that they usually pay top of market rates fully in cash; this routinely exceeds $300k and much more so for seasoned engineers.
You are absurdly ignorant. Glassdoor salaries are weighted down by old data which makes them useless. I started out of college 3 years ago at 120k total comp and just switched to a 210k total comp job with a measly 2.5 years of experience and that offer was not even from one of the big 5 tech companies.
Edit: I'll add that all the senior engineers (a lifetime position for most people) I have worked with are indeed making $300,000+ a year.
As others said, those numbers are outdated and low. As a 2018 grad from a non top 10 CS school who had a dozen or so offers (BigCo + startups), I’ve personally seen that the numbers reported on teamblind.com and levels.fyi are very, very accurate.
Crowd-sourcing personal data plus that from friends, the median total comp (base+stock+bonus+refreshers) for a CS undergrad degree was ~160k in high-cost-of-living areas. Competitive students (a few internships at FAANG or similar) get offers in the 180-200k range.
I think those Glassdoor numbers are way off based on all the anonymous surveys and salary sharing threads I've seen. For example, 5+ years of experience will get you 200k+ total comp per year at all of those companies, with many offers in the $300-400k range.
Or another way of looking at it: maybe programmers are one of the few professions that aren’t being extremely underpaid when taking into account: inflation, macro productivity and their scalable impact.
Inflation: a $180k starting salary is equivalent to a $100k salary in 1990 dollars. (The 1990s are probably the period when many of us 30-year-olds started to hear about salaries and learn that a "six figure" salary was impressive, with $100k being the absolute bottom of a "six figure" salary.)
Productivity: for every hour worked (on average) the American worker is about 50% more productive - when compared to 1990. That takes $180k to $270k. [1]
Scalable impact: increases in worker productivity are not distributed evenly. Let’s say a programmer is contributing double what the average worker is to the increase in productivity. This brings us above $300k and if we look at the real numbers maybe closer to $500k or more in value compared to that 1990 worker bringing in $100k of value.
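Spelling the arithmetic out (a sketch using the comment's own rough factors, which are estimates rather than vetted data):

    salary_1990 = 100_000
    inflation = 1.8          # ~$1.00 in 1990 ≈ $1.80 today (rough factor)
    productivity = 1.5       # ~50% more output per hour worked since 1990
    programmer_share = 2.0   # assume programmers capture 2x the average gain

    today_equivalent = salary_1990 * inflation                   # 180,000
    with_productivity = today_equivalent * productivity          # 270,000
    with_leverage = today_equivalent * (1 + (productivity - 1) * programmer_share)
    print(today_equivalent, with_productivity, with_leverage)    # ..., 360,000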
Thoughts? Poke holes in this, but it adds up in my mind. Now, this doesn't prove companies won't find a way to underpay programmers, but it at least makes me feel less surprised salaries are so high. And sad that workers in other industries are getting the short end of the stick compared to how workers were compensated in the previous century.
As long as the demand for skilled programmers stays as high as it does, I don't see this alleged "bubble" popping. And it does not seem like software will stop eating the world.
In fact, as someone (I think it was Marc Andreessen) said, we're just at the "end of the beginning". Software is in the final stages of eating the media industry and will now progress toward heartier fare. This will be accompanied by further growth in programmer demand, specifically for programmers with skillsets that are niche right now.
Even as CS enrollment reaches an all-time high, I do not believe the supply of developers with appropriate skills will come anywhere near the demand in the next few decades.
So, no I do not think this can be called a bubble - except in the specific case of FAANG workers. They might see their salaries go down in the next years - but programmers as a whole are unlikely to have a hard time finding jobs with decent pay in the foreseeable future.
>Usually one of the easiest ways to succeed is by doing things that other people are unwilling to do, meaning that if you choose a path with long hours, lots of stress, and lots of hardship, not many people will accompany you, and you will get compensated accordingly.
That's one way. The other way is to do something that other people are unable to do. Software engineering is highly dependent on raw talent / intelligence, and hard work is unable to bridge the talent gap in our industry the way it is in other industries. The high paying companies in our industry are just the ones smart enough to pay 2-3x compensation to get 10x programmers.
1) I think part of this can be explained by the disproportionate living expenses on the west coast, especially in The Valley. I wonder how those numbers on doctors and lawyers would play out if you constrained them to San Francisco?
2) I do think there's a partial bubble, though. What I tell people is, programmers today are like literate people in the middle ages. You can do hard things with reading and writing, but reading and writing aren't inherently hard. It's just that they weren't taught to the masses back then, so you could be hired as a scribe simply because you knew those basic skills. I think once public education catches up (or affordable education; the bootcamps you see everywhere now are starting to close the gap), the basic ability to code will quickly drop in value. There will still be high value placed on skilled programmers (the poets and technical writers of the programming world) - though their salaries will probably see a bit of a correction too - but I think the bottom will drop out beneath them, and you'll have to do more than just learn JavaScript to be valuable.
I understand where this guy is coming from, but his instinct to say that tougher career paths should elicit more compensation is misguided. Certainly there is some correlation, but he could have added social workers and teachers to his list, but nobody wonders why those two professions don’t make a lot of money.
You can sort of count on most organizations paying something resembling a cost of living salary (though even that is becoming less true), but after that everything, ever-y-thing, always, is market driven.
We get to have our cake and eat it too, not because it is harder to become a software engineer than a doctor, but because it also isn’t trivial to become one AND the demand for our talents keeps outstripping supply every year.
There are still lots of divisions at lots of Fortune 500s that are just now starting their journey in adopting software to solve their problems, and they all create market pressure somewhere in the supply curve.
Look at medicine. Doctors make less than they used to because cost controls (not very effective ones, but still) have been introduced into the system. So even though demand for doctors is high a doctor can only generate so much revenue for a provider.
The demand for a service only roughly correlates with a market's ability to pay for it. There is a ton of demand for services in our economy that goes unmet, because the market can't or won't bear the cost.
Edit:
Also, the returns software can generate for an organization are simply astronomical in a way that even other engineering fields can't rival. If an organization is embarking on a software project that they know will either make or save them tens of millions of dollars, they aren't really going to bat an eyelash at $300k a year.
Well. I live in Sweden and I don't make anywhere close to these values. Not even remotely close.
I have been programming for 12 years and I like to think I'm not half bad at it.
So is peak programming just in the valley or should I be worried as well?
California is a fairly unique place, even in the U.S. Despite the costs of living there, it continues to attract talent and investment. The job market there is very fluid, as you can work for a competitor or start your own competitor, due to not allowing non-compete clauses to be enforced.
I don't know the laws in Sweden, but unless they have an environment similar to California the tech industry is not going to be as strong.
That said, at least for me in the U.S., the biggest bump in pay has always been from finding a new job offer.
Well, do you make $500k? If not, then why not? It's possible in the US, but the median US household income is 1/8 that. The vast majority of workers, probably including most software engineers, are better off making $50k but not having to worry about insurance if they get laid off, and getting 3-6 weeks' notice when being let go (among other things).
My guess is that a huge majority the software engineers in the Bay Area are making less than half those numbers. If you value pre-IPO options at the proper discount, I would guess a majority make less than $250k. That’s a high total compensation, to be sure, but, given the cost of living, and the value a good engineer can create, it’s not absurd.
Full disclosure: I work at a public company (not FAANG) and my TC is under $250k.
I doubt the difference would be that dramatic (see Hollywood actors and athletes). Also, what's the point of $500k if you're prohibited from working on personal projects and open source, and have non-competes, no vacation or leave time, and NDAs to deal with? Maybe if you're willing to deal with it for a few years so you can retire early and loaded?
There are no non-competes in California, and most of the best paying tech companies have decent vacation policies (though maybe not as good as the norm in Sweden). Lots of people have hobbies other than coding, so they wouldn't mind not coding on personal projects after work. Is it really that hard to imagine that people might be willing to make fairly minor sacrifices in exchange for several hundred thousand dollars a year?
There are intellectual property assignment agreements though. These basically say "we're a giant tech company and work in a lot of areas, so if you work on a side project in any of those areas, we can assert rights over it cause it competes with us and you probably stole that knowledge from us, or thought about it on work time, or something".
>and most of the best paying tech companies have decent vacation policies
You get 15 days at Google for the first 5 years (even if you come in with experience). I'm not really impressed by this. I will probably never see 20 days of vacation from Google. Cisco had 20 for everyone. Other companies may vary.
>Lots of people have hobbies other than coding
This is not the case for me. I play some video games and I write code. I like programming more because it feels productive, games are fun though.
>Is it really that hard to imagine that people might be willing to make fairly minor sacrifices in exchange for several hundred thousand dollars a year?
It's not hard to imagine that (it's how I ended up where I am), but now that I'm in the situation I don't really like it.
That's as close to a lifestyle choice as you can get. It sure does feel like kind of a ripoff when you're paying 60% of your annual salary in taxes and fees and cost the state somewhere around 5% of that amount, but maybe the perception shifts as we get older and use the public services more.
Doctors: varies wildly, but $20k seems to be the norm with enough hours. The base salary is outrageously low, so many doctors take additional shifts - especially in private clinics where they are compensated much better.
I was surprised to find that at least in Warsaw this is entirely possible.
There's even this ridiculous Facebook group named "Praca dla Programistów - powyżej 20000zł" (Jobs for Programmers - over PLN 20000 - that's monthly) where offers start at ~$64k.
I mean, where was all this back in 2014 when I was scraping by for $12k?
I view it more as a gold rush. Creating the future has never been more accessible and companies are in a terrible rush to find their 'plot' and start finding nuggets.
Software engineers are the people who can do the work, so demand goes up. The returns (value) these ventures produce warrant continued competition for decent talent, and the price gets pegged in some range for a given region, category, and industry.
To further add to your point: developers now have control over the means of production, and, unlike in the industrial revolution, we programmers do not need access to capital and expensive manufacturing equipment to build a product. We just need some technical skills that are essentially free to learn. And as the best of us add the entrepreneurial talent stack to our own, existing corporations have to pay us as consultants or continue to increase pay to keep that talent. In time, I think we will see it is the corporate structure that is in a bubble. (See "Developer Hegemony" by Erik Dietrich.)
Due to the nature of programming interviews, many programmers self-select out of a shot at these "bubble" salaries. Look at recent threads on HN: many folks with valuable experience refuse to prepare for modern-day FAANG-style programming interviews out of principle or whatever, and thus aren't putting themselves in a position for these $250-300k jobs.
These companies have actually devised a pretty decent scheme to select candidates, you gotta be someone either smart enough off the bat or dedicated enough to study to pass their interviews -- either way it's a VERY strong signal that you're going to be a solid hire.
Tech companies are going to continue being some of the most valuable companies in the world, if they aren't we have a lot of other pressing issues (like the decline of civilization) to worry about. Even amongst a downturn and/or recession I'm very confident that programmers at the FAANG type companies will continue to be paid very well relative to the rest of the population.
I’ve met people in marketing calling themselves programmers because they once wrote a PHP script.
I’ve overheard high school-aged kids at the Starbucks talking about the differences between a developer and a designer.
I’ve interviewed self-described “programmers” who struggle to write a for loop and “DBA”s who don’t know what the normal forms are.
My family, none of whom are the least bit technical, casually throw around words like “algorithm” and “server”.
The guy who works at the deli outside my apartment has asked me what my opinion on Python is.
We are definitely in a programming/CS/IT (whatever you want to call it) bubble - the level of general interest and enthusiasm for this stuff is unlike anything I’ve seen with other fields.
To add another data point: I've never experienced any of these things, not even close. The closest I've come is a DBA brother-in-law that wanted to code full time asking me how I liked a particular language. Outside of work, I rarely even encounter other software developers.
Since 2008 the Federal Reserve has been keeping interest rates historically low. Low interest rates generally provide favorable conditions for stocks to go up, and many companies will even buy back their own stock. Tech stocks have been growing faster than most sectors since 2008.
You mention other companies offered same salary but could not compete on equity. I'm assuming many of the big tech companies buy back their stock and then draw from this pool to give to new employees, which other companies cannot do, giving them an advantage in getting the best talent.
So are we in a programming bubble? In some sense yes, but it's being driven by low interest rates. I would guess that once rates normalize we will see a decrease in overall compensation from the tech giants as providing equity will become much more costly. However, moving forward the demand for strong information technologists will only increase.
Without being able to read the article, I'd say there's good reason for programmer value. It is one of the few professions that really grants a kind of leverage for effort to a business - because programmers don't just program something once, they program and it works over and over again. Yes, that requires some degree of maintenance, but the fact remains that you simply couldn't accomplish many things which businesses take for granted without programming.
In fact, even with a flooded market of programmers, the production of new tools actually affords programmers even more tools to work with and even more leverage for the business by employing them. So there remains a strong case for programming.
Mind you, Silicon Valley specifically may well be a kind of bubble, given how much more you can be paid there as a programmer compared to just about anywhere else in the world.
Since it's down, I can't actually read the content, but I find no small amount of irony in a claim that there is any kind of bubble in knowledge work when the site the content is hosted on doesn't even appear to work.
If you’re writing software used by billion people, your employer can probably afford to pay you more than if you’re writing software used by a million people. Or 100,000, or 1000, or 10 people.
Market multiples will go through ups and downs, and they’re high now for the major tech companies, but it’s not entirely surprising that the major companies pay well.
Great insights, but let's not forget that at most companies, being a programmer is no picnic. At some places, like Amazon, things are so bad that employees cry (remember that Amazon article).
Also, don't forget, FAANG employees are the exception not the rule. Those are probably the top 5% in terms of engineering compensation, so it shouldn't be used as an average case for software engineers. You have to be a top 5% engineer or at least top 5% lucky to be in one of those companies.
And let's not forget that $400k isn't so much when your employer chooses to be in a location where housing costs an average of $3M to $4M and day care costs $30k per year (if you're lucky enough to get it).
You don’t have to be a top 5% engineer. You just have to pass their leetcode interviews and be good enough to not get fired and also able to deal with working for a massively evil company (except possibly for Netflix and Apple).
Perhaps one way of thinking about FAANG pay level is where the money is coming from. FAANG can pay engineers a lot because they make a lot of money (in a very indirect way, it's like what people always said about "If I make my company a million dollars, I should get a portion of that", just very diluted through many layers). FAANG makes a lot of money because they're huge companies that make money from the entire world. This is unlike most startups and smaller/mid-size tech companies in the US (where FAANG originated). In a way, they're companies who've globalized successfully. The money that they pay engineers, come from money they made from other people all around the world. (maybe citation needed -- I didn't do any research on this claim, don't know % of money a Google/etc would make from the world vs from the US.)
When you think about it this way, FAANG pay isn't truly a bubble; it's more an effect of their globalization. In that sense, I don't think they would "pop" per se on a timer (although they may eventually pop if their business falls over for one reason or another, like AOL/Yahoo/Pets.com etc. did back in the early 2000s).
It's also a little bit unsettling that when thinking about it this way, it's somewhat of a transfer of wealth from people across the world, to a small set of people in Silicon Valley. True, the companies are providing something of value to the world, so maybe it's not so much a transfer, but the end result is still that money was collected from people in other countries, and paid to SV engineers/employees.
Again note that I may be completely wrong in my framework of thinking. Just something I thought up while reading the article, no reference or research on any of the claims.
The only bubble I see are of programmers who work at Google and Facebook thinking most programmers have lives like theirs and make as much as they do. Most surveys of programmer salaries do not conclude that they routinely make $300k-$400k.
And when talking about just the top-tier tech companies, of course the economy around them is bubble-like. That is the danger of allowing them to maintain nearly monopolistic reign of their markets.
Bubbles happen when prices are much higher than the actual value. Usually they're propped up by someone trying to resell the thing to a higher buyer, or package it as an investment.
A real estate bubble is visible when the apartment ads say "good ROI" instead of "nice place to live". Crypto bubbles appear when people talk about market cap more than they use the thing.
But right now programming is in an optimal situation. It's hard to do, making supply low. Demand is very high; for all the startups we have, there are more still to build. That creates high prices. IIRC, FAANG makes about $400k per employee. In that situation, paying $200k for a senior is far more sustainable than in other tech fields, which operate at much smaller margins.
I don't know. Their stocks seem to be doing pretty well. Are any of these companies likely to not exist in 20-30 years? Obviously it won't go on forever, but what's the time horizon?
I've always wondered if the salaries in the industry are just keeping pace to what all salaries should be at. What if they're not the bubble, but instead everything else is just lagging? I guess that depends on if you think workers should be able to purchase a home where they live, pay for child care, and not live paycheck to paycheck.
Paying a programmer $300-400k is a marginal cost compared to the value generated by their work, so no. Company downside is controlled by most of the compensation being company stock, in case this doesn't hold.
The programs created make the companies that hire the programmers extreme profits, in a way that scales by machines to serve an arbitrary number of customers.
Even small maintenance changes make millions or tens of millions, but when they are done wrong they bring down systems that cost billions. That is why skilled craftsmen are worth their weight in gold.
Doctors only book so many patients a day. Lawyers are also working for a limited number of clients.
An app I did myself (on top of a FOSS base), released independently, was downloaded millions of times within a year.
Google back in 2011 had one billion MAU. In that context, a $150k+ a year salary (plus equity etc.) for helping provide the content for those one billion MAU is plausible.
Also Google is a web intermediary for web sites with content for consumers. So Google benefits from the work of anyone who puts up a web site, contributes to Wikipedia etc. as well.
I reckon there are 3 factors here that determine the salaries, one being mentioned in the article:
1. Equity
Most high "salaries" are more sharing of the profit or stock then mere salary. Lawyers become partners and thus share in the profit the firm makes (same goes for consultants)
2. Supply and demand
There are way more positions than people willing to do the work.* So the ones who do can pick 'n' choose their employer. Thus work conditions improve, as do salaries. Most other professions mentioned are high-stakes jobs, but with more people willing to do them.** So once you're in, it is still possible for the employer to demand hard work, because you're more replaceable.
3. Scalability
The work of a doctor is limited by the number of patients.
Lawyers can do large cases (then they get paid a lot), but most of them will do work that affects a limited number of people.
Code (especially at the big 5) reaches literally billions of people.
So the value you can possibly add is huge.
* I think it is still kind of weird how many people don't even try to learn how to code (or script) while working in Excel all the time; it's a magical barrier few dare to cross.
** I don't know why, but I would guess it's because (i) the sciences are viewed as arcane or simply unreachable by many people (the self-claimed "I just can't do math"), and (ii) those jobs tend to have more social status (think Suits for lawyers and Grey's Anatomy for doctors, to name a few).
What was the salary of a mechanical engineer vs. an average American worker a hundred years ago? I saw this a while ago and I don't remember the exact numbers, but the multiple was far higher than it is today (something like 10x). Software engineers are ultimately underpaid, all things considered, if we consider software to be a critical part of the economy.
I don't think Mechanical Engineers back then were managed by Project and Product Managers though. They had much higher status.
I've made in the ballpark of these numbers for the last couple of years, and while the giant stock grants have run out, I suspect I'll still break or approach $300k for a few years.
My take on this is as such...
On the one hand, I have outrageous impostor syndrome - nothing I do seems terribly complicated to me and I definitely don't put in 80 hour weeks unless I'm on call; many people I work with seem to be smarter than me.
On the other hand, I do usually acquire a reputation of being very productive everywhere I work, and having freelanced before I've seen loads of outrageously bad code - literally every other project I was "rescuing"/"optimizing performance" with an existing codebase could supply thedailywtf with material for a month or two, and some could just be zipped and published there. So it seems like the bar is (was? that was a long time ago) actually pretty low.
Also, it's not necessary to be 3x as productive to make 3x as much - if there are 1500 NFL players, then the best (more or less) 1500 football players are going to get NFL salaries, whether they are "3x" the next guy or 1.3x the next guy. If there are 100000 openings in companies that can make $1m per engineer, 100000 engineers are going to be rather well paid, even if the IBMs of the world can only make $500k per engineer and thus don't pay so well.
On the fourth hand, I cannot see the market not responding to such lofty salaries with an oversupply of labor, as it demonstrably did with lawyers (I have a couple of lawyers in the extended family) - especially since, unlike with doctors and to an extent lawyers, there are no medieval guilds in place artificially restricting said supply.
On the fifth hand (sort of a duplicate of the third), as some have said above, and as with lawyers, it's still possible to have elite earners alongside an oversupply of labor.
On the meta level, I'm not good at predicting the future (and I grew up in a chaotic country), so I'm treating this as a unique streak of luck: saving a lot for thinner days should they materialize, not buying as much house as I can afford, and postponing the two-year sabbatical I've been planning, while the going is good. The way I see it, if the luck never runs out I can retire early; if it does, at least I'll be somewhat prepared.
Housing bubble, car bubble, etc. Ten years of economic growth, 0% interest rates, and a few trillion dollars of QE will do that. Once interest rates rise sufficiently and the bubble pops, it'll hurt, but then we get back on the ride again. Rinse, repeat.
We're in a machine learning bubble for sure; it's what engineers are flocking to for higher salaries. 90% of it isn't useful, but companies haven't yet learned there's almost no return on the investment.
I've been coding for almost 20 years now and used to think the same: "When will the bubble pop?" Then it was "How long can this go on?", until I ended up at "OK, that's weird, what the heck is keeping this bubble from popping?"
I was told throughout these two decades that I'd be replaced by some offshore developer from India within the next five years, max. It never even came close.
To me it seemed against common sense to climb the social ladder as far as I was fortunate enough to do, just by being exposed to programming.
Then I saw a talk by Uncle Bob on the history of software development. It was interesting throughout (for instance, the share of female programmers was almost 50% before the dawn of CS degrees), and he also briefly touched on the "bubble" issue.
He turned my perma-bearishness into ongoing curiosity: we're not in a bubble but in an ongoing "software crisis", a crisis that has existed basically since the invention of programming languages. The theory has it that there aren't enough developers available and maybe never will be. Is demand-side economics on steroids what's behind it? I don't know.
But what struck me maybe the most is that I'd never even heard of this theory, and neither had any of my much more experienced and better programming friends and colleagues. The fact that they didn't know kind of underlines the theory: there is so much demand that there is simply no time or need to even dig into the history of our craft.
I think this is being overthought a bit. Good programmers can generate a huge amount of value, or reduce a lot of cost. Someone who is a little better can double those values. Someone who is a little worse can make them zero or negative by decreasing the quality of the code. Therefore, good programmers are worth paying a lot for. The amount of money available may change, and the number of good programmers needed will change, but this formula will remain the same. As long as average programmers are worth something, the best programmers will be worth a lot.
Whether or not the best programmers actually get paid a lot depends on how efficient the market is, and I have no idea about that. When crap programmers are getting paid a lot, then it's probably a bubble.
SV Bubble? Maybe. But looking at it more long-term, aren't all the incentives there for programming to become a blue-collar (as in more modest pay) job?
* Good to great pay both attracts more people and makes employers supportive of policies to increase the labour supply.
* Cultural acceptance.
* Widespread access to computers and the internet. This was not the case just ~15 years ago.
* Low risk of automation gets it promoted by governments all over the Western world.
* Low barrier to entry. I mean, come on, let's not pretend that work on most apps/websites today necessarily requires a CS degree.
* Remote work, in all its glory, may however increase competition for jobs.
I'm just worried about how far this will go given the currently rather widespread anti-union sentiment - even after years and years of game industry bullshit.
Let's be honest: programming and another field of expertise are not mutually exclusive, just as being proficient in English doesn't exclude whatever other job you're doing.
The longer people treat it that way the more money they will continue to leave on the table for these Google/Facebook engineers.
You don't have to be a master programmer, just like you don't need to do four years in college studying English, but unlike English, most non-technical professionals still have absolutely no idea how a computer works even though the world revolves around it. If any universal second language is important, it's not English, it'll be a programming language.
It might be a huge bias, but I would imagine a significant amount of the productivity gained in the economy, and thus new wealth created, over the last couple of decades has been software-driven. So it _should_ pay well, right? It doesn't really matter whether it's considered easy or hard, just that it's scarce.
I've heard it, perhaps jokingly, stated that more than half of the software that gets built fails: it never finds a market or never reaches completion. In that case, high salaries are also a good thing, as they raise the funding, and thus the social proof, required to start a new software project.
Being the engineer on-call when the website/service/app is down and the company is losing $xMM/minute while you debug the problem is the very definition of stress.
Some companies like Google cordon this responsibility off to SREs. While Amazon has started developing a similar job family, the burden of supporting critical services still largely falls to software engineers.
While software engineering is certainly not the worst job from a stress perspective, my experience (particularly with operations at Amazon) is far from the zen-like state the author describes. It can also be pretty exciting.
Meanwhile in Paris, you might have a double master's in International Law and still earn 25% less than an entry-level software developer with just a bachelor's degree. In most organisations, sort the employees in increasing order of age and qualifications, and decreasing order of salary, and you're likely to find software developers topping the charts.
Of course, the precondition is that you're worth your salt, but I have finally come to realise that salaries are dictated by market dynamics. Your compensation is based on how difficult it is to replace you.
Bubble, no. Supply is going up for engineers, traditionally trained or not. So it all rests on demand, which I don't see calming. The real cliff for programmers is when code starts to write itself.
> The real cliff for programmers is when code starts to write itself.
Meh. People have been claiming that will happen since the first LISP machines.
Writing code isn't a technical challenge, it's a social one. Until self-writing code can figure out how to extract business requirements from the mind of a hungover MBA and turn them into a technical design that can be repurposed to meet what turn out to be the real requirements a year later (with zero overlap with what the MBA came up with), I think my job is safe.
Another factor is that our tools are becoming more complex than necessary. I do a lot of "internal CRUD" programming, and the GUI IDEs of the latter half of the '90s took far fewer keystrokes and mouse clicks to use, had more UI options, and used screen real estate better. (They were clunky at first, but got better with time.) One could spend more time on analysis; you didn't need an army of coders.
The Web bleeped that up. Everyone hoped HTML5 would fill in the CRUD gaps, but it didn't. Thus we still waste lots of time, because the Web was not designed for CRUD, and shoehorning CRUD into it is like backing an 18-wheeler into a parking slot. In the old days you just pointed your sedan's steering wheel at the parking slot and were DONE. JavaScript-centric UIs have proven too fragile: great for eye candy, but not for reliability.
Yes, I know there are some good Web stacks out there that can mostly overcome this, but they are rare and/or hard for managers to recognize. Their urge is to "keep up with the Joneses" even if the Joneses are doing something that doesn't help typical CRUD. Nobody can tell fads from good stuff.
One good invention and/or new standard could wipe out half the CRUD coders. I propose the industry experiment with a standard GUI markup language designed to do desktop-ish things out of the box and to be more stateful. Mobile-friendly UIs are nice for mobile devices, but not good for regular office productivity. Desktops and mice still rule at work.
I was hoping this would go into why the demand for programmers is temporarily high. Maybe AI learns how to program? Maybe we're just seeing investment because of the smartphone revolution? Instead, it basically argued that other jobs are harder. While true, that doesn't hold economic water. The wages he's talking about are part supply and demand, part market conditions and RSUs. All he really convinced me of is that wages will go down in a recession, because RSUs won't appreciate.
It would only be a bubble if there were some conceivable chain of events in which companies suddenly wanted less programming than they currently pay for. That's just not going to happen barring a literal collapse of the internet. That said, average programmer compensation is inevitably going to go down as more and more programmers enter the workforce (though high-end salaries are going to continue to go up, as exceptional programmers become relatively rarer the more juniors enter the market).
No, nor does the United States' Congressional Research Service[1]. In a 2017 report, John Sargent projects that Computer Occupations will "see the largest increase in the number employed (546,100), the largest annual average number of labor force exits (75,800), and the largest annual average number of occupational transfers (217,300)". In total, 58.6% of all science & engineering jobs will be computer occupations.
Now, "is there a bubble"? Again, no. If we look at those numbers as hard deadlines, then let's see how many students are leaving college with CS degrees? The National Center for Education Statistics reports that only 60,000 bachelor degrees were conferred in 2015 [2]. Enrollment is still at its highest point, but no where near the amount Sargent says we'll be needing by 2026.
I think the biggest threat of a programming bubble is the capacity challenge we face in training these new developers. If you look at historical trends, there is no dot-com crash happening right now. Maybe people are getting scared because of data analytics, but nothing as severe as the web companies of 2003 overestimating their worth.
Instead, I think this looks more like the bubble of the 1980s [3,4]. Roberts and Henn try to explain the cause of the drop in enrollment back then. Henn focuses predominantly on gender representation in the media making women feel like CS was "boys only"; female enrollment is only now beginning to recover. Roberts, on the other hand, looks at how institutions handled the increase in CS enrollment: since there were only so many qualified instructors, colleges imposed harder and harder qualifying requirements for enrollment. Students, as a whole, got the point and sought out other degrees.
So "is there a bubble?" maybe/maybe not; but it would be because there are not enough qualified instructors to teach the need.
If they paid you 'what you are worth' they could not make a profit. Since they make obscene profits, by definition, they are not paying you what you are worth.
Maybe programmers and a lot of other workers are underpaid. If a company is making a profit of at least $2M per worker, paying that worker $300k seems like theft. Granted, not every company makes $2M per worker, but most of these corporations are robber barons: if you look at executive compensation vs. the average worker, the difference is at least 21x, and the real-world average is more like 100x.
A key issue beyond the supply and demand of programmers is the wider technological transformation of the economy. Programmers are highly remunerated because they provide massive market value to the capital owners and they do that essentially by automating, by teaching machines to do things previously done by people. This activity seems to create significant capital accumulation and machines don't usually forget what they learn or need to be reprogrammed from scratch: once a problem is solved, its solution is frequently made available to other programmers who can build on it to solve even more complex problems previously thought intractable for computers.
If this continues unabated for the long run and we don't hit an intrinsic automation plateau, it can logically have two possible outcomes: either programming will become the main employment of humans and source of income, which seems unlikely, or the capital owners with the help of programmers will manage to make large swaths of the population unemployable.
Another feature of technological capitalism is that it is highly conducive to monopoly rents, walled gardens, and strong barriers to entry. The early-mover advantage of Microsoft in the operating system market still earns it billions in pure rents. So competitors who would like to change and equalize the earnings distribution, for example by employing programmers in another country, will in general be unsuccessful.
I know it doesn't seem like that now - on the contrary, the tech industry seems like a wild west of endless possibilities and opportunity for all - but it's worth pondering whether the apparent "tech boom" isn't really a massive economic shift toward a very different, highly polarized, almost feudal society, with the programmers acting as the samurai, the sellswords used by capital-owning rulers.
No. I do not believe we are in a bubble. At least for skilled professionals who keep learning, the likelihood is the reverse: as more layers of abstraction make it easier to produce business value, a smaller number of skilled professionals will be able to generate more and more revenue in less time.
Programming is different from medicine or law, because the value of programming can scale. Doctors can only see a certain number of patients or perform a certain number of surgeries; a lawyer can only take a certain number of cases. They produce value based on their time on each of these tasks, so that value is somewhat fixed.
However, a skilled software professional can complete one project that produces a large amount of business value, or the coder can develop a new product that generates a long-tail stream of revenue over time.
For example, a coder who built the Google Ads platform has helped generate a tremendous amount of revenue for Google, and that revenue keeps pouring in.
The bubble is real, but only for IT/SD professionals who get complacent with their skills. Traditional IT is getting abstracted away entirely, and software development is getting more and more complex, as more frameworks and more knowledge are needed every year. Building a simple monolith is no longer acceptable at many companies; one must learn modern distributed systems.
The sad truth is that it will continue to take fewer and fewer resources to accomplish the same business goals. One must stay on top of the latest technologies to climb the coding pyramid. Luckily, this pyramid is starting to segment into front-end, back-end, DevOps, etc., which means the truly skilled professionals in those domains will rise to the top and command salaries much higher than we currently see.
Could not read the article, the site is down; with that said, I don't think we are in a bubble. Software is essentially eating the world, replacing non-software jobs. Just look at grocery stores and the proliferation of self-checkouts, or at Amazon's concept stores where you just walk out with whatever you want and your account is charged. All of these technologies reduce non-tech jobs while increasing the need for software engineers.
Self-driving taxis and trucks are another example of the potential of software to replace non-technical jobs. As automation and technology advance, it appears that the need for non-technical workers will decrease and the need for skilled engineers will continue to increase. The future is coming, and software and hardware engineers are its architects. Unfortunately, the likelihood that non-technical workers will lose their jobs goes hand in hand with that.
It might not be a bubble, but what you are describing (which I agree with) is not stable. At the moment there is a boom in software development because there is a lot of software to build. But as the industry stabilizes, we will find that once we have built the software, the software is already built. At some point in the future there will be an inflection point where there is simply less software work to be done today than there was yesterday. I don't know when this inflection point will happen, but it will happen.
I didn't understand the bit about the refresher grant. If his grant is $50K a year on joining, wouldn't it be $50K a year on refresh? Or is he making the point that the longer you stay, the more refresher stock you get?
You get a 4-year grant when you sign your offer. Then, usually after your first full calendar year, you get another 4-year grant, usually worth about 1/4 of your initial grant. This way your total compensation stays somewhat constant* year to year even after your initial large 4-year grant finishes, because you will have accumulated four 1/4-sized grants.
*There are some caveats to that, like the slight drop in total compensation most people experience in their 5th year. FB/G refreshers are much, much better than Microsoft's and Amazon's (Amazon won't even give you refreshers if the stock is doing well, like it has, because you are "overpaid"). Netflix doesn't do stock at all and instead pays everyone straight cash - and yes, they do offer senior engineers $300,000+ a year in base salary.
Your initial grant is not "$50K a year". It's, for example, "$200k over 4 years". And it's not actually denominated in USD but in a number of shares. So if GOOG stock is at $1,000, your grant would be "200 shares vesting over 4 years".
Now, when you get a promotion, you'll get a new set of shares vesting over 4 years. If you get the promotion before the 4 years expire (usually the case), it will overlap with your initial grant.
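To make the overlap concrete, here's a minimal sketch of how the grants described above stack up, assuming the hypothetical numbers from these comments (a $200k initial grant and yearly 1/4-sized refreshers, each vesting evenly over 4 years; actual refresher policies vary by company and performance):

    # Sketch of overlapping RSU grants; all numbers are the hypothetical
    # figures from the comments above, not any company's actual policy.
    INITIAL_GRANT = 200_000    # dollars, vesting over 4 years
    VEST_YEARS = 4
    REFRESHER_FRACTION = 0.25  # each refresher is 1/4 the initial grant

    def vesting_by_year(num_years=8):
        vested = [0.0] * num_years
        # (start_year, total_value): the initial grant, then one refresher
        # granted after each full year of employment.
        grants = [(0, INITIAL_GRANT)]
        grants += [(y, INITIAL_GRANT * REFRESHER_FRACTION)
                   for y in range(1, num_years)]
        for start, total in grants:
            for year in range(start, min(start + VEST_YEARS, num_years)):
                vested[year] += total / VEST_YEARS
        return vested

    for year, amount in enumerate(vesting_by_year(), start=1):
        print(f"year {year}: ${amount:,.0f} vesting")

With these made-up numbers, vesting ramps from $50k in year 1 up to $87.5k in year 4 as grants overlap, then settles back to $50k/year from year 5 on - the drop after the initial grant finishes that the caveat above describes (in practice, stock appreciation and promotions change the picture).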
There are many forces at play and I think it is impossible to say how things will progress. However, the biggest worry I have is that the elimination of maintenance and bug fixing tasks will dramatically reduce the need for engineering hours. I don't see it happening in the next 10 years, but at some point someone is going to figure it out.
If an ML system is able to extract the intended structure from a block of code and somehow translate that into a known-good pattern you could see things shift. I see it as being somewhat akin to human language translation.
No, "Microsoft Millionaires" referred to a large percentage of Microsoft employees becoming millionaires and how they changed the Seattle economy. Stories like, don't kick the poorly groomed guy in shorts and flip flops out of your expensive sports car show room, because he might be there to pay full price in cash for a car and drive it off the lot.
The answer to this question is easy: it's all about supply and demand. The demand for programmers keeps increasing as the world advances, so demand for good developers is extremely high, and this will continue as the world becomes ever more digital. I suppose most of the workforce in 50 years will need at least a basic programming skill set, like reading and writing. The ice block isn't melting anytime soon; it's outside in the snow, in Antarctica.
People look at these huge salaries but often forget the living costs around them. I interviewed two years ago with a company in the Bay Area and was offered a $180K position. I was excited, and it was very tempting. But after I did the math, I realized the money I'd have left over each month was higher where I already was, which was not California. I decided not to upend my life for less financially rewarding circumstances. There might have been other benefits, like being part of the tech culture there, but it just wasn't for me.
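For what it's worth, the kind of back-of-the-envelope math involved looks something like this (every figure below is hypothetical; real tax rates and living costs vary a lot):

    # Hypothetical monthly-leftover comparison; all numbers are made up
    # for illustration, not taken from the comment above.
    def monthly_leftover(salary, effective_tax_rate, rent, other_costs):
        take_home = salary * (1 - effective_tax_rate) / 12
        return take_home - rent - other_costs

    bay_area  = monthly_leftover(180_000, 0.35, rent=3_500, other_costs=2_000)
    elsewhere = monthly_leftover(120_000, 0.28, rent=1_200, other_costs=1_500)
    print(f"Bay Area:  ${bay_area:,.0f} left over per month")
    print(f"Elsewhere: ${elsewhere:,.0f} left over per month")

With these made-up inputs the lower salary leaves slightly more at the end of each month, which is the shape of the conclusion the commenter reached.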
I remember the same phenomenon in 1999. It's absurd to imagine that supply can continue increasing indefinitely without price being affected. Look at what happened to the legal profession.
The starting salaries for bootcamp grads aren't really much to write home about ($50k-$70k) except at a few select bootcamps; plus a fairly significant fraction of the grads don't seem to get jobs from some of these places. A bunch of bootcamps publish their data at https://cirr.org/data .
I don't know whether the high total compensation of programmers at some elite companies and locations is a bubble or not, but I'm interested in the flip side of the question: how has it spread, and how will it? Have any of the many efforts to copy Silicon Valley included a systematic effort to increase the compensation of programmers in the "Silicon _" region, perhaps through a combination of jawboning by boosters and leading employers or investors making a bet that large increases in programmer compensation will pay off?
My take on this: yes, a small minority of devs are substantially overpaid for their work right now. I don't think it will stop, since even if there is a glut of new developers, the ones with production experience at scale will always stand out from the fresh devs and will be more prized for that reason.
What may end up happening is the funnel narrowing at entry - that is, salaries for brand-new developers collapsing while only a select few make it to the top. It reminds me of professional sports.
What does it mean to be overpaid? There are certainly professions that provide more overall value to the world than software engineering and yet make less money. So in a sense you could say every software engineer is overpaid. Otherwise what does this mean?
I get the feeling that a lot of programmers (especially the older, senior ones) are overestimating their significance. It's like people don't want to accept the inevitable future of AI and automation: "oh, code is never going to write itself! nope, it's never going to happen! herp derp my job is safe!"
Give it 10-15 years and the bubble is going to burst whether we like it or not. If you're not into machine learning or deep learning at that point you're screwed.
I think it's because we've seen this big consolidation and growth around the big players. If the big guys manage to saturate the market, we will see a drop in bonuses.
Might this run on big companies be because of the end of Moore's law? That event brought a kind of stasis that benefits big, established players, since the big guys are now less afraid of disruption (it was easier for upstarts to plan ahead when they knew they'd have twice the horsepower within a year). Is that right?
Four people from my extended family have quit their jobs and joined "code school" to become programmers this year. None of them has any programming background.
I work for a gaming/toy company managing RPG related miniatures (among other things). It's not a very technical role, but the tech background helps in a lot of ways.
There is a lot of bubble logic in these replies - things like "salaries are destined to go up forever because they've gone up since the dot-com days", "smart people in other fields are baffled by things we find trivial", "programming is incredibly hard so it's obvious we should be paid more," etc. There's probably a kernel of truth to some of these statements, but there are eerie parallels to past bubbles here.
It is my opinion that we are not in a programming bubble, but in a venture capital bubble. The latter causes the appearance of the former. Let me explain the meat of how I think it works.
1) A well-to-do person decides to start a VC fund. This person recruits a few high net-worth friends and convinces them to invest a sum of money in the new fund. Let's say, hypothetically speaking, that this person gets $1m to play with in total from 10 individual investors.
2) This new fund makes around 10 unpriced seed investments with its $1m (convertible notes). 50% of these companies do not go on to raise more money, and the money spent on them is written off. The other 50% go on to raise a priced A round, whereupon some bigger funds lead the rounds and mark up the price of each company's equity by between 50% and 150%.
3) Our seed investor is now sitting on equity roughly worth $1.25m. This person has made a 25% return on the capital under their management. They take a 5% fee off the top - for a handsome payday of $62,500. The investors are pleased. (The arithmetic is sketched after this list.)
4) Then, this thought crosses our brave new fund owner's mind: "Boy, I'm really good at this." So our owner goes out to a wider network and solicits $10m this time, with the intent to participate in A rounds instead of seed rounds. They cite their successful 25% returns in seed stage companies, and people scramble to hand them money to manage.
5) Goto step 2 (but increase the numbers and change up the preferred investment round occasionally)
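Putting rough numbers on steps 2 and 3 (using the same hypothetical figures as in the steps above; the markup is taken at the top of the quoted 50%-150% range, which is what reproduces the $1.25m):

    # Hypothetical arithmetic for the seed-fund example above.
    fund      = 1_000_000
    checks    = 10
    per_check = fund / checks          # $100k per seed investment
    survivors = 5                      # 50% go on to raise a priced A round
    markup    = 1.5                    # top of the 50%-150% markup range

    written_off = (checks - survivors) * per_check      # $500k written off
    paper_value = survivors * per_check * (1 + markup)  # $1.25m on paper
    fee         = 0.05 * paper_value                    # the 5% fee off the top
    print(f"paper value: ${paper_value:,.0f}")          # $1,250,000
    print(f"fee:         ${fee:,.0f}")                  # $62,500

Note that the 25% "return" here is entirely paper value from illiquid markups, not realized gains - which is exactly why the sums and management fees can keep growing as the cycle repeats.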
The sums get bigger, the paper returns get bigger, and the management fees get bigger. But what brings this all to a crashing halt? Where does all that VC money come from?
I believe that this is all a consequence of the zero-interest rate environment in the United States throughout the last decade. People with assets have largely had no attractive places to put their money, so they were forced to chase riskier and riskier investments. Venture capital was the perfect target - the compelling narrative of technological progress makes for a feel-good investment avenue, and the general opacity of tech concepts to non-technical people makes the mystique that much more compelling. Plus - there's a mathematical strategy here! Why buy bonds and get a lower rate of return than inflation when you can chase unicorns? Why not take the probabilistic and "scientific" approach by investing in 100 companies - expecting 90 to die, 9 to do okay, and 1 to be a mega-success that makes your investment worth it?
The hunt for the fabled "unicorn company" is this economic cycle's equivalent to "housing prices are always going to go up." But the models work! you say. The success and failure probabilities are accurate! So were the models that led to mortgage-backed securities - if we bundle enough of these loans together, on average they will have to be profitable - right? The problem then, as now, is that such a model is only accurate when the broad macroeconomic conditions underlying it remain true. When you run out of money coming in, the music stops. Loan defaults started to spike and the MBS model fell apart. Likewise, I suspect that less money entering the VC world will presage the whole thing falling apart - and as interest rates climb, it's only a matter of time until debt is more profitable again and the easy-money faucet turns off.
I've ranted for a bit, so you're probably wondering - how in the hell does this relate to the OP's article? Let's now address the other side of the equation here - where does all that VC money go? Well, that money that was invested in all those failed startups (or the successful ones) gets spent on something. But what? It clearly doesn't all get spent on catered lunches and ping pong tables (much to the chagrin of our industry's critics). But it does get spent somewhere - and I'd hazard a guess that the most popular targets for that spending would be Facebook Ads, Amazon hosting, Apple hardware, Google Ads, Microsoft software, etc. The VC money gets spent on stuff the tech giants are selling, fueling the dramatic increase in their share prices over the past ten years.
Given that OP's article makes the point that compensation is huge and primarily driven by share price increases, it's easy to see how VC money could be actively impacting this situation. But as a gut check - if all of this talk of "US economic conditions fueling a domestic venture capital bubble" does have a grain of truth to it - what would you expect economic conditions for software engineers to look like in other parts of the world?
Interesting, right? Proximity to the nexus of Sand Hill Road does appear to have an impact. I'm no data scientist, but I imagine there's enough publicly available data about venture rounds and salaries that a more rigorous assessment of this general thesis is possible. If anyone knows of existing studies, please let me know.
I bet compensation in the US will go down once businesses figure out a way to really outsource to programmers around the world.
The last push failed but businesses will adjust their conceptual models and try new things. Eventually, something will stick.
Heck, maybe the last round failed because the technology culture needed to develop more in different areas. If so, how long before local geeks do to their area what has apparently already happened in the States?
There's only one way salaries go down: over-supply.
Demand is super high because of 2 things:
1. Tech companies are very profitable
2. There is a lot of cash in the economy (thanks to QE) a lot of which got invested in big tech and startups.
And supply of high quality senior engineers is not there yet.
When the wave of young kids who chose MIT over Harvard Law School gets older, things should fall more into line.
At least that's how market theory is supposed to play out
It would be very instructive to see a studied comparison of what factory work looked like during its peak as a well-compensated profession in the US, and what is happening in software development now. Certainly there are meaningful differences, but I am pretty driven by a concern that we are the factory workers of the future -- people whose profession used to guarantee a comfortable wage, but no longer.
The comparison is between senior (5-10 years of experience) positions at FAANG, not all software companies, and doctors across all hospitals. How much do doctors make at the top 5 hospitals after 5 years? How about top lawyers?
In terms of job difficulty, I see no argument there: I can't imagine the stress a surgeon has during an operation; especially compared to a computer scientist such as myself.
Will the demand of software in the economy go down? Unlikely.
Could the economy itself go down? I'm betting it will.
My take is that we currently have a bullshit economy, and that the software industry is a big part of that bullshit economy - almost entirely funded by it.
Of course there's a lot of essential software, but if the bubble of bullshit software pops, the industry as a whole will suffer a lot.
I often think this (that we are in a bubble) and feel a mild sense of dread. It is still time-consuming and difficult to make top-notch websites, but I don't think it will always be that way. Eventually we will be able to churn them out like little plastic toys from China, thanks in part to all this nice functional, composable programming :)
The question is really:
will the (value/time unit) a programmer can generate for a company increase or decrease in the future?
Since the world keeps getting digitalized and connected, I think the answer pretty clearly is that it will increase.
IIRC, $100K a year was considered pretty high compensation in the valley before 2008; things started to change around 2009, and the total compensation of software developers started to increase by double-digit percentages almost every year.
I know it sounds simple, but local city councils deserve much of the blame here. They have been too underpowered to deal with the influx of workers over the past two decades, which is what has made these salaries necessary to attract people.
Bit off on the law salaries - Big Law is pretty locked in at ~$190k to ~$450k for associates, base plus bonus (at most firms it depends entirely on your class year). Many partners make $1M+/year.
About the last sentence: if only we will still need developers in the coming 10 to 20 years... I find it hard to see how "AI" (note the quotes) could fail to replace them.
I think the high wages are due to a lack of understanding among non-programmers. Non-technical folk see it as scary or hard to understand, so they would rather pay someone else to handle that stuff. We're on the inside looking out; not everyone wants to understand. I'm sure we're all familiar with the conversation killer of explaining our day job, or even mentioning that we're in software/IT/programming.