The diminishing half-life of knowledge (rednafi.com)
134 points by notaboredguy on Dec 10, 2023 | 80 comments



It is important to recognize that even if knowledge becomes untrue because some assumption or fundamental has changed, knowing the history of these changes and why they occurred is still extraordinarily valuable knowledge.

Too many software developers just know the "current thing" without knowing why it is the current thing and the specific issues that caused us to move on from the old thing. This ignorance of the past frequently encourages developers to reinvent an old thing poorly without understanding that not only is it not new but that we stopped doing it in many contexts for good and nuanced reasons.

I think this sense of history is one of the most important aspects of "experience" when hiring. It is easy to train someone on an arbitrary tech stack but difficult to train someone on the history and path dependencies. Many developers are not interested in that history because it doesn't feel relevant to doing a job now. We tend to gain this sense of history by being in the industry long enough that we were part of that history.


This is one of my pet issues. I'm absolutely of the opinion that you need to at least be familiar and understand what has come before in a technical field in order to have real mastery in the same field today.

Practically speaking, knowing the hows and whys of obsolete technologies makes it much easier to learn the new stuff that is coming tomorrow as well. Everything is built on what has come before.

I see many "kids these days" who not only don't know the history, but actively think that knowing it is a waste of their time. It's a shame, because that attitude is a handicap. I also take it as an indication that the person doesn't have an interest in the field for its own sake.


> It is important to recognize that even if knowledge becomes untrue because some assumption or fundamental has changed, knowing the history of these changes and why they occurred is still extraordinarily valuable knowledge. [...] I think this sense of history is one of the most important aspects of "experience" when hiring.

I can tell you that in my experience employers do not value this kind of knowledge much. Quite the opposite: quite a few employers hate employees who ask "too many inconvenient questions" instead of surfing the hype of the "current/next big thing".


While you're not wrong, just because employers don't care about it during interviews doesn't mean it's not important.

In my experience, startups with inexperienced technical leadership tend to be the “next big thing” (as far as tech stacks go) focused types of places.


I think that is an orthogonal issue.

That is more or less about virtue signalling that you are a "team player" who helps your boss advance his career rather than doing what is nominally your job. Think teenage clothing fashion. You need to show you are in.

Pretending that complex and risky solutions are needed is a key part of this. The boss needs headcount for the complex fancy project.

And so on, with different companies being in different circles of hell on the matter.

Good solutions usually look simple and easy in my experience, which downplays the skill involved in producing them.

The author however seems to write about machine learning, which honestly is a really fast moving field right now. Sometimes things just move fast?


In a sense, though, that doesn't matter. I mean, it would be nice, but it's not material. That knowledge is of value to the devs directly because it makes them better devs. That is something employers do actually value.


> Too many software developers just know the "current thing" without knowing why it is the current thing and the specific issues that caused us to move on from the old thing. This ignorance of the past frequently encourages developers to reinvent an old thing poorly without understanding that not only is it not new but that we stopped doing it in many contexts for good and nuanced reasons.

I agree with this, but I also have real frustration that so often, as an industry, I see big pendulum swings along the lines of "that old thing is bad, let's do this new thing that solves many of the problems of the old thing." Except that engineering is usually about balancing tradeoffs, and oftentimes that new thing leaves you with a different set of problems that the old thing fixed in response to the old-older thing.

Microservices were probably my best example of this, where companies were frustrated by the issues with monoliths so they said "Aha, microservices solve all those monolith problems", except microservices come with a whole slew of difficulties of their own, which in my opinion are often more difficult to manage than the problems with monoliths.

The industry's brief love affair with NoSQL DBs when they first came out (i.e. around when Mongo appeared), only to later find out "Hey, the ACID guarantees in RDBMS transactions are actually pretty essential a lot of the time", is another good example.

One bright spot is that I feel like the industry has matured somewhat when it comes to searching for these silver bullets. All frameworks and tech have pros and cons, and when you solve one problem, don't be unaware of the new problems that it will create. E.g. sometimes you still see some "Do it all in the cloud!!" vs. "No, you lose control when you do it in the cloud, you should own it all", but I think more commonly you see reasoned responses along the lines of "it depends".


Not only that, but the old knowledge can still be useful. Newtonian physics won't explain semiconductors, but it's still useful for slinging space probes around the Solar System or winning bets at a billiard table.


The article references the following IEEE Spectrum article:

https://spectrum.ieee.org/an-engineering-career-only-a-young...

> Given a choice, many employers would rather hire a couple of inexperienced computer programmers and spend a few months training them than hiring (or retaining) a more experienced but expensive programmer.

In the very next paragraph:

> In addition, many employers aren’t interested in providing training to engineers or programmers who may start becoming obsolete, for fear of seeing them leave or be poached for employment elsewhere. [...] employers looking for experienced workers have fallen into the habit of poaching employees from their competitors as opposed to spending the resources to train from within their organization to meet a specific job skill.

That directly contradicts the preceding paragraph, so I find it hard to trust the other claims that it makes.


Once you understand that inexperienced is an HR-approved euphemism for young, and experienced is an HR-approved euphemism for old, then the discussion they’re having falls into focus:

If large companies need to train someone (typically on a company-specific toolkit) they want to train someone with more room for optimism in their workplace expectations, fewer points of comparison for the quality or utility of the toolset, and who will have less of an ability to exit post-training.


>>, many employers would rather hire a couple of inexperienced computer programmer and spend a few months training them [...] In addition, many employers aren’t interested in providing training to engineers or programmers

>That directly contradicts the preceding paragraph,

The "many employers" can be 2 different subsets of employers and/or 2 different tech stack situations. Examples...

Subset (1) FAANG or "tech" companies will train younger new hires on specific in-house technology stacks. The "inexperienced" here was referring to "young". E.g. Apple hires fresh young college graduates who only did Scheme and Python in school but will train them on Objective-C and Swift so they can work on macOS and iOS. However, Apple typically doesn't hire older experienced COBOL programmers to re-train them in Swift.

Subset (2) companies that don't train new hires (many non-tech companies where IT/dev is a cost center). They usually don't recruit from college campuses and prefer job candidates to have existing skills that already match the job listing. E.g. a hospital's IT department has a job listing for a Java programmer to help maintain their billing system. The hospital is not interested in a candidate whose skillset is only Turbo Pascal and Macromedia ColdFusion and retraining them on Java.


If you read closely, the former is saying they want to train INEXPERIENCED programmers. The latter is saying they don't want to train EXPERIENCED programmers who are becoming obsolete.

Maybe they were trying to make a different point, but that's how I interpret it.


The former are much cheaper to train than the latter.


I think they're about different contexts.

The first paragraph is about training inexperienced computer programmers to do things their way. It's a point frequently made that more experienced programmers don't just need to be trained in the new company's ways, but often need to be untrained from their previous ways -- yet they still cost more. So this isn't about "industry-wide" training, it's more about company-specific training.

While the second paragraph is more about training relating to transferrable skills. Companies don't want to teach people to become data scientists or ML experts -- they'd rather hire people with those skills already.

Perhaps it helps to think of the first bucket as "generic" programmers, the ones writing CRUD apps or websites or similar. While the second bucket is about "domain-specific" engineers.


There does seem to be quite the inconsistency here. I feel like the truth lies somewhere near the truth of the "nobody wants to work" mantra, which may simply be a refusal to raise wages.

I wonder if the so-called knowledge half-life exists because everyone is working in very niche, specialized areas these days. Those skills are simply less transferable to even similar jobs in the same field.


There is some "loyal autodidact" unicorn who would be an ideal hire.

People tend to fall somewhere on a robot/ninja spectrum.

The observation that employers have multiple motives that are in tension is simply human.


> There is some "loyal autodidact" unicorn who would be an ideal hire.

Let us understand "loyal" as "not very willing to switch jobs" (and not in the sense of "submissive"/"obedient").

(Loyal) autodidacts are still a nuisance to many bosses, because autodidacts, nearly by definition, often have a very strong tendency to "think autonomously". Thus they have a tendency to question a lot of (technological, architectural, ...) decisions - often for good reasons (and if they are experienced autodidacts they often also have the background knowledge to support their reasoning). In other words: autodidacts are commonly less ingrained in "the way things are usually done", which can easily lead to conflicts with bosses "who love to boss around".


Some angles from which I see this phenomenon:

- focus on the fundamentals rather than the particular brand of tools. HTTP, TCP, HTML, OS, CPU, filesystem, etc. will almost certainly outlast a language, framework, or SaaS

- See beyond the assumptions. Solutions are based on assumptions about the current problem. Solutions come and go, but the fundamental problem of, for example, getting from one place to another rarely changes. Try to distill THE problem.

- focus on outcome rather than action. Ask ourselves why such and such actions matter. Do we know why we are implementing a thing, or whether a thing we implemented 6 months ago matters at all?

All of these demand consideration beyond whether you know "DeWalt" or "Bosch".


> focus on the fundamentals rather than the particular brand of tools. HTTP, TCP, HTML, OS, CPU, filesystem, etc. will almost certainly outlast a language, framework, or SaaS

Umm, you are just picking survivorship bias tech things.

Focusing on HTML was a good choice. Focusing on XHTML2 or VRML would be a bad choice.

HTTP will serve you well. FTP or Gopher not so much.

TCP is a better choice than IPX or SCTP, etc.

Ultimately though, I think the point is to focus on the underlying ideas. You should understand transport protocols, not TCP specifically; then you can easily apply that knowledge to QUIC or whatever.


Lindy Principle


For those curious

> The Lindy effect (also known as Lindy's Law) is a theorized phenomenon by which the future life expectancy of some non-perishable things, like a technology or an idea, is proportional to their current age. Thus, the Lindy effect proposes the longer a period something has survived to exist or be used in the present, the longer its remaining life expectancy.

https://en.wikipedia.org/wiki/Lindy_effect
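
Stated a bit more formally (my own gloss, not from the Wikipedia article): if T is the lifetime of the thing and t its current age, the claim is

    \mathbb{E}\left[\, T - t \mid T > t \,\right] \;\propto\; t
    % e.g. a Pareto lifetime with tail P(T > x) = (x_m / x)^{\alpha}, \alpha > 1,
    % satisfies E[T - t | T > t] = t / (\alpha - 1) for t >= x_m.

i.e. heavy-tailed survival, which is why the things that have already lasted decades are the safest bets.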


One really important issue with your first point is that everybody hiring is filtering by experience in SaaS, framework, and as a last resort, language. Nobody is searching for knowledge of the fundamentals.

But anyway, my take is that the problem the article is describing is caused by the existence of way too many fundamentals that can't all be practiced.


Both your comment and the parent you're replying to are awesome and match my experience in my area of electrical engineering (includes a lot of software, programming, database needs as well, so fairly relevant).

There are those who have resumes tailored to a single particular thing that, if hot right now, will have 1000 recruiters after them to run certain grid studies. I'm more of a fundamentalist (need to think of a better term) for my field where I can tell you the underlying math behind most of the studies performed, am familiar with the pros/cons of a dozen different application software packages, can code, use SQL, Linux, whatever. I'm also a people person, which is helpful as there is a lot of stakeholder interaction. If you understand the equations/theory and how all the technical junk works... there aren't many roles in my industry I can't get up to speed on in very little time, while having a global understanding of how it fits as a cog in the bigger machine. This isn't true of most in my industry unfortunately. Many only know a single role, have experience with one tool, don't understand the math behind the tools, can't code or do data analysis... etc.

I've found the broad/general experience to be very valuable, but it's sometimes harder to convey that to recruiters who are looking for "X". I sometimes have to tell them that what I have is highly transferable to "X" and that I have a bunch of other goodies their employer would be very interested in. Sometimes it works if they're communicating with the manager looking to fill the role and not HR. If I can actually talk to someone at the company... usually not hard to arrange, then they've often offered to make a custom role as well. I know it isn't always the same for software shops or large companies with layers of bureaucracy, but that's my experience.


>I'm more of a fundamentalist (need to think of a better term)

"I tend to focus more on the fundamentals".


Thanks!


How do you arrange talking to someone at the company?


In my industry there are a lot of big, medium, and small companies. However, it's still somehow a small world. Go to enough major conferences and you meet enough people and start to recognize the faces, and before you know it you've made friends with key people at a lot of companies. So it's not too hard to email someone or give them a call and talk about what you're interested in and whether they're interested, if that makes sense.


>focus on the fundamentals rather than the particular brand of tools.

But you went on to list a bunch of tools. Sure, today HTML, TCP, and HTTP seem fundamental, but were they when they first came out?

CPU is very open ended, what do you mean focus on CPU? Do you mean x86 or do you just mean that there exists a concept called a central processing unit where bits go in and bits come out?

With that said, all I have to say on this issue is that there are different strategies for learning, and some people, such as myself, prefer learning things from the concrete toward the abstract. I like starting from the actual tools and frameworks and very specific and particular things I can manipulate, and then abstracting from them and building up concepts, as opposed to what it seems many others like to do, which is to start from high-level concepts.


The real fundamentals are graphs, trees, stacks, recursion, dynamic programming, etc. In other words, computer science.

If someone understands trees they can understand HTML in one sentence.
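
A tiny stdlib sketch of that point (my own illustration, with made-up sample markup): HTML is just a tree, so printing its structure is a plain depth-first walk.

    # HTML is a tree: print each element indented by its nesting depth.
    from html.parser import HTMLParser

    class TreePrinter(HTMLParser):
        def __init__(self):
            super().__init__()
            self.depth = 0

        def handle_starttag(self, tag, attrs):
            print("  " * self.depth + tag)  # indent = depth in the tree
            self.depth += 1

        def handle_endtag(self, tag):
            self.depth -= 1

    TreePrinter().feed("<html><body><ul><li>a</li><li>b</li></ul></body></html>")

This prints html / body / ul / li / li, indented exactly as a DOM inspector would show the tree.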


It's not so much the knowledge itself that has a half-life (unless it is front-end tech knowledge), but the ability to monetize knowledge. You used to be able to make a career out of some niche bit of knowledge, but those days are over. You need to work hard just to stay current in almost every field, and that is as much a trend driven by technology as it is by the fact that there are so many people of working age now.


This is an excellent point. The frontier moves quickly, leaving a wake of commoditized skills behind it, but there used to be identifiable, well-paid skills that people chose precisely because the frontier moved slowly. While my instinct is to say that the frontier moves more quickly now, I have a suspicion that this is less true than I think it is, and that what has really changed is that everything has a fast-moving frontier now.

For example, the evolution and development of the Internet technology in the 1990s occurred at an incredible pace that is as fast as anything I've seen since, but back then you could switch to being e.g. an Oracle DBA if you wanted to avoid the chaos, and many people did. Those safe harbors have become rare in tech and the relative pay for them declines every year.


This is why it is important to gain very good knowledge about the basics and "axioms" of whatever one works with. That way one can very quickly grasp "new" things, once it is needed. Without a solid grasp on the basics, one is bound to keep chasing the hype.


Credibly signalling that you do have some bit of knowledge is hard too. It's easy to learn a lot on your own today, but the only widely valid signals are academic credentials or past experience.


In response to the spirit of the article and ignoring the specificity of content, a great way of retaining knowledge over time is Sean Whiteley's Memletics courses. His frameworks and practices for studying, learning, and retaining any subject matter are very practical. While his "Learn" tool can get a bit verbose, the concepts of reviewing notes in an irregular patterned method over time are extremely effective. I recommend using the Learn tool on a simple subject just to observe the effect the process has on your own experience of retaining knowledge over a few weeks, months, and years. I don't know where this guy Sean ended up - but if you get your hands on his materials, hold on to 'em! Hopefully that helps.


This is not specific to IT. It is a general trend. Also, it's not new; cf the saying "Jack of all trades, master of none". Also, the reason why we don't see Da Vinci types anymore.

    Entropy is increasing.
Things that used to be in one domain will in time split up into multiple, each having a higher level of detail/sophistication than before. Or that one domain will die off, possibly being replaced by "something else". This cannot be reversed - "you cannot unbreak the broken glass". (It seems that we would need to be able to reverse our direction in spacetime to do that, and I believe that is considered a hard problem.)

If you want deep knowledge you can't have broad knowledge. It follows from this that, e.g., AGI is stillborn; but specialized AIs have huge potential. This should be the scary part, not the utopian know-it-all Mechanical Turk / HAL / Marvin.

Speaking of the latter, we will have little use for an AGI anyway, as such a thing will be way out of our league and we will not be able to use it, as Douglas Adams famously postulated: "Here I am with a brain the size of a planet and they ask me to pick up a piece of paper. Call that job satisfaction? I don't."


> "you cannot unbreak the broken glass"

I really don't like this analogy. You can weld glass back together. Or melt the pieces down and reforge it.


Replace 'glass' with 'vase'.

Analogies always break down, but they break down faster if people try to see them as the thing itself rather than just as analogies. The idea was to convey a point, and you indicated that you got it perfectly well.


As a counterpoint, it is impossible to discuss this topic without coming to some definition of "mastering" a tech stack. The concept itself seems suspicious. That is like saying "I mastered fashion". What, exactly, does it mean to master something that is in constant flux? How do we differentiate between a master and someone who read the documentation a few weeks ago?

It is a mug's game trying to master a tech stack anyway. The clever thing to do is master problem domains. Then if you encounter the problem you can just solve it using old tools and be happy.


Beyond some point, mastering "fundamentals" tends to look a bit like a barbershop pole. Engineering and computer science principles are in competition with each other. Within each tech stack, the balance of forces is different. It takes years of experience within a particular tech stack for an organization to find elegant and harmonious balance in techniques.


While I agree that there is a half-life to certain types of knowledge, I think it would be overstating things to apply that to all knowledge.

It's very easy to retain a significant portion of the knowledge by building mental models. Sure, you will forget the API surface of a technology, but you will surely remember the underlying knowledge models and apply them in many other contexts.

For example, remembering a REST API's input and output data types is also knowledge whose half-life is much shorter, but the mental models built from them stay for a long time. The point is never to remember REST API inputs and outputs as part of your skill set; the point is to include their underlying knowledge. You can't treat the API surface of a library as knowledge: it's essentially volatile memory and supposed to be cleared away.


Exactly. It's weird to me that jobs are so strongly focused on some narrow technology, e.g. "React programmers" or "Javascript programmers" or "Python programmers".

When you know the basic "underlying" model, switching from one framework or language to another is not a very big deal. Not much bigger a deal than figuring out a new large codebase with a familiar framework/language.

Sure, it may take a few weeks of learning new stuff and being unproductive (or even negatively productive) during this phase, but this is to be expected in any job.

I don't think learning the more fundamental concepts is that hard, but it does require some time (and interest) that is not immediately productive. Perhaps the demand to be productive, as in churning out code, all the time is what gets people (and the industry) stuck in such a "local optimum".


Switching frameworks usually implies learning all of the quirks of these frameworks/platforms, which is the long tail of problems.


There's something to be said for having a deep familiarity with a language. For basic tasks like churning out CRUD endpoints it maybe doesn't matter so much, but it's both a force multiplier for productivity and an enabler of building something altogether less trivial.

Arguably a lot of the ways modern software sucks is a direct consequence of developers not understanding the tools and languages they're using. It's a bakery that bakes bread by scraping the toppings off frozen pizza.


Like, who would care if you could bring a certain type of knowledge's half-life to 200 years? Since you can't live that long anyway, you don't even need to think about solving that problem. You just need to think about how to convert knowledge with a shorter half-life into knowledge with a longer half-life (like converting API surfaces into mental models).


At the link below there's an interesting discussion of how an emphasis on competence-based education over knowledge-based education is not leading to an improvement in learning outcomes, quite the contrary.

So despite the fact that knowledge may have a half-life, it may be even more important to acquire the skills to learn that knowledge.

https://news.ycombinator.com/item?id=38590888


I think most "tech stack" knowledge is superficial. Knowing the tech stack is not the hard part of the job. The deep knowledge does not atrophy the same way.

I think it's kind of like the difference between being up to date with the latest slang vs. knowing how to read well.


I've actually recognized this problem myself. I found that even though I took good notes and reviewed them somewhat consistently, when jumping back to something I used to be a 'master' in, my knowledge is still lacking.

I'm building a learning focused note taking app for this very purpose - https://www.wonderpkm.com/ , would love to chat further with you about your learning approach!


As the author highlights, this is not a new thing. What is definitely a more recent thing is that the half-life is much shorter than before. Now it is fair to say that even the knowledge with the most staying power has at most a 2-3 year half-life.

My own solution to this problem has been to work in startups for 2-3 years and then move on rather than try to seek a well paid job at a large, prestigious company.

Why? I started my career at one of these brand name, "everybody wants to get in" kind of companies that had never made institutional layoffs. When the time came for their first mass layoff, I saw people who had spent their entire careers at the company lose their jobs and become unemployable because they had essentially become bureaucrats.

Startups offer you the possibility of "on the job training" for the most recent technical stack. And precisely because there are too many people with golden handcuffs at the large companies, good startups are a bit easier to get into (not "easy", but "easier").

The downside is that 90% of startups fail, and you need to live with that. At the same time, if you happen to work at one that succeeds spectacularly, you won't have to worry about making a living until you die.


Obviously I don't have a solution to offer anyone, but this is one of my motivations for wanting to learn more about things like category theory, type theory, functional programming...

It's not directly about getting skills that will land me a high paying job (via... marketable resume keywords?), it's more about trying to understand the fundamentals that no PL or framework trend, or social current, can obviate.


I agree with the general idea of math and science. However the fields you mentioned are mostly useful for formalizing computer languages. If you want more concepts to use in actual programs, I think there are more practical areas to look in.


Personally, I find the 'knowledge half-life' problem to be easily solved with spaced repetition. I've used SuperMemo for a few years now and have found it invaluable for retaining and revisiting old concepts, especially when switching between different tech stacks. SuperMemo's priority queue system fixes the issues Anki and others have when collections grow to 10K+ cards, where the daily load can often become unmanageable.

I find 30 minutes a day of SuperMemo, supplemented by an hour of purposefully randomized review on weekends, keeps even very old topics near the forefront of my mind. It's like having a well-maintained toolbox I can throw at different problems at any time. The slight daily effort is more than compensated by the retention gains I've noticed over time.
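
For the curious, here is roughly what that kind of scheduling looks like in its simplest public form: a rough SM-2-style sketch (the old algorithm Anki descends from), not SuperMemo's current algorithm or its priority queue; the names and grading loop are just illustrative.

    # Rough SM-2-style scheduler (illustrative only): each review grade in 0..5
    # adjusts an "easiness" factor and stretches the interval to the next review.
    from dataclasses import dataclass

    @dataclass
    class Card:
        easiness: float = 2.5   # SM-2's default easiness factor
        interval: int = 1       # days until the next review
        reps: int = 0           # successful reviews in a row

    def review(card: Card, grade: int) -> Card:
        if grade < 3:                        # forgot: reset and see it again tomorrow
            card.reps, card.interval = 0, 1
            return card
        card.reps += 1
        if card.reps == 1:
            card.interval = 1
        elif card.reps == 2:
            card.interval = 6
        else:
            card.interval = round(card.interval * card.easiness)
        # harder recalls shrink easiness (floored at 1.3), easy ones grow it
        card.easiness = max(1.3, card.easiness + 0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))
        return card

    c = Card()
    for grade in (5, 4, 5, 3):
        c = review(c, grade)
        print(f"next review in {c.interval} days (easiness {c.easiness:.2f})")

The intervals grow multiplicatively, which is why a half-hour a day can keep a large collection alive.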


I often watch a pair of YouTubers who cover China, and they used to reside there. The one thing that they lamented often was a lack of maintenance, from historic buildings to lightbulbs in elevators. They tried to posit it as something cultural, and I don't know if that is true or not for China.

I do feel like the culture of tech has become one where maintenance is not part of what we do. When was the last time you saw someone get promoted for "cutting costs" or "making the code less complicated"? When was the last time you sat down and read someone else's code and were impressed at how CLEAR and SIMPLE it was?

20 years ago we were all running on bare hardware. Now it's a hypervisor, with a VM, that has a docker container, that imports everything and a cat. We get tools like Artifactory to make up for the fact that all of that might disappear. Top it off with zero ownership (infrastructure as a cost center and not a depreciable asset).

It feels like a fuck shit stack of wet turds and we're trying to play Jenga with them, shuffling them back to the top and trying to get our next gig before the whole thing falls over and hopefully lands in someone else's lap.

To make a point: do we need Docker? No, but writing installable software is hard (depending on language and what you're doing). Docker doesn't fix the problem; it just adds another layer in.

The original service is the database. Yet we don't normally expose it because its security model remains rooted in the 19xx's. So we have layers of shims over it, ORM, JSON... pick your stack and "scalable" abstraction.

The author mentions LLMs. The more I play with them the more I feel like this is going to be cool after engineers beat it into submission over the course of the next few years. So much is opaque, and there is so little clarity on what is going on under the hood. It's no wonder people are having trouble coming to grips with it; it's a mess! If it were a battery breakthrough it would take 10 years to turn it into a mass-producible product, but because it's software we throw money at it and let it dangle out on the web!!! (and I don't fear AGI)

FTA: >> I don’t have a prescriptive solution for this. I wrote this text to start a discussion around a feeling I previously struggled with but didn’t know how to label.

I do. We need to do better. We need to stop piling on more and strip back to the clear and well understood. We need to value readable code over DRY, or design patterns, or whatever the next FOTM is. We need to laud people who make systems simpler, who reduce costs, who reshape and extend existing things rather than build new ones and pile more on top because it's "easy" or looks good on the resume.

I am part of the problem too, and I need to stop. I need to stop reaching for that next shiny bit of tech, next framework, next library. Because acting like a schizophrenic ADHD child finding candy hasn't really served anyone.


Before looking at technical problems, we need to look at organizational problems. Is the tech there to solve a business problem, or is it there to solicit the next VC round, an invite to a cloud provider conference, an expensive dinner paid for by some vendor, or to perpetuate a career based on flawed technology?

A lot of the problems, associated tooling and "best practices" you mention arose as a result of the VC bubble from the last decade, where the primary objective was not to solve the business problem but to manufacture complexity so the next VC round and large headcount could be justified.

Sadly, this is not limited to VC - either collusion or technical incompetence is rampant at the executive level, which means crap vendors can nevertheless get their "solutions" into companies and lock them in. Do this long enough, and entire careers start relying on these "solutions" so you get a guaranteed supply of people who can collude with you to bleed their company dry.

See also:

* cloud computing

* blockchain

* microservices

* resume-driven-development


Would you mind listing those youtubers which cover China? Sounds interesting.

Also I'd like to kindly ask you not to use "ADHD child" in that manner because I think it stigmatizes it although I do understand the point you were trying to make there.


Here they are covering the topic directly: https://www.youtube.com/watch?v=o9eXi3RL8q4

As for the ADHD thing, I get it; it's also a pretty accurate description of how I feel some days working in this industry. It's hard not to be a technological magpie collecting shiny rocks!


For context: both of these YouTubers were eventually denied stay in China and turned their channel into bashing China full time for a living.

I really valued their insight and perspective (rural China by motorcycle, for example, is not a common perspective in the west), but eventually had to unsubscribe from their toxic bitterness.


Yeah, it's a shame they've been audience captured. At the beginning they leaned a bit to the rosy side, clearly glossing over visible negatives. Somewhere around when they left and were able to speak about good and bad, but hadn't committed to a narrative, was probably the point of peak value. Now they are almost comically anti-China. Ah well.


Plenty of toxic bitterness for sure, but there is signal in the noise, and unless you personally know people in China who are free thinking and also consider you close enough that they would tell you things that could incriminate them, this may be the only source of that signal...


Cutting costs by adopting a better architecture was a big thing at one of my previous jobs. People were praised and promoted for cutting thousands of dollars off monthly AWS bills.


> People were praised and promoted for cutting thousands of dollars off monthly AWS bills.

You're in a place that is rarer than you think; a lot of us have experiences more like this:

https://news.ycombinator.com/item?id=38069710


> Cutting costs by adopting a better architecture was a big thing at one of my previous jobs. People were praised and promoted for cutting thousands of dollars off monthly AWS bills.

I think this is a slippery slope. Praising is fine, but AWS bills are essentially a non-functional quality attribute of the software. The job is to never even let it become a problem in the first place.

What about teams that have built their software on time and with quality: are they essentially losing one of the opportunities to get promoted because they built better software in the first place?


Same at my current job


Why fix anything if it's easier to add some other layer and kick the problems down the line?

It's the economically rational thing to do... you wouldn't want your kids to have a boring future.

Gotta leave some problems for them; heck, cause some because we didn't know any better anyway.


>To make a point: do we need docker? No, but writing installable software is hard

Writing installable software doesn't help with isolation, self-healing, and scalability. In a microservices world you kind of need Docker/Kubernetes.


Hum... Docker isn't a great solution for isolation (it can do some of it, but it promises way more than it can deliver).

Your system's scalability is completely defined by how you architect it, and Docker changes nothing.

And WTF is self healing? Docker can't fix any ops problem that it didn't create.


>And WTF is self healing? Docker can't fix any ops problem that it didn't create.

The gp you replied to mentioned both "Docker/Kubernetes"

It's the Kubernetes management layer of Docker-style containers (in pods) that helps with monitoring and restarts: https://www.google.com/search?q=kubernetes+self-healing
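
To make the "self-healing" part concrete, here is a toy sketch of the idea (my own illustration, not how Kubernetes actually implements it; the real thing uses per-container restart policies, liveness probes, and exponential backoff): a supervisor loop that restarts a crashed process.

    # Toy "self-healing" loop: restart a child process whenever it exits non-zero.
    import subprocess
    import time

    # Hypothetical flaky workload that always fails, just to trigger restarts.
    CMD = ["python3", "-c", "import sys; sys.exit(1)"]

    for attempt in range(1, 6):
        result = subprocess.run(CMD)
        if result.returncode == 0:
            print("clean exit, nothing to heal")
            break
        print(f"attempt {attempt}: exited with {result.returncode}, restarting in 2s")
        time.sleep(2)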


Agreed. Could you please share your thoughts on the best current solution for isolation? Thanks in advance!


VMs are designed to provide isolation. Docker depends on what Linux provides, and Linux puts less and less importance on it as time passes.


Um. VMs were doing the thing you said before there was Docker.


> If it were a battery break through it would take 10 years to turn it into a mass producible product

I think it really did take approximately that much time for LLMs as well. The first transformers paper came out in 2017, almost 6 years back.

Text-to-code came almost 2-3 years before ChatGPT: https://www.microsoft.com/en-us/research/video/intellicode-c...

So even for software with the same underlying tech, i.e. transformers, it took almost 5 years to get to a breakthrough that could be scaled.

I really like paulg's observation that "knowledge grows fractally". If you put the scope of ChatGPT as all human jobs, it would still seem that we have only scratched the surface. The same goes for throwing money at it: we are only throwing a very small fraction of the money we could.

> We need to value readable code over DRY, or design patterns, or whatever the next FOTM is. We need to laud people who make systems simpler, who reduce costs, who reshape and extend existing things rather than build new ones and pile more on top because it's "easy" or looks good on the resume.

Not just in tech; it's always been hard to quantify and reward people based on the non-functional attributes of a system's output.

> I am part of the problem too, and I need to stop. I need to stop reaching for that next shiny bit of tech, next framework, next library. Because acting like a schizophrenic ADHD child finding candy hasn't really served anyone.

Referencing paulg again, I think this reach for the next shiny bit of tech should still happen, but fractally: in the context of everything that you do, reaching for new tech should be a small part of it, but still an essential component of growth.


>I do. We need to do better. We need to stop piling on more and strip back to the clear and well understood. We need to value readable code over DRY, or design patterns, or whatever the next FOTM

This sentiment is common in people who lack an understanding of why each of the stack elements currently in place was put there. I'm not picking on the parent specifically, but having been working in "big tech" for quite some time, I'm meeting younger people who are starting their first gig in a "big tech stack" company (at a senior position due to their entire 5 years of experience) and their first instinct is as above: "Why are you using all this crap? Just rip it all out and start from scratch!"

No

I was like that a couple of times in my career, and being more convincing, I was allowed to "rip it all out" on more than one occasion. A year later my system was better than the original, but by the time I finished it was already out of date with "modern practices", and during that year I rediscovered every single seemingly stupid decision I saw made in the original system.

Now, when I see something that doesn't make sense, that looks like a mind-boggling tech stack doing almost nothing (and yet it works well), I ask myself: what is it I don't know about this? What documentation is lacking/out of date? (All of it, usually.) I then dive into the source code and learn why things were done the way they are. Also, knowing how the entire stack works from the bare metal to k8s and "serverless" helps.

If I were an educator I'd make up an IT curriculum teaching the basics of how computers work with something like BASIC on 8-bit machines, later assembly.

Then I'd go through features of modern hardware, multi-cores, caches, etc.

Then networking basics with a focus on low-level protocols (TCP/IP, Ethernet, VLANs, VPNs), with some layer 7 stuff like HTTP(S), SMTP, etc. added at the same time as OS-level knowledge based on Linux on the console and Windows Server is introduced (as well as bash/powershell/python scripting). I'd have students code simple servers/clients, stand up their own SMTP gateway and web server on bare metal. Data science should run from this time on too.

Only after they've been using and learning on bare metal for at least a year would I start with hypervisors, teaching things like distributed switching and more advanced features like FT and HA using virtual machines. Also NAS and SANs at this stage.

Then and only then come containers and k8s, at the same time as serverless and cloud. Then topics like resilience, meeting SLAs, SLOs (risk calculation), business continuity, DR in depth, etc.

As mentioned before, a few things should be taught alongside this throughout: probably Java programming and data science with Python. Maybe ML basics.

I don't know what is taught in IT courses these days, but I strongly suspect not what I listed above. /rant over


The key part of curriculum design is prioritization because you generally have a fixed amount of hours for a degree - every important thing you put in requires taking out something else, and learning a topic in more detail requires learning fewer topics. And I'm not sure if most potential students would value spending a year on bare metal at the cost of throwing out a year's worth of stuff that is more high level. Like, be specific - count the things you're adding, take a look at the CS/SE courses in the program you had, and choose an equivalent number of courses that you would cut out to make space for your proposals - does it still make sense then?


I would love an easy way to learn all those things you mentioned. For my job I only need to know an API framework, Python, SQL, an ORM, really.


When reading and writing was new, there was a real problem of authors becoming reactive and narrowing their writing to a reaction to what they read.

The same concept can happen with coders. They don't have an understanding and purpose for code outside of the computer, and they code reactively. Therefore it is easy for them to forget.

It is also worth noting that the culture and electronic leisure have become very mentally consuming (if you allow them to be). It is possible to use electronic media so much that you forget code.


I think this is two phenomena being observed as one.

While there’s certainly currently an accelerando going on in terms of knowledge growth and change in many fields, there is also bias built into us and our perception of the world - every human has a tendency to be surprised by the degree of change that occurs in their lifetime - we are raised with the world presented as-is, with a static set of truths, yet the only constant thing about the human world is change.


You get used to it. Just approach it every time as “learning a new stack” and those old muscle memories will come back automagically.

Knowing how you got to Z from A is valuable knowledge too. Not just in changes to the old stack that is now new again, but to your own knowledge and journey.

It’s ok to not be the smartest person in the room. It’s ok to admit that you need ramp up. What’s not ok is to go around claiming to be an expert and spout inaccurate facts so you’re already one step closer to being awesome in stack A again.


Darwin


IMHO, bullshit article. The author describes half-life as the time in which the knowledge is superseded or shown to be untrue. However, his actual example is around how he has forgotten math.

It might be the half-life of his retention, but not the half-life of knowledge.


If you clicked the link hoping you will learn something about Gordon Freeman, you will be disappointed.


If I came to HN and learned anything about Gordon Freeman, I would be disappointed.



