There's a part in the Foundation series by Asimov where the Galactic civilization has forgotten how their technology works. They have all this amazing technology, things that generate free energy and terraform planets or whatever, and all that's left are technicians who kind of know that if you push this button that thing is supposed to happen, but they have no idea what the machine is actually doing internally or the physical principles on which it operates. When the machines break, no one is able to fix them.
I think about this a lot and the world seems more and more like this every day.
Here's a talk from Jonathan Blow on the same idea: Preventing the Collapse of Civilization [0].
He mentions Bob Colwell, who was Intel's chief microprocessor architect, once running into issues with chips from TI. He went to TI, and the people there told him they couldn't figure it out either, because those fundamentals had been worked out by the first-generation magicians. And it's not just TI; Motorola and Fairchild have the same kind of problem.
I think in the old days there were people who actually understood how the whole system works in fairly fine detail. A polymath in computing: someone who knows how your Code > Compiler > OS > CPU > Silicon Design > Manufacturing chain works.
I remember an old Intel fellow who understood fabs, designed CPUs, had OS experience, wrote a VM, and did web programming as a hobby. (He gave a talk somewhere, but my Google-fu is failing me.)
Now everyone is so specialised in their own domain that no one has a good overview of anything. And no one understands how all the puzzle pieces were fitted together in the first place.
And after one or two generations, the "why" behind how it was crafted in the first place is lost, and we are left with people specialised in that domain to "maintain" it. Or reinventing the flat tire, as Alan Kay calls it.
There are certainly places in the world, today, where machines from the 60s-70s-80s are running critical processes, and all the original people involved in design and implementation are either dead or retired - so you're stuck with technicians that only have repair manuals, along with spare parts of everything. Bespoke, one-of-a-kind circuits / machines / etc.
It's the problem of keeping things in living memory. The craziest thing about human civilization is every 75 years or so, we have to complete a 100% knowledge transfer of everything we know to a new set of humans who start out not knowing anything about it.
Obviously it's a little more complex than that, but the importance and value of education and documentation is very much in that realm: drop the next generation in a desert, and we go back to the stone age instantly.
Cixin Liu’s SF novel “The Supernova Era,” in which a supernova irradiates the earth in such a way that everyone above the age of 13 will die in a year, takes this as its focus.
Oh man, I got stuck halfway through that one for some reason, but I hadn't thought of it from that angle somehow. I'm inspired to pick it back up... like right now.
Which is why, when we happen to dig up some document from 5000 years ago and eventually translate it, we end up discovering that, beyond the technology and religion of the day, the society was pretty much the same.
The problem isn't that these machines are from the 1960s. It's that we stopped making machines like that. I'm sure someone somewhere along the line convinced someone that it'd be better to do it all over from scratch, but if these machines lasted 60 years, then they can't be all bad. In Italy, there are textile manufacturers that use looms that were built in the 1920s and still make cloth perfectly well, just like the old times.
We do this all the time with technology. It's because we're addicted to change, and we have no judgment about how much change we need and where.
That scenario isn't caused by an addiction to change. It's just not economically viable to continue manufacturing obsolete equipment for a shrinking customer base.
I agree that the problem isn't that these machines are from the 60s.
They may have even been cutting edge then.
Indeed, the problem is that the machine's user faced a choice between three possibilities: replacing the machine as expertise in the old tech dwindled; building and maintaining that expertise over the course of decades; or doing neither and letting the equipment lapse into inevitable disrepair. They chose the last.
Change is constant, which is good, because there is no progress without it. The only choice we have as individuals is how, and how well, we adapt to it.
Last time I cleaned my apt I managed to delete my grub conf somehow and had to spend the day learning to manually mount an encrypted zfs pool. Luckily I have some etc keeper thing that keeps a git log of all changes to my conf files so I could restore it.
I do think it's a bit telling that we have a lot of automated software that exists to help us style our code, but there doesn't seem to be nearly as much investment in documenting code.
Simple things like a linter that yells at you if a class has an undocumented method would probably at least be a step in the right direction. People may complain that it would lead to overdocumentation but I'd argue that it's probably better than underdocumentation.
Even something as simple as a warning when there appears to be a long code block without documentation would probably be a step in a decent direction. What seems trivial and obvious today likely will not be in a matter of weeks, and it generally just gets worse from there.
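To make that concrete, here's a minimal sketch of such a linter in Python, using the standard ast module. The function name and message format are just made up for illustration; real tools in this space (pydocstyle, or Rust's missing_docs lint) are more thorough.

    import ast
    import sys

    def find_undocumented(path: str) -> list[str]:
        """Return a warning for every class or function in a file that lacks a docstring."""
        with open(path) as f:
            tree = ast.parse(f.read(), filename=path)

        warnings = []
        for node in ast.walk(tree):
            if isinstance(node, (ast.ClassDef, ast.FunctionDef, ast.AsyncFunctionDef)):
                if ast.get_docstring(node) is None:
                    warnings.append(f"{path}:{node.lineno}: '{node.name}' has no docstring")
        return warnings

    if __name__ == "__main__":
        for warning in find_undocumented(sys.argv[1]):
            print(warning)

Wiring something like that into CI is the part that would actually create pressure to document, the same way formatters only changed habits once they became blocking checks.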
That's actually super cool; I wish I had more reasons to reach for Rust on a regular basis but this is definitely something I'll keep in mind should I ever get the opportunity to use Rust in any capacity.
Visual Studio nags you if public methods and properties aren't documented. It's not a panacea, and in fact it's easy to work around, but at least they're trying.
IME, the only thing that works is to attack it from a Project level: make usable documentation (needs a review before being signed off) a deliverable or the project isn't done.
> Tell me more about how people hate writing documentation
The thing is, I don't think this happens in isolation. At least not for me. I hate writing documentation because time is not allotted for it. So I have to rush it, and in the process I feel like I'm doing a poor job, which is demoralizing. I'm gonna dread it next time I have to do it.
I saw this a lot with Y2K. Suddenly all those places that had been ticking along for a couple of decades on their COBOL line-of-business systems with minor maintenance had to make some major changes.
The choice, really, was "hire some ludicrously expensive COBOL devs, or replace the entire system". Replacing the entire system failed on every sane business evaluation: high-risk, hugely expensive, no guarantee that the new system will even work. So they hired the ludicrously expensive COBOL devs (because the devs needed to understand 1970's-era COBOL, and they were rare in the OO-frenzied 1990's) and patched up the old system.
But as time wears on, those systems fall further and further behind, and the COBOL devs who actually know how to maintain them get more and more expensive (or actually unavailable). The costs and risks of replacing the system are still too high for any given marginal change, but the marginal changes are getting very, very expensive.
And then 2038[0] rolls around and they'll face the same choice, and the same risks will come up. It'll be interesting to see what they do, and what choices are available. Patching the old system may well not be possible at this point, because there's nobody left who understands it; and migrating the complex business logic to a new system may not be possible either, because no one is left who understands the old code.
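For anyone wondering where the 2038 date comes from: a signed 32-bit time_t counts seconds from the Unix epoch and tops out at 2^31 - 1. A quick illustration in Python:

    from datetime import datetime, timedelta, timezone

    # A signed 32-bit time_t overflows 2**31 - 1 seconds after the Unix epoch.
    print(datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=2**31 - 1))
    # -> 2038-01-19 03:14:07+00:00

Any system still storing timestamps in a signed 32-bit integer wraps around at that moment.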
I don't think them falling behind is necessarily bad. Modern systems, languages and programming practices are much nicer, but they also have a lot more effort behind processors, compilers, frameworks, operating systems, etc., so overall the system is much more complex.
If anything, I think having a separate, archaic, mostly frozen system base for critical infrastructure is a good thing. The current ancient COBOL abandonware situation is probably not the simple and mostly standardized solution one may wish for, but it's a lot closer.
I'd agree, if nothing ever changed and it was feasible to run them forever. But it's not. At some point (probably 2038) the whole thing will fall over and a new system will have to replace it.
Fortunately we still have 17 years till then, so surely all these companies have started preparing for this inevitability already, right? (I know the answer)
> the COBOL devs who actually know how to maintain them get more and more expensive
COBOL doesn't actually pay that well, no matter how badly the business depends on it. The law of supply and demand would suggest that it should, but it still really doesn't.
They are a sign of a deficiency in the tech business model. Maintenance is a bad business, and so the only way to keep things understood and working is to build new things.
This is fine as long as the "in maintenance only" portion of the industry remains small, which will only remain true while the industry is growing exponentially. However, as with industrial companies, there will eventually be a physical limit to how much better computer networks, microchips, web search, cameras and other products can get.
Once this happens, growth in the core parts of the industry will slow, and more software will be in maintenance. However, software components are both opaque and orders of magnitude more complex than physical parts. As an example, losing practical knowledge of the Linux kernel means losing knowledge of ~27.8 million lines of code.
There may come a time in a number of decades that getting a driver patched is an impractical activity due to the scarcity of knowledge ( or potentially even the ability to build the driver... )
Linux drivers are already a mess, and have been since the beginning. But the problem is due to IP concerns by hardware vendors, not a lack of knowledge about the code base.
which makes the problem more likely to emerge when the various hardware companies stop developing new drivers due to lack of significant new hardware evolution.
e.g. why update the NIC drivers if the NIC hasn't changed in 20 years? This could easily turn into "how do we even modify and release our company's NIC driver?"
I would agree that such business has relatively low ROI. Still, for example, Linux distros and *BSD foundations are to an extent a maintenance business.
This is actually good news. It provides a lever for the government to change our entire culture (in a good way). Much easier than convincing every individual that maintenance is required.
Even that is good news. We have 1 knob to turn instead of 8 billion, even if today that knob is on a nonoptimal setting. Convincing 1 government is a lot easier than convincing every individual.
Reliance on deprecated behavior, but without the compiler warning. There may be some hidden risk out there, and we have no idea how much it'll cost to fix or replace.
The example that springs to mind is the Social Security check printers. I think they wound up reading the wire voltages as checks were printing, to duplicate the behavior for the Y2K fix. It was urgent, as many people relied on that income.
There are rare events that you can't do much about up front, they're external. A pandemic might be a good example. There are other rare events you can simply avoid, but it's often tempting to just skip the maintenance and let someone else deal with it when it breaks. It's rare right? Not like we're going to get blamed, or even be around to have to deal with it.
That is probably less of a concern than most other things because the industry is so highly regulated that everything is formally specified and documented.
Now guess what happens if we forget how to make something intrinsic to modern farming.
Not being snarky; just want to point out that current soil erosion rates due to modern farming mean the world will run out of dirt by the end of the century.
A similar analogy from MacIntyre's After Virtue is also apt:
"Imagine that the natural sciences were to suffer the effects of a catastrophe. A series of environmental disasters are blamed by the general public on the scientists. Widespread riots occur, laboratories are burnt down, physicists are lynched, books and instruments are destroyed. ...Later still there is a reaction against this destructive movement and enlightened people seek to revive science, although they have largely forgotten what it was. But all that they possess are fragments: a knowledge of experiments detached from any knowledge of the theoretical context which gave them significance; parts of theories unrelated either to the other bits and pieces of theory which they possess or to experiment; instruments whose use has been forgotten; half chapters from books, single pages from articles, not always fully legible because torn and charred. Nonetheless all these fragments are reembodied in a set of practices which go under the revived name of physics, chemistry and biology. Adults argue with each other about the respective merits of relativity theory, evolution theory and phlogiston theory, although they possess only a very partial knowledge of each...
In such a culture men would use expressions such as 'neutrino', 'mass', 'specific gravity', 'atomic weight' in systematic and often interrelated ways which would resemble in lesser or greater degrees the ways in which such expressions had been used in earlier times before scientific knowledge had been so largely lost. But many of the beliefs presupposed by the use of these expressions would have been lost and there would appear to be an element of arbitrariness and even of choice in their application...What would appear to be rival and competing premises for which no further argumentation could be given would abound. Subjectivist theories of science would appear and would be criticised by those who held that the notion of truth embodied in what they took to be science was incompatible with subjectivism."
Interesting talk by Jonathan Blow based on a similar (though software-focused) premise: "Preventing the Collapse of Civilization" https://www.youtube.com/watch?v=pW-SOdj4Kkk
Fascinating. Although I've never read Asimov, I've often posed the following thought experiment:
Suppose humanity were set back to the stone age. In the sense that all working technology is 100% destroyed into ash. We don't even have a proper hammer or knife. What we do have is written documentation about how the modern world works, and an army of the brightest engineers.
How long would it take to re-create modern tech? And with the documentation/academic papers we currently have, could we even do it? What I mean is, what are all the steps we need to take from building a forge out of mud and flint and steel, to producing an i9 intel chip, 5g networks and space ships?
Think of this as an ansible script that bootstraps cavemen to modernity.
The "How to make everything" YT channel (https://www.youtube.com/channel/UCfIqCzQJXvYj9ssCoHq327g) does that. He started from digging up clay with his bare hands to make pots, and is now roughly in the iron age IIRC, smelting iron using only stuff from earlier 'tech tiers'.
One thing I found interesting is that apparently many metals used to be collected from ores just lying around. Those sites are all exhausted today, so a full 'reset' of the earth wouldn't be possible. Of course metals could be scavenged from the machines we have now (in a nuclear-disaster-style scenario), but we could never fully 'redo' the technical evolution of the last tens of thousands of years.
Without technology, we wouldn’t have the resources to sustain our population, leading to widespread skirmishes for local resources and a rapid devolution of society. Many people would die and we’d probably be set back a thousand years at least. Pockets of the world would escape this fate, having successfully organized around the right leaders. They would probably only be set back a few hundred years and become the new governments of the world a hundred years later. These may feel like long time scales, given that we would somehow have access to all human knowledge, but I feel they are correct. There is an unfathomable amount of work to be done, and a lack of societal stability will constantly set us back even as we make progress. We will basically have to relive much of modern history.
We would lose access to modern tech and ancient tech is lost completely already.
How many people can farm without access to machinery at all?
How many would be able to produce a hammer from scratch? How would you ensure that they are heard and followed?
How many draft animals are present in your area, for example oxen and horses?
The rebuild time just to restore the food supply is too long; we would collapse to a prehistoric stage, with some pockets of places where food stockpiles were not destroyed.
And maybe even worse as animals are not available for hunting!
We would survive as species, but with nearly total population, cultural and technological loss.
There is no conceivable scenario in which we lose access to modern technology. OK, in some far-fetched case we can't use complex electronics. Then what?
Well, no one needs draft animals, because there are still many gasoline and diesel engines around that run without electronics. Since we're not all driving cars around any more, our supplies of fuel will last at least until we can get pumpjacks and small refineries running again.
Believe it or not, the farmers can farm just fine without their $500k tractors, it just means they have to farm a lot smaller area with less output, but their skills will transfer just fine.
This is akin to "preppers" learning to make fire by rubbing sticks together: in what situation do they think they'll run out of matches and cigarette lighters?
I think that no one here was thinking that this scenario is likely?
> all working technology is 100% destroyed into ash. We don't even have a proper hammer or knife. What we do have is written documentation about how the modern world works, and an army of the brightest engineers.
Preparing for such a scenario is a waste of time, but thinking about it can be interesting. The actual complexity and importance of our various infrastructure is interesting in itself.
> This is akin to "preppers" learning to make fire by rubbing sticks together: in what situation do they think they'll run out of matches and cigarette lighters?
This one could actually be directly useful, so it's not an example of something useless.
One thing to mention, though, is that all the easy-to-dig carbon fuel has been dug, so "use coal to power X" just isn't quite as feasible. Biomass just doesn't work for a lot of it.
If everyone is still alive then a "reset" is impossible. Sure the electric grid may die and kill everyone, but the reset only happens after lots of people are dead.
After a few generations of nobody knowing how their technology works anymore, there will be conspiracy theories about how that technology came to be and why it became impossible to replicate.
I think about the same thing a lot lately. It was not supposed to be a prediction but just a warning, I thought, but apparently it was a prediction all along.
It's very easy for people to slip into self-indulgence and pleasant illusions but when something important breaks then nobody has a clue what to do. This happens more and more around me.
I never thought I'd live to see something like this. And I am only 41.
I'm sure there are plenty of EE types here who can tell you how to refine silicon, design and print a circuit, program in binary, assembly, C, and high level languages, explain network stacks, and build a web app.
But really, can anyone really explain how computers work nowadays? Look at how optimisations in cache stuff on the CPU led to horrible security flaws; surely that should have been obvious if anyone really knew what was happening under the hood?
Everything is obvious in hindsight; these "horrible security flaws" you speak of are only a problem now because we're treating software as adversarial. If the machine's owner is 100% in control of the software that runs on the CPU, there is no security boundary between the CPU and the software running on it, so there is no need to even entertain the notion of leaks through speculative execution.
Ubiquitous software delivery from questionable sources is the problem, that's what invalidated the design assumptions underlying the cache optimisations. You might argue that it is the hardware designer's fault to not see how the software world would change in the next 20 years, but you can similarly argue that it's the software designer's fault to not keep in mind the security limitations of the processor it's running on.
Timing attacks are not all that hard to understand - if the timing (or anything else) a user sees depends on information they shouldn't be able to see, then there's a (theoretical, at least) information leak and that's a security flaw.
I'm not a security expert, and this is something that's always in the back of my mind when considering security. It's ... one of the only things in the back of my mind: needing a constant-time string compare for passwords is the most obvious example (and one that literally any decent programmer should be somewhat aware of; see the sketch at the end of this comment). I only know the basics of security, and this is one of the basic principles. It's up there with "escape strings".
People simply don't think about every layer of the stack. If they did, then it would be obvious (but there's so much in the stack that this is infeasible).
JavaScript came out in 1995. Timing attacks on the CPU have been feasible since then (and heck, if you go back further, everything was terminals and multi-user operating systems anyway). There's never been a time when CPUs haven't run potentially hostile operations. But virtually no-one (myself included) thought "can a timing attack happen?", despite timing attacks being a very common type of attack.
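To make the constant-time compare point concrete, here's a rough Python sketch (the function names are mine, and in practice you'd be comparing password hashes rather than raw passwords, but the timing principle is the same). The naive version returns at the first mismatching byte, so response time leaks how long the attacker's matching prefix is; the stdlib's hmac.compare_digest examines every byte regardless.

    import hmac

    def naive_equal(a: bytes, b: bytes) -> bool:
        # Bails out at the first differing byte, so timing reveals
        # the length of the matching prefix.
        if len(a) != len(b):
            return False
        for x, y in zip(a, b):
            if x != y:
                return False
        return True

    def safe_equal(a: bytes, b: bytes) -> bool:
        # Takes roughly the same time no matter where the inputs differ.
        return hmac.compare_digest(a, b)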
This is a key part of the Warhammer 40k lore too; a huge amount of tech is ancient and the technicians who care for it have slowly turned into a religious order that considers their maintenance routines to be religious rites.
I was thinking: maybe Wikipedia will cure this...
Then I thought... could this happen to knowledge repositories too? Could knowledge degrade under "citation needed" disputes and other administrative controversy?
This is also humanity’s situation in Warhammer 40,000—but seeing as how randomly lobotomizing people is the norm in that universe, I hope we don’t end up there :-P
Don't forget that humanity turns into an exceedingly fascistic/fanatically religious and almost absurdly authoritarian dictatorship, with literal superhuman stormtroopers enforcing its might. Also cosmic horrors from beyond time and space, civilizations that literally feed on the souls of tortured prisoners, terrifying unfathomable alien hive minds and warmongering sentient fungus-grown brutes who irreversibly infest every world they touch.
Fun times.
Also originally a scathing commentary on Thatcherism and British society, similar to Judge Dredd, but that has been toned down a lot over the years, unfortunately.
Well, in their defense, the planetary genocide is partially justified. Look at when they call for it:
- Ork infestation. Without scorching most of the air and ground, Ork spores will just give bigger nastier Orks, and a possible staging ground for further infestation.
- Tyranids. Yeah, you're just doing whatever Tyranids would do anyway. Plus denying them resources
- Chaos. Even if there is a single heretic, they can spread their belief around and then use psykers to turn the planet into a Warp world, which is a fate worse than a virus bombing.
(I do know there have been a Tau and a Necron Exterminatus, but I did say partially justified)
It's funny. In Warhammer 40k lore, any society that didn't practice witch burning was essentially wiped out by daemonic incursion. Therefore the Warhammer 40k human world evolved into a racist, xenophobic, psyker-hating society.
This reminds me of the current influx of software guys into the microcontroller space: Arduinos, Pis and the like. You have these super knowledgeable people who are unable to execute the electronics side of their project, so now we have a huge market of things abstracted up to their level (e.g. LEDs with all the necessary electronics already embedded; just call the address and go).
This isn't even just a hobbyist phenomenon. Electronics design itself is becoming more and more "turn-key hardware", where you just slap chips on a board, string them together, then get chugging on the software.
Well it's a lot like a doctor fixing your sneezing :D
The difference between tech and other machines, like cellular life, is that we built it from scratch in a giant literate civilization, so to completely lose the ability to reverse engineer it, or reproduce it by copying, would require quite a dumbing down of humans or a catastrophic loss of resources.
Something I saw somewhere is that in engineering you can copy without even looking at the thing. If you know it's possible, every country seems to somehow get access to it within the next decade. Sometimes it's just a psychological block.
At the core of it, the very core of it, tech is waged work. Just like in manufacturing, which was outsourced quite prolifically. Tech isn't special in this regard.
The thing that hasn't come about yet, though, is that sales, marketing, and accounting can also be outsourced. Dare I say automated. It is, at the end of the day, still waged work. Marketing is not an asset class. Accounting isn't capital ownership. Fabrication is a tool, tech is a tool, social media is a tool.
The moment that this sort of stuff gets automated, there will be a reckoning in how businesses are built in western society. I don't mean a Salesforce competitor with a slicker UI, I mean completely abstracting away payroll and the like.
Imagine for a moment:
You could say to anyone, "I like the work you do, want to work for me?", either in person, online, or even here. They say yes. Now suddenly a pipeline process kicks off, like CI/CD for recruitment. You have your budget metrics all in place, the receiver has their negotiation asks, etc. An AI hashes out a negotiation, in the browser or on your phones, comes up with a contract, and highlights the important bits or deal breakers that need a higher-level sign-off. "Click 'Yes' to work." Done. Hired. Here's your onboarding package and account credentials; your paycheck comes in at the end of the month.
All the while, the two people at the table or on twitter have just been continuing on with their lives. Nobody talked to HR. Nobody had to manually do filings with the IRS. The business owner never opened a dashboard or logged in to their negotiation.ai account. It is not this easy at the moment. But no rule of physics says it can't be in the future.
Entire classes of work could crumble while people could create in a much more competitive manner.
It’s kind of terrifying how few people in this industry take this seriously and dismiss unionizing and organization efforts. Every publicly owned company is incentivized to squeeze every possible penny out of their income statement and we know from history that corporations will fight hard to eliminate high wage labor whenever possible.
Our salaries and power won’t last forever, and if we don’t turn around our culture of worshipping techno-aristocrats quickly then we’re all going to have a rough time in the next few decades.
To play the devil's advocate, that would also mean keeping things inefficient on purpose to avoid job losses.
I want my taxes to be done automatically by some computer without having to deal with a human that's more prone to errors. I want self-driving cars so that I don't have to deal with another human that's more prone to accidents, etc, etc. The list goes on.
I think it's inevitable that we will see some kind of UBI, otherwise there would be big revolutions when most jobs are automated to some degree.
> I want my taxes to be done automatically by some computer without having to deal with a human that's more prone to errors.
This is already in progress in New Zealand. The government simplified the taxation rules and everything is fairly automated now, both the payments and the end-of-year return. I got an email this year from our IRD (≈IRS) telling me that my annual tax return had already been completed by them (electronically) and I just needed to log in to confirm/authorise it.
In Poland the taxation rules were not simplified, but tax filings for individuals are automatically generated based on documents supplied by employers.
You only need to claim subsidies and select the NGO that gets 1% of your tax, or skip that.
The tax return files itself automatically.
If you are in a weird situation, you can relatively easily modify it.
The tax code is still awful (I'm not sure whether I filed my taxes correctly, due to my weird situation), but the frontend is one of the better sites I have seen, and the best government site I have encountered.
Same in NL, your employer already supplies the tax service with your wage information, and banks already supply the tax service with year-in/year-out balance information on your accounts. You can login on their portal, verify the pre-filled data on the form, and submit it. You only need to enter your own deductibles, if relevant.
Around here taxes are automatically deducted from my income by the employer.
Skatteetaten (≈IRS) then sends us a letter in early spring telling us what we owe or what we will get back. We get a chance to add extra expenses etc., and then a final calculation is done and you either get an invoice or a payment, most likely a small payment, as tax payments are typically slightly larger than they need to be.
Except if you run a business, then it is slightly more complicated depending on the size of your business, if you have employees etc. I have a small company so I need to report income from the company as well.
> I think it's inevitable that we will see some kind of UBI
Is it though? I don't think it's impossible we see some kind of cyberpunk dystopia, where the powerful live isolated lives, protected by the police/military, and the rest live in squalor.
What happens if you automate everything, and realize you don't need to feed millions of mouths. And, hey, police-military complex is already automated, so just send the drones and let them remove the excess humans.
> some kind of cyberpunk dystopia, where the powerful live isolated lives, protected by the police/military, and the rest live in squalor.
Even in that scenario, if all useful work is truly automated, there's either still some sort of UBI (just not enough to live comfortably) or the people living in squalor are subsistence farmers eating whatever they can manage to grow in whatever land they can lay claim to.
> To play the devil's advocate, that would also mean keeping things inefficient on purpose to avoid job losses.
Well... yes? The purpose of life isn't just to enhance shareholder value. Some stuff really is just inefficient of course but a lot of what looks like inefficiency in a pure economic sense is actually what people back in the day would have called living a good and satisfying and meaningful life.
Imagine if we did that when printing presses came along replacing the scribes or cars came along replacing the coachman, our civilization would stagnate.
> Imagine if we did that when printing presses came along replacing the scribes or cars came along replacing the coachman, our civilization would stagnate.
Yes, there are things that have taken genuine inefficiencies out, I wouldn't go back to the days e.g. of standing in a queue at the bank rather than online banking. But an efficient world looks like 996 and I don't think that's actually a good way to live https://en.wikipedia.org/wiki/996_working_hour_system
I'm very excited about this reckoning. New grad engineers making 200-300k a year in the US vs senior 10+ YOE engineers making 40k a year globally will be a wonderful collision.
New grads don’t make 200k a year. There are some rare exceptions. Top kids from top schools. Very occasionally a quite special kid from a non-top school. It’s not normal.
And IMO the majority of 10+ Yoe engineers are actually 1 yoe * 10, or more usually 2 yoe * 5.
I think the parent comment is talking about Silicon Valley salaries, where the ballpark starting compensation at FAANG is roughly in line with ~$180k (that was my starting comp many years ago; it has probably increased since). In the Bay, that's the norm, but then again, cost of living is also very high.
You don't have to be from a top school, but it does help.
Yes, I know. But again, it’s not even close to normal. There are a LOT of engineers outside of the bay and the fangs.
It’s a bit like saying that finance pays well in Manhattan. Sure. But most people in the USA who work in finance are making mid/high 5 figures. Wall Street is small.
That's not terrifying, that's awesome. Economists were saying decades ago that by around the year 2000 we shouldn't have to work much because of productivity gains. We would be free while everything was automated. But instead we have billionaires confiscating the gains and owning the means of production. Not arguing, just saying that automating work isn't bad and can be good in itself; it's the rules of society that make it otherwise.
Automation implies boosting productivity. Who gets to keep the surplus (profit)?
During New Deal and Great Society, Capital shared that surplus with Labor. Since the mid 1970s, not so much.
Instead of automating, we could simply eliminate huge swathes of work. Squint a bit, and all the bureaucratic burden (bullshit jobs) just look like very complicated rent seeking.
So you mean, how work was done before the modern nation state, income tax, passports, visas and mass literacy? All of the work you describe was created by human bureaucracies for human bureaucracies. Beforehand you didn't need the paperwork.
The difference being, before there was no paperwork, at all. In such an automated system, the checks, balances and oversight all still exist. They're just invisible to the human eye rather than libertarian style just gone.
You don't need AI or advanced software for automated background bureaucracy done well. All of that is possible today, and the amount of human time the extra work consumes today is still ultimately a choice humanity makes.
The real trick will be to make sure that the AI negotiating on your behalf is really your advocate and has not somehow been subverted (by any variety of means).
Basically the same dilemma industries with actual union representation encounter when individuals are picking what union to join today. Some unions have lower membership prices, some have more negotiation strength, is the organization competent etc.
Sounds like a business opportunity if 3rd party algorithms become a thing. Pick our negotiation bot! Will get you the best deal! Only $100 per analysis!
Then go a step further and get ensemble negotiation bots that aggregate the negotiation providers. Or meta-negotiate the best deal among the negotiation AI providers.
Sounds like a precursor to Decentralized Autonomous Organizations. Although previous experiments in these have been disastrous, to say the least, I find the idea very appealing.
The DAO fiasco from a few years ago [1] comes to mind. Personally I think capital allocation is way too complex an endeavor to automate. Data annotations maybe...
Not quite what you were asking, but the first thing that popped into my head are "dispute resolution organizations" which come from Libertarianism and other schools of thought. Might be an interesting avenue of reading if you want to learn more about inter-actor relations and contracts in the absence of centralized control.
Automation is the next outsourcing. I'm one of the people making it happen.
People forget that it doesn't need to be 100% automated. If I can automate 25% of the work of a staff-level engineer, that's amazing. If I can do 50% of a junior engineer, superb.
Right now software engineering is still in baby steps as far as tools are concerned, but the next decade will see some remarkable stuff happen once we have a more solid foundation and start applying the previous few decades' progress in machine learning and statistics.
> People forget that it doesn't need to be 100% automated. If I can automate 25% of the work of a staff-level engineer, that's amazing. If I can do 50% of a junior engineer, superb.
This has been the trend since ENIAC.
And yet, the demand for programming is higher than ever. It could be another century before those lines intersect. Maybe longer.
We keep automating ourselves up into higher levels, only to discover that there is a ton more work to be done up there. Which, imo, is a very good thing.
You know, I truly hope that's true. At the same time, that has been the mantra for a while. Back in the 80's, when I was but a wee lass, as it came time to choose a college path, my uncle warned me not to get into software engineering because his buddies at IBM told him that it'd be all automated away in the next decade.
4GLs happened, but the work didn't really become less. (And 5GL spectacularly face-planted)
In the 90s, we got widespread adoption of the Internet and search engines, and the prognosis was we'd just be able to look up everything.
In the aughts, it was javascript & frameworks that were going to eat everybody's lunch.
In the 2010s, did we ever build fancy linear regressions^W^W^W machine learning solutions.
We always talk about how that somehow would reduce the work, but instead we continue to build barely maintainable mountains of complexity at the outer edge of what we can handle.
I wish you success. I'm not holding my breath it'll significantly reduce software engineering work. It'll shift what we do, no doubt. I'd be surprised if this is the decade where we make a significant dent. We'll just heap on more complexity.
> Back in the 80's, when I was but a wee lass, as it came time to choose a college path, my uncle warned me not to get into software engineering because his buddies at IBM told him that it'd be all automated away in the next decade.
> 4GLs happened, but the work didn't really become less. (And 5GL spectacularly face-planted)
Hah. The pipe dream is older than that.
They said that in 5 years' time there would be no demand for professional programmers. You see, there's this new programming language that's so easy to use that anyone can write the software they need themselves.
There is still _so much_ untapped demand for programming that I'm not terribly worried about partial automation. Just look at how many manual processes exist in governments or large corporations that would be more efficient and less error prone if a computer assisted, but the necessary programming is still too expensive to switch over. Sure, programmer wages might decrease in the future to more reasonable levels, but I don't see mass unemployment looming.
From what I’ve seen, I’d say around 50-60% of the world population lives in poverty or close to it.
But take the States. In the 1980s, having a 120,000 usd annual salary would be insane. You could buy a nice place in Santa Barbara for that money.
Due to inflation, that same salary buys much much less in today’s money.
Give it another 20 years, 120k per year will buy even less than that.
But now consider that most people in the 80s would make 20-40k per year. That was enough to have a good life on the high end. A decent car, a house in a good area with a mortgage.
Can you do that now? No. Not unless you live in the sticks, and good luck finding a job there!
So to suggest that engineer salaries should be “more reasonable” is basically saying you want people to be poor in twenty years. Because that’s exactly what will happen.
Oh, and making ends meet on your future 40k-per-year job will be even harder, if not nigh impossible, because most jobs will not be in cheap areas to live.
Hell, even the cheaper areas to live are more expensive now.
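To put rough numbers on that erosion (assuming a flat 3% inflation rate purely for illustration, not as a historical figure), a fixed nominal salary loses about half its purchasing power in roughly 23 years and about two thirds over 40:

    # Illustration only: a flat 3% yearly inflation rate is assumed, not historical data.
    salary = 120_000
    rate = 0.03

    for years in (20, 40):
        real_value = salary / (1 + rate) ** years
        print(f"after {years} years: about ${real_value:,.0f} in original purchasing power")
    # -> about $66,441 after 20 years, about $36,787 after 40

So even a salary that looks generous today quietly shrinks unless it keeps being renegotiated upward.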
>The thing that hasn't come about yet, though, is that sales, marketing, and accounting can also be outsourced. Dare I say automated.
Yep, just consider how much sales, marketing and accounting work the App and Play store has automated. Not to mention software delivery and installation work.
I think it might not be a coincidence, just in the opposite direction from what you are suggesting. The non-bullshit jobs will be automated; the bullshit will remain. Many of the categories of bullshit jobs are actually unusually resistant to automation.
> flunkies, who serve to make their superiors feel important, e.g., receptionists, administrative assistants, door attendants
Impossible to automate. The whole point of the job is to be wasting a human's time to feed someone's ego.
> goons, who act to harm or deceive others on behalf of their employer, e.g., lobbyists, corporate lawyers, telemarketers, public relations specialists
Automation will make your goons more effective, but the arms race against the other side's goons who also have automation will prevent the jobs from going away.
> duct tapers, who temporarily fix problems that could be fixed permanently, e.g., programmers repairing shoddy code, airline desk staff who calm passengers whose bags do not arrive
If they aren't going to pay the upfront cost to fix the problem, they aren't going to pay the upfront cost to automate it either.
It seems to me that you overestimate the cost of people making all that happen currently, and underestimate the difficulty of automating it (and maintain that automation).
A lot of work isn't impossible to automate, it's just not worth the cost. Lots of people unfortunately earn very low salaries, they're cheaper than a machine. This is the case both in manufacturing, and administration.
I mean, if we're just down to the capital class, there simply will not be enough of them to buy from each other to sustain an economy. Those payroll and HR people also happen to buy people's products, which AI never will. If there is no IRS and no tax filers, there goes a lot of purchasing power.
You can't create or compete if no one is there to buy what you make.
Skipping over the same argument that comes up every time an industry gets threatened: farmers, textile workers, miners, machine operators. It boils down to an inability to see what comes next and a fear for one's own quarter in the year the argument gets made.
But why can't AI buy products? What makes a purchase more legitimate by the click of a human finger than by the automated confirmation of a piece of software? The hierarchy of needs? Why can't automated systems that have energy, networking, AWS bills etc. also have needs?
And then there's the simple fact that there is a finite number of mouths to feed. In the billions, but finite. Why can't an industry of trillions of digital actors, which scales far faster than biological ones, be an even bigger economy in its own right?
we could automate the purchasing? Remember the stories about self-driving taxis which would purchase their own maintenance out of their earnings? I think it was supposed to use blockchain.
Our current economy seems to rely on "the modern consumer", as per your point-- and we are told that our main role is to "consume". Seems to me this would be extremely easy to automate. Then we don't need people any more, the robots can consume themselves!
From DoD "2020 Industrial Capabilities Report to Congress"[1] page 9:
> issues confronting our defense industrial base [...] first has been the steady deindustrialization of the United States over the past five decades, including workforce and manufacturing innovation [...] While total manufacturing output has grown during this period, [...] the workforce on which a defense industrial renaissance would depend has become [...] an endangered species.
> Together, a U.S. business climate that has favored short-term shareholder earnings (versus long-term capital investment), deindustrialization, and an abstract, radical vision of “free trade,” without fair trade enforcement, have severely damaged America’s ability to arm itself today and in the future. Our national responses – off-shoring and out-sourcing – have been inadequate and ultimately self-defeating, especially with respect to the defense industrial base.
> These trends have had particular impact on the core element of a successful manufacturing economy: the machine tool industry. Of the world’s top twenty-one machine-tool makers, only two today are American: Gleason and Haas Automation. By contrast, eight are based in Japan, and six in Germany. [...]
Reader beware. Reports to Congress are highly politicized -- both the Department itself and various bureaucrats within are kissing the ring in hopes of protecting/increasing funding. You can safely expect these reports to mention climate change, green energy, diversity, etc. much more often for the next four years. And guess what? The best office politicians serving life terms in the DoD will make sure those reports direct funds in ways that benefit their priorities.
POTUS in 2020 was protectionist, so DoD tells a story about how they and their private sector allies need money and tell it in a way that aligns with that protectionism. The main thesis about why money is needed matches the biases of the person with the pulpit, and then the report helpfully suggests exactly how to achieve that goal that we all of course agree should be the goal. In 2020, onshoring manufacturing. In 2021, maybe still onshoring but maybe play up the whole climate thing a lot more.
"Caveat Emptor" is merely a warning, not necessarily an indict of the product. Reader beware.
Except there's no point I can recall in the history of strategic policymaking where anyone has concluded "yes, having our manufacturing base be outsourced to foreign powers is a sensible decision with no serious impacts on the actual or expected availability of defense capability".
This is pretty much an obvious problem: if you don't have your own arms facilities, then you've got a problem. If the facilities aren't across a land border but across an ocean, then you've got a bigger problem. And if you can't reasonably expect to be able to expand capacity by internal policymaking ("we need to open 5 more factories") then you've got an even bigger problem.
> Except there's no point I can recall in the history of strategic policymaking where anyone has concluded "yes, having our manufacturing base be outsourced to foreign powers is a sensible decision with no serious impacts on the actual or expected availability of defense capability".
Right. Including in the United States over the past 40+ years. The United States does manufacture its weapons domestically, it does stockpile reserves of key components, and it does require domestic manufacturing for key inputs. The linked report doesn't even contest any of this; it just appeals to a "renaissance" in arms manufacturing.
The question is less "is this a real problem in the abstract" and much more "is this a real problem that the US actually has?"
This is really a bit like saying "if a country can't grow its own food then its people will starve". The problem is not with the explicit argument, which is of course common sense. The problem is with the implicit demand ("give us more money for farm subsidies or you will starve"... well, wait, does the US have a food security problem? Do we already have funds in place to help mitigate that risk? Are those risk reduction programs effective? Is more risk reduction necessary? And is the specific spending that's being requested to nominally reduce risk actually going to do so?).
Which is to say, all of this comes before asking the even more important question: even if we are outsourcing our arms manufacturing (we aren't, but even if we were...), will following DoD's recommendations make us less reliant on other nations? Or are we just pissing money away into some DoD bureaucrat's useless jobs program?
Although there is definitely bias, the facts chosen in the quote are alarming regardless.
Only a couple of the major world tooling companies existing in the largest economy in the world is significant.
>If you separate the thinking about things from the doing of things, then innovation will suffer.
I found the author's framework incomplete and not useful. For example, he didn't include any counterexamples. E.g. why is Intel with both in-house chip architecture design capability _and_ chip fabrication factories falling behind in innovation to competitors using the outsource model?
- NVIDIA gpu + outsourcer TSMC is ahead of Intel at hardware for machine learning
- Apple M1 chip + outsourcer TSMC beats x86 for laptop performance
- AMD EPYC chips + outsourcer TSMC best Intel for many server workloads
But that doesn't mean those companies outsource everything. E.g. Apple doesn't outsource the programming of iOS and macOS to outside consultants at Accenture or Thoughtworks. They do that in house. But Apple programmers don't write their own financial back office software. Instead, they use Germany's SAP ERP enterprise system. Likewise, none of SAP employees design and make smartphones for staff to use; they let Apple and Samsung manufacture the phones.
Being strategic about outsourcing is a natural consequence of recognizing that other entities specializing in a competency can do it better/faster/cheaper. How did NASA "innovate" and send astronauts to the moon? They outsourced the work. E.g. The manufacture of space suits was contracted out to ladies bra manufacturer Playtex. The Apollo rockets were made by a combination of companies. NASA was the ultimate outsourcer.
I once worked for AMD, back when they had fabs, so I can speak to that a bit.
There was once a time when IBM paid for new semiconductor manufacturing technology. Applied Materials and TEL and others would actually make the equipment, but IBM would pay for the development at each new node. Then, eventually, they decided they wouldn't pay, they would wait for someone else to do it, because cutting-edge semi's wasn't a core need for them any more.
So, Intel became the new source of funding for research on new production nodes. There were various semi industry consortiums that coordinated all of this, but still someone had to step up with the billions that were required for each new node, and for a while it was Intel that did this.
But then, eventually, the costs of R&D for each new node kept rising faster than the money you got for it, especially given that TSMC and Samsung and the rest also got to buy the new manufacturing equipment from Applied Materials and TEL and so forth, even though they hadn't sunk in the R&D cost to fund it. So, it ended up that the actual new R&D started to be done by the upstream companies that made the equipment which was bought for the fabs. This was already happening by the late 90's when I left the industry; actual semi manuf R&D was done at the equipment manufacturers, not at Intel; Intel was just the one with the financing. It worked until it didn't.
TSMC execs have been quoted as saying they "outsourced their sales and marketing" to American companies.
TSMC's upstream is ASML. And interestingly ASML just recently managed to come up with a "good enough" pellicle (a transparent protector for the lithography mask) for EUV, and they are outsourcing the manufacturing of that to Mitsui as fast as they can.
"The pellicle production tools have been installed in Mitsui, which this year will ramp up EUV pellicles based on ASML’s technology. Mitsui is no stranger to pellicles, and already produces optical pellicles. ASML will continue to do R&D for future pellicles."
There is specialization (comparative advantage) outsourcing, with a living connection between the partners. ASML does the R&D and Mitsui scales it up, etc.
And there's stubbornness driven outsourcing, where companies for some reason don't want to offer a competitive compensation, but are willing to spend a lot on service contracts. (Because they can claim that the risks are managed by the service provider.) And they end up with subpar solutions supplied by whatever vendors that had the patience to deliver to a completely incompetent client (and the necessary audacity to bill them as much as possible).
The post doesn't say "all outsourcing is bad", and your 3 examples are all companies that (as far as I can tell) are very deliberate about what they outsource and deliberate about keeping control of the things they want to keep doing:
NVIDIA is not going to go out to someone else and say "we want a GPU chip", but rather they are designing them end-to-end to make full use of what their production partners can do.
iPhones are built in China, but Apple keeps tight control over how they are built. They control and manage the supply chain, they buy companies making tools used to make iPhones to keep control over this. They operate the cloud service stack around them. They made massive investments into doing more themselves: building a world-class CPU design group to get independence from what other SoC makers offer them. They are now leveraging that to outsource less of the Macbook design: move away from outsourcing CPU design and production to Intel, to design inhouse.
They understand very well what the post warns about: if they stop being involved with these parts of the process, they will a) likely fall behind and b) have a terrible time trying to recover the ability if they need to, so they only outsource selected parts of their work. The breaking points are further down the curve, and they stay the hell away from them.
One could argue that Apple's attempts at making Macs in the US again are an example of how difficult it is to reclaim such ability, even if the company still has the know-how to oversee it. Especially since nearly everybody else in California also has stopped doing this kind of thing - Apple would need to train people a lot. Which Apple at least can afford, if they want to.
I didn't interpret his essay that way. His acknowledgement that some outsourcing is valid doesn't address my criticism.
>and your 3 examples are all companies that (as far as I can tell) are very deliberate about what they outsource and deliberate about keeping control of the things they want to keep doing: NVIDIA is not going to go out to someone else and say "we want a GPU chip", but rather they are designing them end-to-end to make full use of what their production partners can do.
And this is a great example that ties back to the author's point, because he criticized Boeing. Boeing designs the planes and tells the outsourced partners what to make. Boeing then does final assembly in Boeing-owned factories in Washington and South Carolina.
So to use your wording, Boeing does not go to somebody else and say "we want a 787 plane". Boeing does more building than NVIDIA.
I think a fair reading of his essay is that he thinks a company that is more vertically integrated, via less (but not zero) outsourcing, will be more innovative. He was lamenting that outsourcing productivity software like MS Office 365 isn't a good trend, so presumably companies that brought that work in-house would be "more innovative".
> They were even telling the manufacturers look, we only put up requirements, we don’t actually tell you what to do
From other sources:
> Starting with the 787 Dreamliner, launched in 2004, it sought to increase profits by instead providing high-level specifications and then asking suppliers to design more parts themselves. [...]
> Rabin, the former software engineer, recalled one manager saying at an all-hands meeting that Boeing didn’t need senior engineers because its products were mature. “I was shocked that in a room full of a couple hundred mostly senior engineers we were being told that we weren’t needed,”
That's the point where you lose your in-house grip on things, and run into trouble if your contractors are not up to it. Keep that up, and you lose the ability to fix it.
The chip-designing companies are betting that there always will be an external fab that's world-class, and likely better than what they can do themselves. AMD literally couldn't afford to keep up. (and when world-class was inside Intel, they somewhat suffered for it, but didn't really have an alternative)
>That's the point where you lose your in-house grip on things, and run into trouble if your contractors are not up to it.
What definitive conclusion are we supposed to get from that Bloomberg article you cited? That cheap outsourced $9/hr programmers from India caused the MCAS flaws which led to plane crashes? Therefore, if Boeing had used in-house American programmers, it wouldn't have been a problem? But if the specifications for the software were designed by Boeing management, it wouldn't have mattered where the programmers who actually coded it were located. From the investigative reports I read, it wasn't the programmers who decided to rely on 1 Angle-of-Attack sensor instead of cross-checking 2 redundant ones -- it was the aerospace engineers and managers above the programmers.
The Bloomberg article uses a narrative technique to bias the reader a certain way (reader nods in agreement "outsourcing caused the problem") -- rather than present unbiased Root Cause Analysis.
As a counterpoint, Tesla has programmers in-house in California to code the self-driving software, and yet it had serious flaws which contributed to fatalities. So is the correct conclusion that in-house programming is "bad"? Of course not.
Your cite of that type of Bloomberg article gives me an idea of what you found compelling about this thread's blog post. In my case, I was more focused on the "more outsourcing means less innovation" aspect which wasn't convincing.
In the Boeing case, outsourcing is likely a problem not because of the geographical location of the engineers/developers, but the cost-centric mindset that drives such decisions with utter disregard for technology and technical expertise.
Lots of people know the story about killing the golden goose, but few understand the metaphor for the range of actions that actually correspond to slowly strangling the goose, instead of nurturing it.
> But if the specifications for the software were designed by Boeing management, it wouldn't have mattered where the programmers who actually coded it were located. From the investigative reports I read, it wasn't the programmers who decided to rely on 1 Angle-of-Attack sensor instead of cross-checking 2 redundant ones -- it was the aerospace engineers and managers above the programmers.
The issue I've most often seen in working with offshored programmers vs local engineers is that the former will happily code a spec that makes no sense, while the latter will almost immediately raise concerns if they see something that doesn't make sense.
Maybe it's a matter of education (formal engineering vs some bootcamp) or that an engineer's signature often carries legal weight.
In that case, I wonder if there was an actual engineer who reviewed what management requested for MCAS, or did it go straight to the $9/hr body shop?
The fact that the specification was wrong might actually be a pretty good counter argument to your position that outsourcing doesn’t matter. Do you think the same kind of mistakes would have been made if the people writing the specifications actually did some of the software work?
I’m also very sure Boeing does not do more building than NVIDIA. These companies maintain absolute control over design, processes, materials and component sourcing, because it’s critical to their product, whereas Boeing has also outsourced all of that.
>The fact that the specification was wrong might actually be a pretty good counter argument
The flawed specification was done in-house by Boeing employees. That design wasn't outsourced.
>your position that outsourcing doesn’t matter.
That is not my position. You're misrepresenting my argument even though I've stated it clearly: the author's claim that outsourcing leads to less innovation is incomplete and flawed.
Outsourcing does matter and can ruin a company. But it can also enable innovation and the author doesn't cover that scenario. That's my specific criticism of his article.
>I’m also very sure Boeing does not do more building than NVIDIA. These companies maintain absolute control over design, processes, materials and component sourcing, because it’s critical to their product, whereas Boeing has also outsourced all of that.
You've got that backwards. NVIDIA doesn't focus on materials science of chip fabrication. They let TSMC worry about that. NVIDIA employees focus more on engineering the instruction set, the microarchitecture, the firmware, etc. (Somewhat analogous to ARM chip design.) There have been presentations where the CEOs of both NVIDIA and TSMC are on stage together talking about their different core competencies.
>, whereas Boeing has also outsourced all of that.
Why does Boeing hire all these engineering positions if they've outsourced everything?
Let's try to level-set the discussion. NVIDIA outsources the chip manufacturing to TSMC, and then the whole card is outsourced to Foxconn. NVIDIA did not spend billions to build or acquire any factories. See the excerpt from [1]: "NVIDIA does not manufacture anything directly, instead designs are sent to suppliers specializing in the technologies needed to create these devices, and the products are then tested for quality control."
Boeing spent billions building physical factories in Washington, South Carolina, etc. They do final assembly, stress testing of wing loads, systems integration, flight envelope testing, FAA certification, etc. Boeing outsources a lot of components, but they also do a lot of in-house work.
And from all this, we still conclude NVIDIA does more actual building than Boeing?
>Do you think the same kind of mistakes would have been made if the people writing the specifications actually did some of the software work?
There will always be specialization of skills, so expecting non-programmers such as managers to code the software is unrealistic. And yes, Tesla conceivably had a more "closed loop" of in-house software programming instead of outsourcing, and they still made a mistake analogous to MCAS with their self-driving car.
> The flawed specification was done in-house by Boeing employees. That design wasn't outsourced.
I think you may have glossed over this point of the article:
If you separate the thinking about things from the doing of things, then innovation will suffer.
The article is arguing that the specification might have been flawed because the company doesn't have in-house manufacturing expertise any more -- there was not enough knowledge left to validate the designs.
>The article is arguing that the specification might have been flawed because the company doesn't have in-house manufacturing expertise any more -- there was not enough knowledge left to validate the designs.
Even though Airbus also outsources the flight control systems, they did not make the same mistake of relying on only 1 Angle-of-Attack (AoA) sensor in computations.
It doesn't mean Airbus is perfect. Some observers think Airbus' deliberate decision to decouple the pilot and co-pilot joysticks, so that they don't give synchronized physical feedback, is a flaw which contributed to the 2009 Air France 447 crash. The co-pilot mistakenly pulled the joystick back the entire time and the senior pilot was unaware of it. Consequently, Gulfstream Aerospace (they also outsource many components, including flight controls to Honeywell) decided not to copy Airbus' design for their new 500/600 business jets and instead coupled both joysticks together with force feedback, so that both pilots have a physical sense of what the other pilot is doing.
So instead of thinking "outsourcing leads to bad outcomes", there's an alternative explanation of "good or bad outcomes regardless of outsourcing". E.g. Blaming the "outsourcing" can't explain the good & bad outcomes when you study all the case studies of Airbus, Gulfstream, Tesla, Nvidia, etc.
EDIT reply to: >The scale of outsourcing Boeing has done for the 787 is not comparable to AB (or anyone else). They outsourced core competencies, wing design, materials, software, basically everything,
I still don't understand why we begin the analysis with the amount of outsourcing and work backwards from that to conclude that the 787 is a worse airplane. Instead, why can't we consider that the 787 may be considered superior to the Airbus A330 by pilots and airlines? The more heavily outsourced 787 can also be superior to the in-house wing design of the 737 MAX.
>, and it pretty much aligns with the point the author is making: you can’t innovate / design if you don’t know how your products are made.
Similar to the Boeing 787's carbon composite wings, outsourced to Japan, potentially being superior to Airbus' in-house designs...
AMD outsources more than Intel (because AMD outsources to TSMC), and AMD EPYC outperforms Intel Xeon. AMD out-innovated Intel. Intel is so far behind that they've made public statements about possibly outsourcing their chip fabrication in the future. But the AMD innovation story does not align with the author's point. See the flaw in his analysis?
The scale of outsourcing Boeing has done for the 787 is not comparable to AB (or anyone else). They outsourced core competencies, wing design, materials, software, basically everything, and it pretty much aligns with the point the author is making: you can’t innovate / design if you don’t know how your products are made.
But you seem way more interested in proving yourself right than having a productive discussion here. I suggest writing a blog if you’re interested in a monologue, rather than wasting people’s time with inflammatory retorts.
Jenks, who leads the wing team, said the crucial, conceptual stage of the 787’s wing design was “100 percent Boeing.”
To define the shape of the wing and the system of movable flight-control surfaces, Boeing aerodynamicists conducted detailed analysis of performance requirements, historical flight-test information and new wind-tunnel data.
Only after that defining phase of the 787 design did Boeing bring Mitsubishi engineers to Seattle to figure out the broad parameters of the internal structure of the wing.
“We gave them the shape,” Jenks said.
“That is the family jewels,” Noble said. “That part I could never see Boeing sharing in any way, shape or form. That is what our brilliant engineers are able to figure out.”
Boeing engineering legend Joe Sutter, lead designer of the iconic 747 jumbo jet in the late 1960s, agreed that this first design phase is the key.
“That’s the stuff that Boeing still pretty much keeps under its own belt,” said Sutter, who at 86 still talks at aviation gatherings about jet design.
Of course Boeing will say it’s key. Can I just quote back from the same article?
> Boeing delegated to Mitsubishi of Japan a big slice of the design work.
How can that “100%” be true… these are all just empty words.
> If 10 or 15 years from now the world’s leading authority on this kind of structure is in Japan, then you can’t reallocate your resources to do that work,” Sorscher said. “You are dependent on them.”
This is part of the point the original article makes. Nothing here disproves it. The fact that Boeing succeeded (despite several production issues) doesn’t mean they won’t fall victim to the innovation issues that will result from offshoring engineering knowledge.
>How can that “100%” be true… these are all just empty words.
Seems like a reasonable interpretation is that the shape of the wing was 100% Boeing aerodynamic engineers. So the simulations and computational fluid dynamics to design the external flight characteristics were Boeing's. But the internal spar structure and key reinforcements for the carbon fiber structure were Mitsubishi's.
>this kind of structure is in Japan, then you can’t reallocate your resources to do that work,” Sorscher said. “You are dependent on them.”
>This is part of the point the original article makes. Nothing here disproves it.
>Boeing [...] fall victim to the innovation issues that will result from offshoring engineering knowledge.
But Boeing didn't have in-house knowledge to build a carbon fiber wing so there was no expertise to offshore. To get around that limitation, Boeing seemed to execute a very shrewd business playbook:
(1) 2003 : currently have knowledge on building metal wings in Seattle but no expertise on manufacturing new carbon fiber wings
(2) 2004 : outsource carbon fiber wing manufacturing to Mitsubishi in Japan.[1] This also attracts support from the Japanese government and Japan Airlines as first key customers of the new plane. Mitsubishi also helps pay billions for development of the new carbon fiber wing.
(3) 2016 : Boeing builds its own carbon fiber plant in Seattle to switch from outsourcing to inhouse production of the carbon composite wings.[2]
(4) Boeing is no longer dependent on an outsourced supplier for a carbon wing
This thread's article doesn't cover the above scenario either. So lessons learned are: (1) Outsource a key component you're not familiar with. (2) If it later proves to have strategic value, bring it in house.
Boeing used outsourcing to become more innovative -- which is the opposite situation the thread's article was complaining about. He writes paragraphs lamenting about companies outsourcing MS Office 365 but doesn't really dig into business case studies that don't match his thesis.
I still think Apple should bring some manufacturing back to the USA, if for nothing else than goodwill. I do give them credit on customer service. As far as I know, it's still American accents.
I am aching to drop Apple. I will buy their used products, but try to avoid their pricey new stuff. I don't need the latest, lightest computer, and my budget is not what it was ten years ago.
I am waiting for a viable alternative, and it's pretty bleak.
I honestly fear what's going to happen to this once great country. I fear workers are becoming as disenfranchised as myself.
He did get into that: sometimes it's smart to outsource, the example was the fuse on the toaster. But you need to build something; you need to have some core competence where you can innovate because you build.
>He did get into that: sometimes it's smart to outsource, the example was the fuse on the toaster.
The counterexamples I was looking for were companies that didn't fit his thesis instead of a small part like a fuse being outsourced.
The author Bert Hubert keeps emphasizing "making" in addition to the thinking. So a design (thinking) company like NVIDIA doesn't seem to follow his ideal of how an "innovative" company is structured. And another counterexample: Apple in the 1970s used to assemble computers in-house and box them for shipping. That was all outsourced to China decades ago, and yet Apple got more innovative with the 2007 iPhone.
I think the main difference between any toaster manufacturer and Nvidia/Apple is: at some point toaster manufacturers don't even supply their R&D with the means necessary to actually make new stuff nowadays. E.g. if the head of R&D says: I want to try this new heating element I've thought of, management will say: "sure, where can we buy it?". At Apple and Nvidia I suppose there's the infra in place to make it happen, even if you can't buy it. How long that will still be the case, is an open question (just look at Aerospace for the downfall, with private equity eating all their manufacturing capabilities)
To be fair, we don't know if Intel's chip architecture design is falling behind. Their inability to move past the 14nm process is preventing them from using their newer designs.
> We barely develop any software here anymore. So even very European companies like Nokia and Ericsson, that are now trying to tell us that they are building our European telecommunication infrastructure. They’re actually not, they’re getting that built by other people in other countries far away. Anything having to do with server and PC development and manufacturing, there’s nothing left of that in Europe anymore.
This is quite an exaggeration, if not actually an outright lie. Ericsson's main hub for radio software development is in Kista, and there are some 3000 developers in Croatia as well. Some of Ericsson's radios do have their software developed exclusively in China (to my knowledge at least), and there are also a decent amount of developers in Ottawa, but to claim that all of Ericsson's software is "built in countries far away" is highly misleading imo. The 13,000+ Ericsson employees here in Sweden aren't just sitting around doing nothing.
> This is quite an exaggeration, if not actually an outright lie
Or alternately, poor editing? Without the "So", the "software" sentence binds to the previous paragraph, becoming about apps and enterprise software. And the Ericsson sentence binds to the hardware sentence, becoming about telecom hardware manufacturing. Both unremarkable.
I'm still developing software. Asian countries outsource a lot of their telecom eng to open source developers who loved working on telecommunications back before it was marked for deletion in the West, but we keep writing the open source code and posting it online since we enjoy it. I wouldn't be surprised if they had even found a way to repackage it as an opaque blob and sell it back to European telecoms at a profit, since those companies either won't or wouldn't be able to afford hiring someone.
That was a fairly interesting article, thank you for linking it.
The parts about vendors having a great deal of insight into the networks is certainly true from my experience.
For example: I -- an outsourced software developer working for Ericsson here in Europe -- have insight into what software/hardware is deployed in Verizon and AT&T's networks in North America, as well as which of their nodes/radio units are having issues due to software/hardware problems.
Now Ericsson doesn't run and operate Verizon and AT&T's infrastructure, but Ericsson does have a great deal of insight into it via proactive log collection and similar initiatives.
If one of Verizon's radio units in Texas is having a lot of problems, then a software developer in Croatia might end up analyzing some log/crash dumps from it to see what's wrong with it, and then tell someone at Ericsson in Canada to tell Verizon that the unit should probably be replaced.
He mentions the Dreamliner. That project is kind of a poster child for how not to do stuff, but I suspect that many of the problems came about as a result of cultural hysteresis. The engineers and managers were good, but inexperienced in development of such a loosely-coupled project.
I agree with the premise of the talk. In the US, we are facing the same issue with manufacturing. It’s actually impossible to do some types of manufacturing in the US. We’ve crossed the Rubicon. Alea jacta est.
"It was not until the Middle Ages that the letter ⟨W⟩ (originally a ligature of two ⟨V⟩s) was added to the Latin alphabet, to represent sounds from the Germanic languages which did not exist in medieval Latin, and only after the Renaissance did the convention of treating ⟨I⟩ and ⟨U⟩ as vowels, and ⟨J⟩ and ⟨V⟩ as consonants, become established. Prior to that, the former had been merely allographs of the latter."
So at Julius Caesar's time, I and J were the same letter that you could write either way.
> It is a “i” and not a “j”. **J**ulius Caesar said …
Why not "Iulius Caesar", for consistency? :-)
In any case, English may not be the right language to bicker about spelling and pronunciation in. It sports one of the most idiosyncratic orthographic systems in existence.
English also has a long history of butchering names, words and sounds, including its own (e.g. the Great Vowel Shift [0]). Iulius vs Julius is the least of its worries.
This explains the stagnation in the airline industry pretty well. They don't make the planes, so there isn't a lot they can change about the in-flight experience, and they don't run the airports, so there isn't much they can do about the boarding experience.
There's a Planet Money episode where they talk about an airline that made more money selling oil futures than flying planes.[1] Maybe any sufficiently outsourced company becomes indistinguishable from a finance company.
The stagnation of experience is totally a choice made by humans. We could let it be like boarding a shinkansen in Japan, where your mom can hug you at the fare gate, you can bring bottled water, and you don't have to take off your shoes or take anything out of your bags, but we chose not to. We could do CGP Grey-style boarding, but we do not: https://www.youtube.com/watch?v=3e5Jn2gG8Eg
1. Fools enough people to think the government has their shit together and is protecting people.
2. Surveillance.
3. Make work jobs program.
The whole shoe thing is because of Richard Reid, the shoe bomber. There was a later foiled "underwear bomber"; I'm glad the government didn't have the same reaction to him as they did to Richard Reid. Actually, now that I think about it, they have full-body scanners. I haven't been on a plane since probably 2006.
we’ve collectively chosen to let financialization ruin societies by allowing money to be the only viable yardstick of worth and value. unconstrained capitalism reduces this way through its natural instabilities, and not without unintended (and intended) consequences. it can only foster better society if we level the playing field and rigorously ensure competition.
I recently read an excellent short book on this topic, titled “Capitalizing on Crisis” [1], in which the author traces the path that led to today’s situation, where a large portion of profit across all companies in the US comes from financial activities.
It’s an excellent read, albeit repetitive at times.
The more radical theory I've come across is that Capitalism is always in crisis. It is not a misstep or a shortcoming of some kind, but rather a fundamental feature. Capitalism works best in crisis because it is more inventive and flexible; it re-invents itself to endure the chaos it creates, be that social unrest, environmental degradation, financial collapse, wars, etc. This was Marx's mistake (well, one of them at least): Capitalism will not crumble because of its internal contradictions. The contradictions lead to crises which only give it more strength.
In that sense, finance is but one of the problems. It might have a central position in the current run of crises in technical terms, but only insofar as it is the most powerful tool the capitalist has to extend their reach. Even if the financial system were to be sanitized overnight, it would not imply the end of crisis for Capitalism.
i don't think instability (aka crisis) is a fundamental feature of capitalism, but rather, of complex (human) systems in general. early writers on capitalism like adam smith and marx had identified the likeliest first-order instabilities pretty clearly, and provided reasonable safeguards against them. but these were not popular with wealthholders/powerseekers, so were weakly instantiated to begin with and weakened over time so that we now have a corporate fascist form of capitalism taking hold in countries like the US.
for capitalism to work, we need to put all of our regulatory energies into creating fair and transparent markets incentivized toward productive activity and against rent-seeking (these are among the most important of those first-order instability guards for capitalism).
I understand the point you're trying to make, but frankly I'm not convinced that monopolies/oligopolistic collusion aren't an inevitable end result of any ostensibly "competitive" market environment. Given that capitalism on a fundamental level is about profit maximization, and monopolies/vertical integration/oligopolies and other similar situations are inevitably the most effective way to achieve this, how do you avoid this sort of thing in any system that can be described as capitalistic? I'm thinking especially of situations like telecoms, where the barrier to entry is impossibly high and so new players in the market just aren't a thing that happens (easily).
i'd posit that very few markets have natural winner-take-all characteristics (even infrastructure can be market-based, with the right conditions, e.g., tokyo public transportation), and consolidation (vertical or otherwise) is not an inevitable outcome in free and fair markets because you'd need to obviate them away in the course of leveling the playing field.
for instance, look at many of the markets of the 70's, just before trickle-down ushered in decades of greed-driven regulatory disembowelment. markets work best when there are 7-9+, if not dozens of, mostly mid-sized competitors. outsized profit conditions are meant to be fleeting, as a temporary reward to encourage constant exploration, risk-taking, and creation (and creative destruction).
I agree with the observation of how things are (high on outsourcing, low on tech & talent). I do disagree with how things got there.
In a competitive market, when you outsource, you get immediate cost savings. If your competitor outsourced more things than you have, you'll be at a financial disadvantage for some amount of time before the "innovation debt" catches up. That can be decades - the quality of the outsourced parts can remain equivalent or superior for quite a while (or even perpetually, in the case of fuses).
A similar thing happens with companies that do actually want to innovate. All of them are spending all available resources competing with each other, so the R&D for big tech projects simply cannot happen without external intervention or external funding. Historically, none of the well-staffed and well-funded research labs have been funded by companies whose products are a commodity.
>In a competitive market, when you outsource, you get immediate cost savings.
That's a good point. In addition, there's the (mostly) inevitable tendency for economies of scale to push for outsourcing. Several companies I've worked for shut down their board shops while I was there; there's just no way to practically keep up with that. Fabs got bigger. Specialty sheet metal shops can pound out the work faster than you can.
One related thing I've noticed is that older companies (dunno about places that make exclusively software) are never well equipped to deal with perpetually cheaper products with smaller margins.
As a side note, I suspect that the real magic in making toasters, if all done in-house and using simple inputs, is to design the manufacturing facility. The toaster itself is relatively simple. The Rouge must really have been something.
Well, for those reasons and more, all consumer hardware is about design for manufacturability these days. If you can't get the manufacturing right, scaling doesn't happen. Redesign will be costly or impossible without having those design methodologies front and center at the start. If your design is only possible with CNC, then say goodbye to the larger market.
I agree with you ('consumer' being the keyword, if it's meant to mean 'high volume') but I wonder how a company digs its way out of the situation defined in the thread's article.
Is a company meant to become better at toaster components? To push the component complexity down to a more basic level and then have more bespoke manufacturing done? Or to become a manufacturer/final assembler, and all that that really implies?
The real answer I guess is to command that your outsourced engineers design in some IoT toaster sorcery that allows you to sell toaster information to third parties.
I've seen a couple of ways industries change on this. One is when a funded startup comes in and brings the necessary progress. Another is when the industry forms a group and tasks a neutral third party to carry out the R&D.
> In a competitive market, when you outsource, you get immediate cost savings
Isn't that just a different phrasing of what the article means by "in the longer term, [shareholders] are not strategically interested in the company, because if the company doesn’t do well, they will invest in another company. And so a lot of this stuff is actually driven by shareholders and consultancies"? Or are you arguing that it is competition that drives outsourcing, not public ownership?
> the R&D for big tech projects simply cannot happen without external intervention or external funding
I think you may have a point there: the KPN Neherlab mentioned in the article never was KPN's. It was part of PTT, the Dutch national post & telecommunications service. PTT was rebranded KPN when it was privatized, and KPN subsequently closed the Neherlab.
I meant that selling a gadget at +10% price compared to competition will immediately tank sales - unless there is a tangible benefit to in-house dev as well as the customer being aware of this benefit. This is regardless of whether the company is public or not.
While it's easy to nod along to the thesis of the article, I don't know that I agree with the examples.
Is the suggestion that European business has been hollowed out into nothing except a sales channel for imports really true? As far as I can see, manufacturing has been steadily increasing in the EU for at least 30 years, with the exception of two one-off events (the 2008 financial crisis, Covid). Likewise for exports.
And when it comes to telcos, why in the world would we want them writing their own tech? Very few of them have sufficient scale to make building their own basic infrastructure sensible. All they could plausibly be writing is value added services that nobody actually wants, rather than being dumb pipes.
(I do think telcos shouldn't outsource their network operations as a whole. Outsourcing individual commodity functions like DNS seems kind of reasonable though.)
> [...] European business has been hollowed out [...] manufacturing has been steadily increasing in the EU
Note that EU-heavyweight Germany has an atypical-for-Europe emphasis on manufacturing. Perhaps making those two thoughts compatible.
Also, people speak of US total manufacturing output increasing, while at the same time US DoD speaks of five decades of US deindustrialization being severely damaging.
First, Unix and Erlang are not good examples here. They were more like tiny research projects that escaped the lab. No particular economies of scale needed. In comparison, the technical artifacts these companies were actually aiming to produce (e.g. networking equipment) had hundreds of times more people working on them.
Second, Ericsson is not a telco, they're a supplier. Exactly the kind of entity that can achieve economics of scale on building say a mobile gateway router or a billing system, by building one that can be used by a hundred telcos, not just one.
Vertical integration did make more sense for the AT&T of old. The market didn't yet exist, or wasn't standardized enough, for them to be able to depend on that.
Well, telcos did lead the pack worldwide in networking and mobile comms (GSM started as a French acronym), as well as, as mentioned, Unix, C and C++, and many mobile concepts and services before Apple and Google replaced those.
> All ready to be swept away but any real tech or mega Corp.
Regulations around the spectrum and the concept of auctions make it impossible for any new player to enter the market. Spectrum and radio equipment should be owned by the government and communication providers can just buy bandwidth/airtime on it at a fair price. That works successfully for things like the power grid, water/gas distribution infrastructure, etc.
The people who have jobs in zombie firms are fragilizing their careers. If you are just managing a process, your job will be automated. Your leadership won't hesitate to lay you off. If your firm is protected by intellectual property which isn't respected by China, and your firm doesn't even have the skills to defend its own network, you can't keep your "property". If you train your Asian partners how to work within your regulatory framework, they will just hire a few members of your staff away, then open a competing firm in your own country and completely dominate you. The whole western system is being dismantled, thank you McKinsey & Company.
Why don't people leave? Because they want their buy-out. They want the package that comes from grinding all of those years away at the zombie firm.
"Losing out without the tech" would be a better title.
Also, Europe may start to realize that it has to develop some tech, such as Cloud services, itself. As @jbverschoor mentions, the European attitude needs to change to consider tech as a core business, instead of a cost center, and pay engineers accordingly.
... but Europe is not just some dude making decisions all by itself. So who should do exactly what? Sure, maybe the corporate culture needs to change. Maybe CEOs, boards, investors/shareholders, regulators (and managers, and employees, and education, and ...) have to change to change corp culture. So how?
Also, why exactly does Europe have to do this? Is Cloud the new gunpowder? Should the EU Commission just run a big OpenStack cluster?
(So these big alliances are completely useless. Just like OpenStack itself. Instead of this 10B EUR for some pet projects of random big and old industrial companies, the Commission should work on making it easier to start a company, hire/employ folks anywhere, sell stuff, make contracts, develop standards, report problems [ie. corruption], and it should create X-prize like grants/competitions, it should give incentives to build and maintain direct to end-user technology. Anyway, this is just like the F35 in the USA, members states want to spend the money at home.)
Tech is the new gunpowder. USA and China government understand how to prioritize and plan this. The European Union needs a 5 to 10 year tech plan like this older Chinese one: https://crsreports.congress.gov/product/pdf/IF/IF11684
Yes, gunpowder is literally tech too after all, and there's no shortage of tech in the EU. The EU Horizon2020 was a great initiative for example.
The EU thinks in 7 year budget cycles.
I too recommend more spending on tech and science.
That said the direct EU budget is not particularly huge (just 1% of the GNI of all of the member states), but members should spend on tech more anyway. And the NGEU (next gen EU) COVID relief fund is "just" 750B EUR. (Though again, member states themselves will also spend a lot of money on boosting the recovery.)
Naturally there are obvious strategically important areas where the EU as whole should do direct investment into R&D. (Food, energy security and health for example.)
China does understand that the best way to build up a particular industrial capability is to do it. "Just do it". Especially using tech transfer agreements. Hence their amazing railway network. (But their 5-year plan is not particularly interesting.)
----
I'd argue that the EU should just do more (a lot more!) of what it usually does anyway. (Which is participating in various projects, from ITER, ESA (+ Galileo), to funding a lot of initiatives like NGI.) Also it'd be great to capitalize on GDPR and other differentiating directives. (So partially funding projects that offer GDPR-compliant alternatives.) ... and healthcare, the EU populace is aging, fast .. if that's not great "market" opportunity I don't know what is. ¯\_(ツ)_/¯
But, still, the core issue is that there's not enough direct capital sloshing around in the EU. If the Commission would partially fund targeted VC funds it could make very big waves in various tech sectors.
On the topic of "reversing outsourcing", a few French shoe brands have decided to bring back production to France, and stop outsourcing.
One of them was Le Coq Sportif.
It was challenging because they realized that manufacturing sports shoes is actually difficult and requires expertise that was completely lost in France, because it had been outsourced for 30 years. So they had to re-learn everything.
I was initially confused about this too. The title of the corresponding video of the article replaces "over at" with "in", which is much clearer. In other words, it's not "over" anything it's "over (there) at" the specified places.
> How Tech Loses Out over at Companies, Countries and Continents
> How technology loses out in companies, countries & continents
> How Tech Loses Out
None of these titles are understandable to me. How can it be so hard to give something a simple, intelligible, and coherent title? Or maybe I'm the idiot.
The missing implicit bits and technical conflations are probably what are throwing you. "Technology" here really means technical understanding and the resulting quality, as opposed to mere use. Even the crappy companies still use tech, but they outsource it and do it poorly. "Losing out" means losing in matters of popularity. It is losing out in the same way Semmelweis did in his lifetime over "doctors should wash their hands".
This article / talk seems entirely focused on company boundaries - what's in-sourced and outsourced - but isn't that sort of secondary?
If you have a group of 500 people contributing to the making of a toaster, whether they all call themselves Company A (with many divisions) or are split into many companies, it's kind of the same when the rubber meets the road - it's 500 people working together in some fashion to produce a toaster.
How you draw the lines around them and what company/division names you write on top is sort of secondary.
I think it does matter how the 500 people are set up:
- Information flows become adversarial. I don't want to tell my customer they're wasting money on the heating element, because what they waste ends up in my pocket.
- Incentives are different. Customer wants the best heating element for their use case, but I want the one that I can sell to as many customers as possible, so I'll nudge them towards this one that seems to fit that description.
- Marketing arm sees an opening and wants to innovate. Sadly they don't speak the same language as the techs, metaphorically and literally.
You could say this is the same problem that the whole world has. Why aren't we all a global village, able to make decisions like saving the planet? Incentives are cut up and fall different ways because we don't see each other as one, and can't communicate that well.
Telecommunication is a pretty mature field. Phones aren't changing very much. I think it's obvious that a company in an immature field will innovate a lot; there's a lot of low-hanging fruit. In a mature field it's a waste of $$$.
Everything the author said could still be true, I just think the cause is a lot more intentional and non-mysterious. Of course the company that sells tech that has been commoditized will be non-innovative. Why wouldn't it be?
There are other commoditized products out there that provide much better customer experience and reliability. The other day my internet connection went down and I needed to use 4G as a backup.
In 2021, in the age of Amazon, Uber and cloud computing, where you can pay to get pretty much any product/service delivered on the same day, I couldn't get uninterrupted mobile internet from any of the carriers here no matter how much I was willing to pay.
I had to work around some stupid billing systems and artificial constraints, and sit through minutes of pre-recorded marketing bullshit just to be able to pay them money. They seemed to be more interested in marketing irrelevant crap to me than actually taking my money and giving me the one thing I wanted and was happy to pay big bucks for: internet access.
In the end I couldn't find anything better than topping up 30 bucks at a time and resetting the "bundle" every couple days when the data runs out. There was no way to just tell their system "here's 300 bucks, give me data until this runs out".
I remain unconvinced that a company that outsources production does not add value or perform useful engineering and quality control. Sure, I wish the West had retained some of its manufacturing and design capabilities. But I don't think his examples make a very compelling point.
We're mostly SW folks here, I think, so we generally understand that interfacing components requires a well written contract and appropriate testing at the boundaries. Why is it not the same in manufacturing? If I'm outsourcing the manufacturing of my toaster to various parties, I still have a hand in specifying the pieces, assuring they meet my requirements, and verifying they are put together in a way that upholds my brand value and meets customer expectations. That very much requires engineering talent. It also benefits from comparative advantage and provides my company with more profit, which could theoretically (although seemingly rarely in practice) allow me to provide a superior product at a better price point.
Any decline in quality would be the result of (poor?) business decisions and not inherently a result of outsourcing.
> If I'm outsourcing the manufacturing of my toaster to various parties, I still have a hand in specifying the pieces, assuring they meet my requirements, and verifying they are put together in a way that upholds my brand value and meets customer expectations.
Except eventually you can't, other than in a very vague, handwavy sense. Engineering specification is a skill, and it dies when the manufacturing base moves overseas.
You get maybe a decade of transitional time when you both have an oversupply of now unemployed engineer force AND cost savings from doing the actual work elsewhere, but then people change careers and new college students know better.
> We're mostly SW folks here, I think, so we generally understand that interfacing components requires a well written contract and appropriate testing at the boundaries.
It also means that we understand the overhead in doing so, compared to doing things inside of a team.
I enjoy selling people things that I've written the firmware for and that I've done at least a bit of the manufacturing on. It helps me to think of my customers as real people, even if it's just someone who bought my thing on Amazon. I don't get how people who don't feel that way work. I'm sure they do, and good for them, but I can't really grok how they work.
Hating on Apple is a pet peeve of mine, but when the iPhone came out with its locked-down, it-just-works app store, I knew we were at a watershed moment. Around that time people just stopped being interested in how stuff works, and everything was replaced by techno-shamanism, with the few of us still knowing how to troubleshoot something being viewed as shamans.
The idea that people could dig deep and get our knowledge (and the motivation to do so) just went out.
We see this in cars: the check engine light is the most disgusting thing ever invented. You have so many displays, you could directly show the real error and let me know what it is. But no, we have to go to the techno-shamans to do their job.
We move from not knowing things to the unknowable, the idea that the knowledge is unobtainable. And that is bad.
I dunno, that one seems like it would be quite fussy about the type of bread, and you don't have much control. It also toasts the two sides unevenly, and gets very hot on the outside. I prefer my modern one:
- At any point you can lift it up to see how done it is, and drop it back down.
- If you're toasting frozen bread, you push the lever down and press the "Frozen" button.
- If it pops up too soon for you, you push the lever down and press the "A little bit more" button.
- If you're toasting fruit loaf, you press that button.
- The outside stays touchably cool.
A combination of the bread-temperature-sensing (assuming that's not what my one does) and the more modern features might be even better.
"I launched a telecommunications company called PowerDNS."
Is writing an authoritative and recursive nameserver with flex and bison and licensing the software to some telecoms in Europe the same as "launching a telecommunications company"?
Would he say that the companies that some European telecoms are outsourcing to have "launched telecommunications companies"?
Why not call it what it is: a software company. That now provides SaaS.
A company that is taking on outsourcing work from telecoms that are internet providers. It is itself an illustration of the problem he purports to identify in the presentation.
He didn't hide that, he stated it explicitly a little later: "I used to sell software, and now I sell services". "Telecommunications company" seems more specific than "software company" to me. Taking on outsourcing work is how he came to be in the position of understanding the topic of the talk/article the way he presented it, and understanding how little the "actual" telecommunications companies were doing.
If, as a company, you have outsourced too much, no good engineer will still want to work for your company.
This is the root of the article. The bell companies got so bad at technology (in part due to outsourcing to every consultant on the planet) that they can’t hire smart engineers.
There are lessons to broader companies too. How many large non-tech corporates learned this the hard way? And in an ironic twist, it appears that IBM (the outsourcer) can’t hire engineers on their own either, so they have to buy the RedHats and Turbonomics of the world.
Working on Cloud, that is exactly what happened. Many countries have federal regulations requiring services to be hosted in data centers inside their geographical boundaries. Instead of having a Cloud provider, these larger corporations ask Google, Microsoft and AWS to build and deploy services in these zones. A very dangerous path for European companies. Excellent article.
As an "expert" in telecom, he should really know better. Nobody is under the illusion that regional telecom companies are some sort of tech hubs. Of course they're not. They don't have the economies of scale -- unless they become a gigantic monopoly such as AT&T and then get broken up anyway. Or, in the case of AT&T again, if they just helped create an entirely new industry so that they have to build everything themselves because there is nobody else to do it anyway. But this is just a historical curiosity. The telecom industry is obviously not in this state today.
What incentive does a small, geographically limited service provider have to spend so much on R&D? So that they can try to make awkward licensing deals for their awesome tech if they strike gold and get to be the first to develop 5G? This makes very little sense.
Telecom equipment vendors are an actual thing. They're just not that well known to the public. For the optical backbone, you have the likes of Ciena, Juniper, Infinera, ADVA, etc. (More well-known Nokia and Cisco are also heavy players there.) These companies are all system integrators themselves, and have their own degree of vertical integration. There are component vendors which specialize even more, although they are increasingly getting gobbled up through consolidation in the sector. This is partly because people figured that it makes little sense to develop the same transceivers or modems at 10 different companies when you can instead merge them and combine the R&D budgets, patents, and manufacturing know-how. This is the same reason we don't have 20 CPU manufacturers. This is advanced stuff with high R&D costs.
His argument seems to be that it'd be nice to just have more people work in this field at regional telecom companies, because then some areas, like parts of Europe, wouldn't be deprived of a tech sector. The reality is that we don't need a very large workforce to develop the most high-end stuff, and tech, especially the high-end non-consumer grade variety, really is a winner-takes-all market when it comes down to it.
There is nothing preventing a big telecom service provider from buying any of these system vendors. That would also be very expensive and subject to heavy regulatory scrutiny to ensure proper competition in the market (buying a system vendor and then blocking sales to or price gouging your competitors being a big no-no). But there again, people figured out in the 80s and 90s that conglomerates are actually less, not more, efficient. Everyone is better off specializing in their own thing. The tech-inclined people end up joining those companies instead, and that's really for the best. Is he really arguing that innovation and tech don't exist in the field? Where does he think it comes from? Some unnamed office in Shenzhen? Besides Huawei or ZTE, not really.
I think the main message is that the EU should have the capability to build/understand/evaluate/reverse-engineer/secure/replace/maintain/innovate telecommunications stuff. It doesn't necessarily have to do everything at the same time at every regional/national telco.
But - according to Bert - it'd be great if the manpower at least would be here, in the EU, not scattered around the world. At least as long as we have these things like countries and geopolitics.
Telecom companies could have been the best speech recognition companies out there. Instead they did nothing until recently. Your perspective lacks imagination.
What a fantastic article. I can't help but think how SpaceX, for example, does everything they can in-house. They don't outsource their engines. They don't outsource anything. Heck, they get involved in making new alloys even if they're not making the bulk alloy metal. Musk knows that to innovate, you must control your own destiny. We need more of this. Outsourcing is this management fad that all the business schools must have taught as de rigueur, and it's sometimes good for the non-core competencies of a company, but never ever good for core competencies.
I think about this often. A lot of people worry about the Singularity. I don't: the trivial, non-smart, underlying infrastructure running these amazing artificial intelligences will collapse much, much earlier than any spark of consciousness will be able to emerge.
I help companies with cloud transformations and often I come across situations where businesses are in trouble because they completely lost their operational knowledge. They might know how their applications work internally, but they have no clue how the machinery around them keeps them ticking.
Once I was at a company in Northern Europe that desperately needed to modernize their stack running on legacy mainframes in a colocation datacenter.
Their core business was based on winning the five years government tender for a very crucial public service. They won for years and years, but now they were scared shitless because the new tender explicitly mentioned that the system was supposed to be hosted "in the cloud" and developed using a "microservices architecture".
The main issue was two-fold. On one hand, for decades they offloaded the deployment of their core systems to the technicians of the datacenter itself with which they had an amicable and personal relationship. On the other, said datacenter was being bought by a US megacorp who didn't give a crap about the existing personal relationship with this very specific, crucial use case and told the company to simply pay them a crazy amount of money to perform the migration.
They were also so blind to their lack of understanding of their own deployment process that they called us only two months away from the deadline. So, yeah... I advised my employer to stay away from this death march.
And for the Singularity, it better have a good DevOps neural net.
Sad as it may be, I think it is foolish to resist the commodification of technology. I've disassembled my broken washing machine and found I can get a replacement for most constituent parts. How would you even improve on such a thing, and why would you want to? I think we'd better embrace this evolution and move ahead with right-to-repair legislation.
> And at some point, the technical skills of the company become negative. And what does that mean? That your company knows so little about what it does that if you would ask a random person on the street for advice on the thing that your company makes, they are more likely to provide correct answers than the people that actually work for the company.
This is a scary thought, but one that is regulated by capitalism and economics. Sometimes it's cheaper to hire smart people when needed, rather than keep people around just for their knowledge...
I started reading the article and immediately had a question...
"This is a transcript of my presentation over at the European Microwave Week 2020, actually held in 2021."
What's a European Microwave Week? Well, it's a conference put on by the European Microwave Association.
"The European Microwave Association (EuMA) is an international non-profit association with a scientific, educational and technical purpose. The aim of the Association is to develop in an interdisciplinary way, education, training and research activities."
Ok.
"The European Microwave Association (EuMA) is an international non-profit association with a scientific, educational and technical purpose. The aim of the Association is to develop in an interdisciplinary way, education, training and research activities, including:
"Promoting European microwaves
"Networking and uniting microwave scientists and engineers in Europe
"Providing a single voice for European microwave scientists and engineers in Europe
"Promoting public awareness and appreciation of microwaves
"Attaining full recognition of microwaves by the European Union..."
So, uh, how far down this rabbit-hole do I have to go to find a meaningful term...
"EuCoM 2020 Events:
"GPR and Electromagnetics for Sensing Soil, Objects and Structures: Forward Modelling, Inversion Problems and Practical Aspects" - Lecce, Italy, January 29 - February 01, 2020 - Org.:R. Persico et al."
Whew.
[Edit]
I wrote the comments above before I read the article. Now that I have read it, I came to an epiphany:
*It's exactly what he is talking about!*
EuMA doesn't do microwave things. It's an organization about microwave stuff, but what they do has nothing to do with microwaves. They schedule things, they write contracts for venues and catering, and they send out press releases of various kinds.
Wouldn't it have been slightly refreshing if EuMA's web site was written by someone who actually knew something about microwaves? Someone who could spice things up with meaningful examples? Even a little?
Anyway, there are some issues with the article itself.
"And we fight for all technology, even the stuff that is not core because we are attached to it, we love what we do."
What is core, and what is not? And after you've eliminated everything that is clearly not core, what is clearly not core among the remaining things you have left? If you've outsourced the springs, knobs, cords, and cases then those start looking an awful lot like something else you should get from outside. Especially since your manufacturing facility is now just running one shift a day. Or a week.
At the end of the article, he mentions, "JPL at Caltech in the US", which is an interesting (and appropriate) phrase. If you follow the Mars rovers or any of NASA's other unmanned exploration missions, you'll see JPL mentioned a lot. NASA is very proud of JPL. Which is a little strange since JPL and NASA are only loosely related. "JPL is a research and development lab federally funded by NASA and managed by Caltech", as their web site says. The launch vehicle, by the way, was a commercial United Launch Alliance Delta II. (Not that I'm bitter in any way.)
Besides Asimov's and Jonathan Blow's illustrations of the problem. What is the cure?
Instead of focusing on abstractions of technology, why not focus on the fundamentals?
With fundamentals, I mean mathematics and physics.
This is perhaps all you need to understand tech in its fundamental nature. Everything is built out of mathematical and physical principles it seems.
The problem then may be to determine how much you need of it for your field. Do you need quantum mechanics to know how bicycles work? Maybe classical physics (mechanics, electricity, optics) will suffice here and the rest is a combination of physical principles aka engineering or creativity if you will. (Creativity is a fundamental human skill it seems.)
There's also abstract engineering as I would put it: computer science.
What are the principles of CS? I am not so sure, but I have a rough idea that perhaps it is also field dependent.
For computer graphics seemingly you don't need a lot of math. It's roughly at high school math level.
You can do projections and other transformations without linear algebra (vector, matrix math). All you need to do here is derive relations/equations from geometric rules (like you can for perspective projection, for example).
Then you have 2 fundamental ways of rendering things to the screen: rasterization (Pineda's method) and ray tracing.
Rasterization:
- after projection, you need to bring your vertices projected onto your virtual screen to an actual "screen" or should I say PPM image. Then you need a filling algorithm like Pineda's one.
Ray tracing:
- here, you don't need any projections. The projections are basically "a given". As you shoot rays from your virtual camera through your virtual screen, all you need to do here is to calculate intersections with geometrical objects and color them accordingly.
That's it.
So for all this basic rasterization and ray tracing you at least need: high school-level math (perspective projection, intersections, transformations, ...), basic rendering algorithms (Pineda's method, ray tracing, Phong shading, ...).
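To make that concrete, here is a minimal sketch in Python (my own illustration, not from any particular textbook) of both ideas using nothing past high-school algebra: perspective projection by similar triangles, and a ray-sphere intersection via the quadratic formula. The pinhole camera at the origin, the focal length f and all variable names are assumptions made just for the example.

    import math

    # Perspective projection by similar triangles: a point at depth z maps onto an
    # image plane at distance f, so its screen coordinates scale by f / z. The
    # camera sits at the origin and looks down the negative z axis. No matrices.
    def project(x, y, z, f=1.0):
        return (f * x / -z, f * y / -z)

    # Ray-sphere intersection: substitute P(t) = origin + t*direction into the
    # sphere equation |P - center|^2 = r^2 and solve the resulting quadratic in t.
    def ray_sphere(origin, direction, center, radius):
        ox, oy, oz = origin
        dx, dy, dz = direction
        lx, ly, lz = ox - center[0], oy - center[1], oz - center[2]
        a = dx*dx + dy*dy + dz*dz
        b = 2 * (dx*lx + dy*ly + dz*lz)
        c = lx*lx + ly*ly + lz*lz - radius*radius
        disc = b*b - 4*a*c
        if disc < 0:
            return None                     # the ray misses the sphere
        t = (-b - math.sqrt(disc)) / (2*a)
        return t if t > 0 else None         # nearest hit in front of the camera

    print(project(1.0, 0.5, -4.0))                              # (0.25, 0.125)
    print(ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))   # 4.0

Everything else (Phong shading, filling triangles with Pineda's edge functions) builds on the same kind of algebra.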
I suppose other fields' fundamentals or basic principles can similarly be "filtered out" from these many layers of abstraction.
Matrix math or linear algebra is not needed for CG. (Although it is really convenient to know... it's like doing CG in Assembly instead of C or C++.)
I also suppose that back propagation - the backbone of neural networks - can be also done with basic high school math.
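For what it's worth, here's a tiny sketch in Python of that claim: one input, one weight, a sigmoid and a squared error, trained with nothing but the chain rule from high-school calculus. The numbers and names are made up for illustration; real networks just repeat this pattern across many weights at once.

    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    x, target = 1.5, 1.0      # one training example
    w, b, lr = 0.2, 0.0, 0.5  # weight, bias, learning rate

    for _ in range(50):
        z = w * x + b                 # forward pass
        y = sigmoid(z)
        loss = (y - target) ** 2

        # backward pass: the chain rule, applied one link at a time
        dloss_dy = 2 * (y - target)
        dy_dz = y * (1 - y)           # derivative of the sigmoid
        dloss_dw = dloss_dy * dy_dz * x
        dloss_db = dloss_dy * dy_dz * 1.0

        w -= lr * dloss_dw            # gradient descent step
        b -= lr * dloss_db

    print(round(loss, 4))             # the loss shrinks towards 0

This is exactly the kind of derivation you can do on paper; the linear algebra only shows up when you batch many of these little computations together.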
So my suggestion is: decide what are the basic principles of a given field and filter them out.
Also, my question to all of you: learning Assembly is pretty much an educational endeavor nowadays as CPUs these days are doing lots of magic behind the scenes. So what are the fundamental principles here? Physics surely, but I see a problem with my argument above. Getting down to the common denominator (math and physics) is relatively easy, but it is hard to build a CPU out of it albeit theoretically possible.
Charles Petzold's book "Code" seems to suggest that those principles are actually not that far from basic principles.
I don't think there's enough understanding about how physics connects to chemistry to biology to psychology etc for a fundamentals first approach. I think what we actually do is we find something that works, and then we explain why it works later. I've had titles like software engineer and database administrator, but the most important aspect of my jobs is always psychology, not computer science.
I like the idea here. As I've gotten older my technical aesthetic has mostly drifted toward amassing knowledge of these fundamentals that you speak of. It isn't that they let you know everything, but they serve as a good map and have very high power-to-weight ratio.
andy grove (of intel fame) has written quite a bit about the dangers of over-outsourcing over the last 20 years. one of his more interesting points was that running a manufacturing operation made intel better at designing chips in unexpected ways. unfortunately, every link i find is either dead or behind a paywall.