How did this crap rate publication in Scientific American? It's really just a Millennial version of "uphill both ways".
I'm 53, and the amount of technological change I've seen in my lifetime is astonishing. I can clearly remember the first time I saw a video game (Pong) because I was 8 or so - old enough for it to make a huge impression. The first time I used a cell phone in a memorable way, I was 36 years old (9/11, calling my spouse to tell her what I just heard on NPR). This is ubiquitous, world-changing technology that simply didn't exist when I was young.
Now, think of CRISPR and other DNA-altering technology, and what that will mean going into the next century. Think about renewable energy more or less completely replacing fossil fuel in coming decades. Think about the extraordinary amount of light-cognitive work, like driving, that will be replaced by automation.
Final perspective... I have a potentially deadly chronic illness that hit me a few years ago, but I've probably had it latent since childhood. It affects children more than adults at onset. If it had happened to me as a child, I would have died. Now, it's an inconvenience.
> I'm 53, and the amount of technological change I've seen in my lifetime is astonishing
I think it's somewhat telling that your "Think about..." statements are all pinning their hopes on technology yet-to-be, not technology that exists today.
> Think about renewable energy more or less completely replacing fossil fuel in coming decades.
But we thought peak oil was gonna be around 1970. Today petroleum engineering still has better job prospects than nuclear engineering. I think the track record for "just around the corner" is quite poor.
> Think about the extraordinary amount of light-cognitive work, like driving, that will be replaced by automation.
I think that the track record...
The examples you have for technology that's actually happened in your life are just computers. Which are great, nobody doubts those strides. But how's the subway vs when you were 10? How much has schooling changed for the better? Are kids happier in school now, or completing it faster?
How is the way most people get to work different from 50 years ago? Are more people able to walk or use fast public transport than ever before? How's your experience at the airport?
How about building roads? Is it cheaper than ever, like with computers? How about housing? If computers getting cheaper and faster counts as technological progress, then by the same metric aren't roads and houses going technologically backwards?
How's the quality of our food tech improved in the last 53 years? Are we all reaping the rewards of that? Why are we so much more obese and diabetic?
How's the quality of life for the average child? Can they spend their day doing more of what they wanted vs 53 years ago? Or are they under what is essentially house arrest more than ever? If we have more technology, why isn't their environment safer and freer? Why can't they easily get to the library and museum on their own, like they can in Japan? If the technology we do have is improving our lives, are children (and adults) less depressed now than 53 years ago?
What are the things that stopped working? I don't think it's a flippant question that can be answered only with computers and cell phones. Some technology has definitely advanced, but it should be just as obvious that some has almost completely stalled.
> How's the quality of our food tech improved in the last 53 years? Are we all reaping the rewards of that? Why are we so much more obese and diabetic?
Purely talking about the technology, food manufacturing tech has experienced radical innovation over the last 53 years. Not just "with computers", but through biology / biotech etc. These things have nothing to do with socioeconomic issues which largely underpin, e.g., public health issues like obesity and diabetes.
That's actually the common refrain through this whole list: it mostly seems you're arguing that social progress hasn't kept pace with technological innovation, which might be true, but is also a completely orthogonal point.
> How's the quality of life for the average child? Can they spend their day doing more of what they wanted vs 53 years ago?
Speaking of technological progress, medicine! Your stat measure has the same problem. The correct answer is: roughly 1 in 10 infants died in 1915, 0.3 in 10 by 1950, 0.07 in 10 by 2000, and about 0.05 in 10 today. Each step along the way, due to technological/scientific advancement, measurable & meaningful numbers of kids got to experience life who would not have been able to without those advancements. This is why we do what we do, and here's to knocking 0.05 down to 0.01, because that is who we are.
The original peak oil theory predicted US production only, and it was correct down to the exact year. We did finally break that peak - in 2017, after nearly 50 years of aggressive technological improvement in oil discovery/recovery.
For any given oil field, peak oil theory is extremely accurate.
As for technology that has happened in my lifetime... the reason I have a lifetime right now is due to laser surgery. That technology did not exist when I was a child. If my illness had appeared then rather than today, I would have died from it. Not computers at all, really, although I'm sure computer-aided design played a role.
Our food tech has improved tremendously - but that doesn't mean that the food is healthier. It means it is cheaper, more plentiful, has longer shelf life, and more enticing flavors.
On the other hand, there was the big bang of the internet in the 90s, which changed everything for me. But post-2000? Social media, Uber & Airbnb. The consumer internet of phones is not that major technology-wise, even though socially it was very disruptive.
The modern internet-connected-cellphone-in-your-pocket was the radical advancement of the 2000s, and it's more of a story of packaging existing technology into a tour de force, which makes it even more interesting. As its own "event" though, I don't think I've seen a single product so radically transform a society. Yeah, the social media & ride-hailing all seems goofy, but we took a stab at out-innovating the very concept of boredom (and created some new categories for the DSM to boot); that deserves some credit.
I think you are underestimating the technological advancements required to make the smartphone you are holding in your hand today. From processors to batteries to screen and radio technology.
Is internal combustion an "incremental" change from steam engines? That's the kind of line we're talking about.
In a way, lithium ion batteries are incremental - they're just batteries. But they've enabled modern phones and modern electric cars, so they're kind of radical tech in their own way.
"Technology" really just means "stuff that didn't exist when I was a kid". The majority of HN grew up with the internet and cell phones, and don't really remember a time without these things.
I wish I could find the attribution, but I read an interesting comparison - imagine somebody from 1880 who was transported to 1950, and imagine how shocked he would be at the changes in the world: cars, washing machines, television, airplanes… now transport somebody from 1950 to 2019. He would be impressed, but not overwhelmed, at the changes that have occurred in the same time period. Besides the computer, most of what we have now is just better versions of what they had 70 years ago.
Agreed, and there's a related point too: These days, strangely, ideas are getting harder to find and research productivity is poor. There was a neat discussion of this a few years ago [0], and here's the paper [1].
If the time travelers go on public transportation in NYC or Boston, they see perhaps worse technology than they had 70 years ago.
Airports work less well than 70 years ago, though air travel prices have decreased.
Our diets work less well than 70 years ago. Agricultural and nutrition science have not materialized any benefits for the public w.r.t. obesity or diabetes. Though we have more life-saving tech. Diabetics live much better, but we have vastly more of them. (~10% of the USA is diabetic. CDC report in 2017: More than 100 million Americans have diabetes or prediabetes)
Building a new house is more complicated and expensive per square foot than 70 years ago. Though insulation is better.
High trust societies work less well than 70 years ago. Trust is a resource, slowly renewable, and many places seem to have squandered it. There are a lot of such social technologies that have been working less well over time, I imagine. We may(?) now have fewer pleasant, human-scale built environments than existed in the 1950's. A huge obstacle in the US is that walkability invites social problems. So to have good cities, you must not tolerate social dysfunction. If you tolerate it, people will use car-dependent development to insulate themselves from it. At least, that seems to be what's happened.
Not just technology, but the landscape of our cities too; suburbia, car culture, skyscrapers in the urban core; there haven’t really been radical transformations there either.
Some technologies are not as obvious as the car, but other fields that are less immediately visible have advanced a lot since the 1950s, e.g. medicine.
Note the computer revolution has come out of successive improvements to a very particular technology (FET transistors on silicon).
Back in the 1960s there was wild speculation about what elements people would make computers out of, but the last alternative material that people took seriously was gallium arsenide, back in the 1980s. (Rumours that there is an indium phosphide based microprocessor for military use notwithstanding.)
The revolution that came out of the 1960s was embodied in the System/360 mainframe: the architecture was independent of the hardware, so you could wait a few years and expect to get a computer that was better than your old one and could still run the same software.
x86, ARM, Power all attained the same thing, but DEC failed to accomplish this with the PDP-* and VAX machines. The 8-bit machines of the 1980s turned out to be dead ends for the same reason.
I would argue the FET transistor has been such a revolutionary technology in the applications it has allowed that we are still mining the offshoots of this technology. The fact that there is still so much money in computing means that vein has not been exhausted. And yes, I also believe that this has redirected talent from other fields into computing.
Robert Gordon's examples of new technology are almost all physical. It may be true that 1970 was the end of a century long "mechanical age". Now we are in the information age, and innovations are virtual/mental rather than physical.
As such I would place the breakdown of cultural barriers as a result of information technology as a major breakthrough. Gordon is using how long it takes to clean your house as a metric of innovation. He's not using the fact that the current inhabitant of that house may be from another part of the world, and yet still feel comfortable because shared culture is more accessible.
>Robert Gordon's examples of new technology are almost all physical. It may be true that 1970 was the end of a century long "mechanical age". Now we are in the information age, and innovations are virtual/mental rather than physical.
I think this hits the nail on the head.
The other issue is that the closer to the present we get, the harder it is to see the impact of any particular advance. This is why, e.g., Nobel Prizes are so late in recognizing work.
That said, some advances can already be recognized: reusable rockets, good search, good automated translation.
> '[except for] the exponential increase in computing power.'
That would be incremental improvement, not major technological change. Take this list from the article:
> In our century, for better or worse, progress isn't what it used to be. Northwestern University economist Robert Gordon argues that by 1970, all the key technologies of modern life were in place: sanitation, electricity, mechanized agriculture, highways, air travel, telecommunications, and the like.
Most of the major technological changes in computing were already available by 1980:
Software compatibility across computer generations (IBM 360), microprocessors and personal computing (8-bit micros like the Apple II), packet-switched networking, GUIs (Alto), all the major modern types of programming languages (imperative, functional, object-oriented), etc.
Most of the seemingly big things that have happened since fall more into the category of incremental improvements and popularizations than "major technological change."
Incremental improvement can have life-changing impacts. Google Maps is just a map, and we've had maps for thousands of years, yet I'm still mind-blown every time I use it. A Kindle is just a book, an A320 is just the same as the Wright brothers' plane, &c.
> Most of the major technological changes in computing were already available by 1980:
Sure, but no one had access to it. Availability of tech also plays a huge role in "progress".
The big one missing is mobile telecommunication/computing. Basically the most transformative tech since your list is the lithium battery, I guess?
And this transformative effect keeps on giving, with the transition to renewables and "electrified everything" (such as cars, but not only).
The thesis isn't that major advancements aren't happening, but that they're happening more slowly. Or maybe more accurately, there was a period in the 20th century, now over, where major advancements happened unusually quickly.
Not to bang the trope drum too loudly, but Kurzweil mentions this phenomenon when describing the way that growth looks exponential over a long scale but feels more like S-curves.[1]
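A quick toy illustration of that point (Python, with entirely made-up parameters; each "technology" saturates, but the stack looks exponential):

    import numpy as np

    # Each technology follows a logistic (S-curve): slow start, rapid growth, saturation.
    def logistic(t, midpoint, ceiling, rate=1.0):
        return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

    t = np.linspace(0, 50, 500)
    # Successive technologies, each saturating at a higher ceiling (invented numbers).
    total = sum(logistic(t, midpoint=m, ceiling=2.0 ** i)
                for i, m in enumerate([5, 15, 25, 35, 45]))

    # On a log scale the envelope of the stacked S-curves is roughly a straight
    # line, i.e. exponential growth overall, even though each component stalls.
    print(np.log(total[::100]))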
That thesis is not well supported by the article, and I didn't see it referencing better-quality material. And it grossly overestimates the speed of changes.
The structure of innovations isn't usually outward progress along a uniform frontier. Instead, one portion of the frontier proves more promising than others, innovators pour into that one area, it fragments into subfields, and then the descendants of that one field become the totality of experience for future generations.
Think of the major innovations of the 1500s and early 1600s Renaissance. Exploration of the Americas, colonialism, transatlantic slave trade, Protestant reformation, joint stock corporation, parliamentary monarchy, scientific method, nation-state, alchemy, and merchant guilds.
By the 1800s, the exploration of the Americas was basically complete. Colonialism was on its way out with U.S. independence (there was a subsequent era with the scramble for Africa, but that wasn't until the late 1800s). Slavery was banned in Britain and the importation of new slaves was banned in the U.S. The Protestant Reformation was old news. Alchemy had been largely debunked. Pretty much all progress was in the scientific method, and in the corporations that came to capitalize on it.
Yet we don't think of the 1800s as an innovation slowdown. Rather, we think of it as a time of rapid technological progress and massive social change. We've just changed our definition of what innovation is. Things like alchemy, colonialism, and slavery are viewed as quaint anachronisms, mistakes on the march of history.
It'll likely be the same with software vs. the massive manufacturing industries of the 20th century. Some of them (e.g. coal power plants or gas-powered automobiles) will seem like outright mistakes; others will simply be taken for granted the way we take democracy and corporations for granted. And we'll start defining innovation in terms of different subfields of software, the same way we define industry in terms of the different subfields of automated manufacturing.
World War II is chronologically a mid point between the American Civil War and now. The progress made in the first half of that period seems to me vastly more impressive than that of the later half.
Idk, I can come up with so many examples of life-changing tech... But yeah, all things considered, having running water or central heating is def more life-changing than reusable SpaceX rockets, for example. The progress of the first half is more impressive because it met basic needs with high impact on health, transportation, &c.; once those are met, everything else simply becomes "nice things" to have.
I think the most obvious progress was smartphones. They are:
- Navigation systems
- Allowing seamless and instant text and video communication worldwide
- Capable of holding more music than you'll ever need
- Having better cameras than anything before (for the price/quality/convenience, I know; medium/large format cameras from < 1930 produce better images, but they are nowhere near as convenient)
- Letting you manage things like your bank account or your emails, &c.
- Allowing you to rent shared cars/bicycle on the fly
When you really think about it, something like Google Earth would have been close to witchcraft not even 70 years ago.
I could probably store the entire library of Alexandria in my $60 kindle.
A Tesla is "just" a car, but hasn't much to do with cars from < 1950s.
An Airbus A320 is "just" a plane, but hasn't much in common with planes < 1950s.
A single PS4 is probably more powerful than the entire NASA computing system of the 60s.
Reusable space rockets, supersonic planes, landing a probe on a comet, the curiosity rover, GMO, nuclear power plants, discovery of planets out of the solar system, credit cards, microwave ovens, efficient freezers, induction cooking plates, washing machines, dishwashers, tumble dryers, advances in psychology & medicine, satellite internet, photochromic lenses, solar panels, li ion batteries, birth control, 3d printers, you can watch lectures from top universities of the world on youtube for free, the list goes on forever.
If your grandparents are still alive ask them about how life was 60 years ago, everything changed. Being used to amazing tech doesn't negate the fact that progress still happens.
I think people just got bored/lost their ability to see magic. A dog playing with a stick is more ecstatic than the average Joe having the entire world's knowledge in the palm of his hand.
If you want to talk about magic, what do you think a Spitfire would look like to a Union soldier? How about radio? Electricity? Horseless carriages? Radar? The innovations in that period were differences in kind, 0-to-1 transitions, while from 1945 onwards we have seen rapid but mostly incremental improvements.
One difference between those two is the amount of energy available. Going from roughly 100 watts of usable energy per person to 2,000-3,000 watts was massive. What am I going to do with 50,000 watts that would make my life better? Commute to Seattle every day on a hypersonic personal airplane? Nah.
And what about the exponential miniaturization of everything? We can now have high-res cameras that are crazy small and cost almost nothing. We are starting to make nanotechnology to use in surgery. We're starting to use gene editing (poorly). Those all seem pretty freaking major.
No. Moore's law is that the number of transistors on a chip doubles roughly every two years (often quoted as every 18 months). People confuse this with clock speed doubling (which did stop), but with the expansion to multiple cores, AIUI, Moore's law is still continuing.
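A back-of-the-envelope sketch of the distinction (Python; the Intel 4004 starting point and strict two-year doubling are illustrative assumptions, not claims about any particular chip):

    # Rough Moore's-law extrapolation: transistor count doubling every ~2 years,
    # starting from the Intel 4004 (1971, ~2,300 transistors). Illustrative only.
    base_year, base_count = 1971, 2_300
    for year in (1981, 1991, 2001, 2011, 2021):
        doublings = (year - base_year) / 2.0
        print(year, f"{base_count * 2 ** doublings:,.0f}")
    # Applying the same doubling rule to clock speed fails after ~2005, which is
    # roughly when the industry pivoted to multiple cores instead.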
The author doesn't seem to include any metrics for how they're measuring technological change, or am I missing that? Seems like a graph and numbers would be useful for measuring what sort of slowdown it's talking about; most of the arguments seem to be looking at what wasn't accomplished.
We have no metric for innovation. So all we get is a bunch of cherry picking of what constitutes "major" "progress" based on undefined criteria, and then comparing those things across different eras.
Some organizations have created innovation indices, but they tend to measure inputs (e.g. R&D spending) more than outputs, or measure outputs that have questionable relationships to innovation (e.g. number of patents registered).
There are some obvious metrics that are likely to be very good: global energy use, global GDP, and economic efficiency from their quotient. It's the data we're missing (prior to 1970). It could be estimated, but almost no one has tried.
Another idea is to count the occurrences of strings representing years ("1974" etc.) on Wikipedia... possibly weighted by the PageRanks of the containing articles.
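A minimal sketch of that counting idea (Python; the corpus file name is hypothetical, and the PageRank weighting is left out):

    import re
    from collections import Counter

    # Count four-digit year strings (1800-2019) in a plain-text article dump.
    year_pattern = re.compile(r"\b(1[89]\d\d|20[01]\d)\b")

    counts = Counter()
    with open("wikipedia_articles.txt", encoding="utf-8") as corpus:  # hypothetical path
        for line in corpus:
            counts.update(year_pattern.findall(line))

    for year, n in sorted(counts.items()):
        print(year, n)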
I don't know about "very good." GDP is sometimes underrated as a metric, but it does have serious flaws that would prevent me from calling it "very good."
Capabilities-based approaches seemed more promising for a while, but it seems like they haven't resulted in anything better yet. I'm still unclear why there seems to be so little research on measuring actual value created by measuring things like consumer surplus (at least heuristically).
Really? I'd call it 'more descriptive of human civilization than any scalar has a right to be'.
What do you make of the S&P 500 index as a descriptor of the stock market?
Consumer surplus is one of those economics concepts that probably doesn't exist. People will generally pay anything they can afford. Fortunately, what they pay for positional goods and such mostly washes out as transfers or asset inflation and doesn't pollute GDP, which measures intrinsic value (bits, ultimately).
Sure, it could be one of those things that's "the worst except for all the other options." It's still flawed, the flaws have been known for a long time, and their magnitude is potentially large enough to warrant looking for other indicators.
The S&P 500 describes the S&P 500 fine, but it seems like a much simpler problem? They represent an estimate of a future cash flow generated by a basket of companies. That's way more straightforward than measuring value perceived by consumers.
How does consumer surplus not exist? How would you value something like e.g. Wikipedia?
Stock market and GDP indices are generally the same in that they are sums of correlated time series. Summing suppresses uncorrelated variance (often measurement noise) and retains correlated variance. The index's returns are essentially the first principal component of the returns of the index members. This is how the CAPM works: stocks are analyzed by factoring out their responsiveness to the market.
Stock market and GDP indices are also particularly related, as stock prices are tied to corporate earnings and corporate earnings are a major component of GDP.
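A toy demonstration of the "index returns ≈ first principal component" claim (Python/numpy, on synthetic returns rather than real market data):

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic daily returns for 50 stocks: one shared market factor plus
    # independent idiosyncratic noise.
    n_days, n_stocks = 1000, 50
    market = rng.normal(0.0, 0.01, n_days)
    betas = rng.uniform(0.5, 1.5, n_stocks)
    returns = np.outer(market, betas) + rng.normal(0.0, 0.02, (n_days, n_stocks))

    # Equal-weight index returns vs. the first principal component of returns.
    index = returns.mean(axis=1)
    centered = returns - returns.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc1 = centered @ vt[0]

    # Near 1 (sign aside): summing suppresses the uncorrelated noise and
    # keeps the shared factor.
    print(abs(np.corrcoef(index, pc1)[0, 1]))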
GDP has nothing to do with human perception. It measures the productive capacity of civilization, which is a physical quantity.[1]
It's often claimed that if you give something like Wikipedia away for free, its value will not show up in GDP. In fact, if Wikipedia is valuable, it must have a positive impact on civilization. This will be reflected in GDP, even if it's in the form of extra time/money people spend on ice cream in the summertime.
The maximum someone would pay to access Wikipedia is likewise irrelevant. For one thing, it's a function of their personal wealth, which is an unrelated quantity. But in any case, paying more than something costs to produce just increases the buying power of the people selling it... but it can't increase their buying power above what civilization can produce. It's just a wealth transfer. Money can't defeat physics.
All of that is straightforward. What can be surprising (at first) is how well GDP measures human happiness,[2] drives wealth and income inequality,[2] and predicts election outcomes.[3][4][5] Economic peaks coincide with moon landings, complexity in popular music, the discovery of canonical physics, commercialization of computing technology, and various athletic records.[6]
In short, there seem to be spikes or groupings roughly every 50 years or so. The most recent spike is relatively light in comparison. However, the impact of inventions is perhaps subjective and debatable. The Internet has had a huge impact on society, changing the way we communicate almost completely, from personal socializing to news.
My take is that we've harvested the easy pickings which were accessible as a result of formalized and professionalized scientific research and engineering processes. Now, we're entering the era of formalized and professionalized data-driven processes.
The advances delivered by this latest technological movement aren't going to take us from gazing at birds to being able to construct a plane. They're going to take us from inefficiency to efficiency, and from ignorance about deep, broad, and complex phenomena to descriptive knowledge. If we're lucky, it'll take us to knowledge about causation in some areas too, but this technological revolution isn't about causes so much as about coping with effects.
It's the subtle stuff that reshapes the world invisibly and in ways that are difficult to comprehend without a wide view. For instance, in science, data-driven research via genome-wide association studies (GWAS) etc. can lead to more efficient allocation of diagnostic resources in healthcare, meaning that people with certain characteristics can on average be healthier as a result.
So let’s leave out all the advances in biomedicine, personal genome sequencing, immunotherapy for cancer, genetic engineering advances for incurable diseases, etc. All of these are “technological”, yet this author was too focused on consumer tech to notice. What about battery technologies and new solar energy tech which make small devices and EVs possible? The enormous advances in machine learning and AI (and algorithms in general across all areas of computation). Bioinformatics, advances in protein engineering, etc. I could go on and on. I think this article is going to be one of those quoted in ten years as being incredibly mistaken.
Progress has never been a linear slope of ever increasing capability in all technology areas. It's better to think of progress as a series of new directions that we couldn't access without the prior advances.
Self-cleaning houses seemed like they would be here already. If you had described the technologies available in 2019 to me in 2002 (when the Roomba came out), I would have expected there to also be a Boston Dynamics-esque floor, window, and countertop cleaning bot that was reasonably affordable. I didn't expect that walking around a house and picking things up would be this hard to automate.
Not necessarily. "Accelerating returns" is just another name for exponential growth. And the economy is growing exponentially. But the rate since 1970 is possibly lower than it was during the "special century".
Also, for what it's worth, Moore's law and its variants are just special cases of Wright's law of learning curves.[1][2][3]
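Roughly, in the standard textbook form (the exponents here are generic empirical parameters, not numbers from those references):

    C(x) = C_1 x^{-b}                    % Wright's law: unit cost vs. cumulative production x
    x(t) \propto e^{gt}                  % if cumulative production grows exponentially in time
    \Rightarrow C(t) \propto e^{-b g t}  % cost then falls exponentially in time: the Moore's-law form

So an exponential time trend (Moore) falls out of a power-law learning curve (Wright) whenever production itself grows exponentially.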
There's a quote in there somewhere along the lines of: "we've built the computer from the starship enterprise, but we possess little other tech from Star Trek"
Climate change and its impact on our agriculture, infrastructure, and political systems will be an increasing damping force on technological changes this century that I'm surprised didn't get a mention in this article.
The last two sentences of the article seem to at least obliquely address this:
> In this century, we urgently need to undo some of the consequences of the last great boom by developing affordable zero- and negative-emissions technologies. That's another hard problem—and to solve it, we'll need to recapture some of what made the “special century” so special.
That really depends. If we get on a war footing to address the problem, like we did for WW2, perhaps. If we procrastinate, hunger, water scarcity, and conflict have a tendency to be destructive to knowledge.
Hunger has become far less of a problem in the 21st century than it has ever been before, at any point in human history. And population growth is leveling off - the global population has quadrupled in the last century, but will probably level off about 50% higher than today in the next 50-100 years.
So you're postulating that hunger will get much worse than it is today, without really tremendous population growth, and with significant technological opportunity available in terms of applied computation/robotics and genetic engineering. This seems like an emotional argument, not a scientific one.
It seems you're not considering the effects that the climate crisis is expected to have on the conditions our food production infrastructure is based upon. Water/soil/temperature are all pretty big components of this process that will (and to some extent already do) impact our ability to grow food.
And we can't change crops? Or change methods? I'm not buying that. Humans grow crops everywhere, from equatorial deserts to the arctic circle. And the changes that make a given crop totally unworkable (as opposed to not quite as efficient as before) are generation-scale, not the flip of a switch.
Water will still exist. Soil will still exist. Short of postulating worst-case scenarios (which are possible), this is all stuff we can handle with existing technology, much less the technology of a century from now.
Could it also be the case that, nowadays, major technological changes require huge amounts of research and budget, as they tend to span multiple teams with many people involved in the process?
In other words, more and more, it is a team-based effort to discover new things these days.
I can't remember the name of the phenomenon, but there's a theory that technological progress happens in cycles.
The first half of each cycle contains large spurts of innovation and change which result in big shifts to how the world works (internet, vaccines, the automobile, jets, nuclear power/weapons).
The second half contains incremental improvements on those big innovations and focuses more on convenience/leisure (social media connecting the world, Uber, Netflix, Airbnb, etc.).
We're currently in the second, incremental half, but things like SpaceX (Mars colonization, cheap rockets), CRISPR (gene therapy curing numerous ailments), and the like tell me that we're not far away from the beginning of another paradigm-shifting innovation phase.
Maybe "quality of life" isn't the best way to describe it. I meant things that are more focused on entertainment or making life more convenient, as opposed to completely changing culture/life for people.
Bringing together this and what Simon321 said here ( https://news.ycombinator.com/item?id=20686466 ), it may be that computers themselves have prolonged the current technological era by allowing us to wring out many more incremental improvements.
The article feels a little weak (though the bit about Wright, Lindbergh, and Armstrong was interesting). Scott Alexander has another take on the topic here [0]. It has more examples and charts, but no real conclusion.
In the books those only sabotaged experiments regarding subatomic fundamental physics. There was still tremendous progress in engineering above the quantum scale.
Take HN threads on drones, where it's taken for granted that they should be regulated so as to be "perfectly safe".
New personal transport tech, which I hear of mostly via its being variously banned.
Shenzhen with few-hour delivery on assorted scooters, vs NYC's long-term debates over just what kinds of scooters to un-outlaw.
A visiting teacher from a high school that still has lathes, near a middle school with bandsaws, and their New England colleagues' shock.
Fire and building codes that are a century-thick accumulation of violations of the guidance that you shouldn't make major life decisions in the immediate aftermath of disaster trauma. Built on assumptions of decades-old, now obsolete tech.
A high-water mark of running directly onto the BOS-LGA shuttle, with flight attendants later walking the aisles to collect two twenties from everyone, now sunk to security theater and prohibitive expense.
Greyhound leveraging corruption in politics and law enforcement, and press incompetence, to crush low-cost competitors. Leaving a black market mostly for illegals, the poor screwed, and the labor pool less flexible.
An industrial policy around patents that's been recognized as badly adjusted for tech for my entire multidecade career, and I don't expect to be fixed before retirement. Extensive valleys of death between research and commercialization, and between commercialization and long-term availability.
Progress in human interface devices being straitjacketed for decades into DIY by patents, then building towards ferment around VR/AR, only to be laid waste by acquisitions. If the soviet-design-bureaus of FAMG quickly achieve mass deployment, then it may have sort of been worth it, but what a cost of missed opportunities.
One could go on, but I'm unsure even this was useful.
The US has been a major driver of world technological change. And it seems the US has come to embrace friction over ferment, a geriatric peaceful quiet over the dirty disruptive inconvenience of progress.
Watching US writing about China, I'm repeatedly reminded of an old Europe writing about a new US. Sclerotic societies marveling and contemptuous and angry at one less mired. They did eventually improve - rebuilding after massive death and cities become rubble. Hmm, the Black Death had a similar effect. Perhaps with improved social tech and AR, we can manage a less painful rejuvenation?