Were water mills, spinning jennies, and printing presses dehumanizing too?


In a way: water mills and spinning jennies led to the Dickensian horrors of the textile mills: https://www.hartfordstage.org/stagenotes/acc15/child-labor

Industrialisation itself, although it increased material output, decimated the lives and spirits of those who worked in the factories.

And the printing press led to the Reformation and the Thirty Years' War, one of the most devastating wars ever.


...and led to our current time of maximal abundance, free time, leisure, freedom to work in more ways, and peace.


Yes, there are many books written about the dehumanizing aspects of the industrial revolution.

Consider we still place particular value on products which are “artisanal” or “hand crafted.”


Of course!

There were people whose entire identities were tied to being able to manually copy a book.

Just imagine how much they seethed as the printing press was popularized.


Is that how it went, I wonder?

https://academic.oup.com/book/27066?login=false

Seems the scribes kept going for a good hundred years or so, doing all the premium and arty publications.


Kinda? https://en.wikipedia.org/wiki/Luddite

> The Luddite movement began in Nottingham, England, and spread to the North West and Yorkshire between 1811 and 1816.[4] Mill and factory owners took to shooting protesters and eventually the movement was suppressed by legal and military force, which included execution and penal transportation of accused and convicted Luddites.


I think there are quite a few dehumanizing aspects of the industrial revolution. It wasn't just the water mills, but rather the lengths we put people through to keep them running.


We don't even need to go back that far.

All these arguments could be made for, say, news media, or social media.

AI being singled out is a bit disingenuous.

If it is dehumanizing, it is because our collective labor, culture, and knowledge base have conspired to make it so.

I guess people should really think of it this way: a database is garbage in, garbage out, but you shouldn't blame the database for the data.


All those arguments have been and are still being made for MSM and social media. AI is not being singled out.


No, because they aren't the same. Those things are tools that reallocate cognitive burden. LLMs destroy cognitive burden. LLMs cause cognitive decline, a spinning jenny doesn't.


I don't know man?

Gonna have to disagree there. A lot of models are being used to reallocate cognitive burden.

A PhD-level biologist with access to the models we can envision in the future will probably be exponentially more valuable than entire bio startups are today. This is because s/he will be using the model to reallocate cognitive burden.

At the same time, I'm not naive. I know there will be many, many non-PhD-level biologist wannabes who attempt to use models to remove cognitive burden entirely. But what they will discover is that they can't hold a candle to the domain expert reallocating cognitive burden.

Models don't cause cognitive decline. They make cognitive labor exponentially more valuable than it is today. With the problem being that it creates an even more extreme "winner take all" economic environment that a growing population has to live in. What happens when a startup really only needs a few business types and a small team of domain experts? Today, a successful startup might be hundreds of jobs. What happens when it's just a couple dozen? Or not even a dozen? (Other than the founders and investors capturing even more wealth than they do presently.)


I'd totally agree with this point if we assume that efficiency/performance growth will flatten at some point. For example, if it turns logarithmic soon, then progress will grow slowly over the next decades. And then, yes, it will likely look as though current software developers, engineers, scientists, etc., just got an enormously powerful tool, one which knows many languages almost perfectly and _briefly_ knows the entire internet.

Yet, if we trust all these VC-backed AI startups and assume that it will continue growing rapidly, e.g., at least linearly, over the next few years, I'm afraid it may indeed reach a superhuman _intelligence_ level (let's say p99 or maybe even p999 of the population) in most areas. And then why do you need this top-notch smart-ass human biologist if you can just as well buy a few racks of TPUs?


Because only the biologist knows what assays to ask the superhuman intelligence for. And how the results affect the biomolecular process you want to look at.

If you can’t ask the right questions, like everyone without a PhD in biology, you’re kind of out of luck. The superhuman intelligence will just spin forever trying to figure out what you’re talking about.


It doesn't really matter what something can be used for, it matters what it will be used for most of the time. Television can be used for reading books, but people mostly don't use it that way. Smartphones can be used for creation, but people mostly don't use them that way. You've got Satya Nadella on a stage saying AI makes you a better friend because it can reply to messages from your friends for you. We are creating, and to a large extent have created, a world that we will not want to live in, as evidenced by skyrocketing depression and the loneliness epidemic.

Read Neil Postman or Daniel Boorstin or Marshall McLuhan or Sherry Turkle. The medium is the message.



