No. Quite the opposite. Everyone else is in an echo chamber telling them that AI is a cool fad, a toy that will never fully replace people, etc., when in reality AI is the most disruptive technology since the Internet, and the most dangerous technology since the atomic bomb.
I can't stop shaking my head whenever I read any article on AI written by non-tech journalists (and even many by tech journalists). AI is vastly more dangerous and more urgent than climate change, than Russia and China, than literally any other hot topic today, and it's being treated like a combination of a science fiction story and an entertainment tool.
Russia: A country with one of the largest nuclear arsenals literally threatening a large portion of the world with nuclear annihilation.
Excuse me if the prospect of changes to the economic landscape for white-collar jobs doesn't look particularly frightening compared to such problems. Especially since we live in a day and age where two entire generations have lived most of their lives in almost constant economic upheaval anyway.
As for all the AI doomerism that's flying around the net: as long as no one can give me a precise, quantifiable definition of "general intelligence", i.e. one that doesn't amount to pointing at ourselves, and a method to measure how far AI is from it, I will work under the assumptions confirmed by what is measurable and observable: that what we have are still stochastic inference engines.
Russia isn't "literally threatening a large portion of the world with nuclear annihilation". Any 'threat' they have issued has been in response to 'threats' issued by 'our' side... including, at the time, UK PM Liz Truss saying she was willing to push the nuclear button against Russia. If that's not a threat, then neither is Russia saying that they'd be willing to use nukes too.
Our politicians together with our sycophantic media and their weapons salesmen talking heads really do spread the most egregious disinformation throughout every wartime situation we are involved in, by proxy or otherwise.
> Any 'threat' they have issued has been in response to 'threats' issued by 'our' side
Then please, link me the relevant statements. Liz Truss's remarks (and btw. she isn't Britain's PM any more) were made in late August 2022 [1]. The Russian nuclear sabre-rattling started in February 2022 [2].
So who exactly has threatened Russia with nuclear weapons to elicit these responses? Helping a sovereign nation defend itself against an invasion and protect its land and people is not a threat. Offering a sovereign nation membership in a military pact is also not a threat.
But enough unemployment caused by the next wave of automation is sure going to cause civil unrest.
Look at the Chartists, the Luddites and the saboteurs. Some of them were weavers who up until that point had been in high society, running parts of countries through the guild system. Then, over a couple of years, the bottom fell out and they were cast into the mills alongside the landless labourers.

That was not a smooth transition.
The people who claim "oh, there will be new jobs": sure, there probably will be, but they forget to mention the important qualifier: "eventually".
> AI is vastly more dangerous and more urgent than climate change, than Russia and China, than literally any other hot topic today
No, it's not. And most (not quite all; there are some genuine nutballs) of the people selling that idea are selling it to push a political agenda attached to their financial interest in AI: either pushing AI danger to advance competition-limiting regulation, or pushing the kind of AI danger that is not actually imminent, in hyperbolic language, to distract from the real and present issues with AI. Sometimes both.
Horseshit. There is not a single scientific model that predicts human extinction from climate change. Parts of humanity, yes. All of humanity, no chance.
There are plenty of models that predict human extinction from AGI.
I don't think it's likely, but climate change could easily wipe out humanity by exacerbating pre-existing geopolitical tensions. For instance, Pakistan and India famously share a river, and the treaty that governs its water rights was written with the assumption that the river wouldn't reduce in output. A sufficiently severe drought could cause a war between India and Pakistan, and both countries have nukes.
You could dispute that WW3 would cause human extinction, and while I'm not remotely certain of it I think that WW3 could cause extinction, if there's a sufficient combination of 1) climate collapse and 2) worldwide economic collapse that makes high-tech systems impossible to sustain.
Climate change will kill millions in the next ~40-60 years, and if the course isn't corrected, hundreds of millions will follow. Seems like it will do a pretty damn good job of killing us even if it fails to kill all of us.
I agree AGI would be horrendously dangerous and if achieved has a higher chance of complete extinction. However, we don't have AGI and it's still not clear we ever will.
These idiots will believe anything if you put “climate change” in front of it…
“Let’s block out the sun, it will be good for global warming,” says the billionaire as plants and animals freeze and die. WE NEED THE SUN. THEY ARE TRYING TO KILL US OFF. Duh, if this is not self-evident then you are already dead.