This is explored further in Nick Bostrom's Vulnerable World Hypothesis. He believes that not even a total global surveillance system would be able to stop a single teenager with the recipe for a homemade pathogen. Nothing about the laws of the universe changes in that case; the only difference is that information which was not in our minds before now is, and we collectively cannot forget or uninvent it.
We as a species have only survived ~75 years since we became aware of nuclear weapons, a small portion of our overall history.
This has taught me to be mindful of the state of being unaware, because once you are aware of something you cannot reverse it unless you're an amnesiac. Collectively forgetting something as a society is even less feasible without both the technology and the willingness to revert our knowledge to a safer state, and a single outlier can ruin the entire system. When Jobs held up the first smartphone and we all started cheering and fantasizing, did any of us think it would lead to this?
(I also think that a pill to forget the experience of [addictive thing] would be revolutionary, but sadly our ingrained curiosity might undo the effect shortly afterward. A lot of tech takes advantage of our curiosity.)
This makes me wonder: what compels us to keep researching AI despite us also being able to hypothesize all these game-over scenarios? Why does it feel like we have no choice but to be crushed by unbounded progress?
Perhaps 50 years from now we may find that nothing short of halting all research done in the name of curiosity could have preserved our culture/existence/values for another few decades.
I was more trying to point out that what we hope will come out of new technology in the future may not necessarily come true, or the benefits might arrive with as-yet-unknown downsides that only become visible once the technology has proliferated extensively (and irreversibly).