one thought -- i agree with your sentiment towards AI, but i think the goal of stopping AGI is fruitless. even if we stop OpenAI, there will be companies/entities in other countries that will pick up where OpenAI left off.
there is zero chance of surviving AGI in the long term. if every human were aware of what's going on, like they are aware of many other pressing issues, then stopping AGI would be easy. in comparison to surviving AGI, stopping it is trivial. training these models is hugely expensive in dollars and compute. we could easily inflate the price of compute through regulation. we could ban all explicit research concerning AI or anything adjacent. we could do many things. the fact of the matter is that AGI is detrimental to all humans, and this means that the potential for drastic and widespread action does in fact exist, even if it sounds fanciful compared to what has come before.
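to put rough numbers on "hugely expensive", here is a back-of-envelope sketch in python. every figure in it is an assumption i picked for order of magnitude (parameter count, token count, accelerator throughput, utilization, rental price), not a measured number from any real training run:

    # back-of-envelope cost of a frontier-scale training run
    # every constant below is an assumed round number, not a measured figure
    params = 1e12                      # assume ~1 trillion parameters
    tokens = 10e12                     # assume ~10 trillion training tokens
    train_flops = 6 * params * tokens  # common ~6*N*D estimate of training FLOPs

    gpu_flops = 1e15                   # assume ~1 PFLOP/s peak per accelerator
    utilization = 0.4                  # assume 40% effective hardware utilization
    gpu_hours = train_flops / (gpu_flops * utilization) / 3600

    cost_per_gpu_hour = 2.0            # assume ~$2 per accelerator-hour
    total_cost = gpu_hours * cost_per_gpu_hour
    print(f"~{gpu_hours:,.0f} GPU-hours, ~${total_cost:,.0f}")
    # -> ~41,666,667 GPU-hours, ~$83,333,333

even with these charitable assumptions you land at tens of millions of dollars and tens of millions of GPU-hours, which is exactly the kind of cost that regulation on compute could multiply.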
a powerful international coalition similar to NATO could exclude the possibility of a rogue nation or entity developing AGI. it's a very expensive and arduous process for a small group -- you can't do it in your basement. the best way to think about it is that all we have to do is not do it. it's easy. if an asteroid were about to hit earth, there might be literally nothing we could do about it despite the combined effort of every human. this is way easier. i think it's really ironic that the worst disaster that might ever happen could also be the disaster that was the easiest to avoid.
the price of compute is determined by the supply of compute. that supply comes from a few key factories that are very difficult to build, maintain, and keep supplied -- which makes the whole pipeline highly susceptible to legislation.
how? the same way that powerful international coalitions do anything else... with overwhelming economic and military power.
You can't do it in your basement as of 2023. Very important qualification. It's entirely plausible that the continued evolution of ML architectures will lead to a general AI that anyone can start on a phone or computer, and that can keep learning online from there.
i think we need to "survive AGI".