This is like asking, "Why should I worry about a UDP interface that I send 100 bytes of data to, and it sends 10,000 bytes back?"
Asymmetric effects operate outside of current power dynamics: they can upset entire systems and cause rapid collapse. In the case of an internet DDoS, it's a botnet sending out a few MB/s of data and generating GB/s of attack traffic. In the case of a superintelligent AI, it's _______?
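The amplification in both examples is simple arithmetic; a quick sketch (the 100-byte/10,000-byte figures come from the comment above, the botnet byte rates are illustrative stand-ins for "a few MB/s" and "GB/s"):

```python
def amplification_factor(bytes_in: float, bytes_out: float) -> float:
    """Ratio of traffic generated to traffic sent in a reflection/amplification attack."""
    return bytes_out / bytes_in

# UDP reflection example: send 100 bytes, get 10,000 bytes back.
print(amplification_factor(100, 10_000))  # 100.0, i.e. 100x amplification

# Botnet example (illustrative rates): ~3 MB/s of requests yielding ~3 GB/s of attack traffic.
print(amplification_factor(3e6, 3e9))     # 1000.0, i.e. 1000x amplification
```

The point of the analogy is that the attacker's cost scales with `bytes_in` while the damage scales with `bytes_out`, so the ratio between them is what matters.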
Sure, even nuclear annihilation is asymmetric - the amount you spend on making the bombs is much less than the amount you can destroy with them. But that doesn't automatically imply that there are routes to global destruction that are arbitrarily cheap.
The implication seems to be that a superintelligent AI will somehow figure out a way to destroy the world that requires significantly less than the "huge capital and expertise" needed for the nuclear scenario. But that's a critical assumption that requires justification, not something to be left as an exercise for the reader.
How much does genetic manipulation cost today, and how much will it cost in the future?
Bio-terrorism is a potentially massive risk as technologies for customized gene therapies continue to proliferate. If you have the tooling and knowledge to make rice produce a bountiful crop, it's highly likely that skillset transfers easily to dramatically harming the rice crop via disease. Could a genetic scientist do this now? Quite possibly; and if you have a scientist in a box with no ethics and all the time in the world, you're apt to get pretty far.