The current state of the art in AI image generation was unimaginable just a few years ago. The idea that it'll stay as-is for the next century seems... silly.
If you're talking about some sort of nonexistent sci-fi future "AI" that isn't just log-likelihood optimization, then such a fantastical thing most likely wouldn't be running on NVIDIA GPUs with CUDA.
This hardware is only good for current-generation "AI".
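For concreteness, here's roughly what "log-likelihood optimization" boils down to in practice. This is a minimal sketch assuming PyTorch; the toy model, data, and hyperparameters are all made up for illustration. The point is that current-generation training is gradient descent on a negative log-likelihood, and the reason NVIDIA hardware matters is that the heavy math inside it runs as CUDA kernels:

```python
# Minimal sketch of "log-likelihood optimization" (assumes PyTorch;
# the toy model and fake data below are illustrative only).
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"  # CUDA is where the speedup lives

# Tiny classifier standing in for a much larger generative model.
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
nll = nn.CrossEntropyLoss()  # cross-entropy = negative log-likelihood over classes

x = torch.randn(32, 16, device=device)          # fake inputs
y = torch.randint(0, 10, (32,), device=device)  # fake labels

for step in range(100):
    opt.zero_grad()
    loss = nll(model(x), y)  # -log p(y | x): the quantity being minimized
    loss.backward()          # gradients run as CUDA kernels when on GPU
    opt.step()
```

Strip away the scale and that loop is essentially the whole trick. Hardware built to make that loop fast says nothing about whatever a genuinely different future "AI" would need.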