Is there a reason to expect it'd be significantly more expensive than current-gen LLMs? Reading the "Implementation Details" section, this was done with GPT2-medium, and assuming running it is about as intensive as the original GPT2, it can be run (slowly) on a regular computer without a graphics card. It seems reasonable to assume future versions will be around GPT-3/4's price.
Perhaps not, but it raises the question of whether GPT is affordable for a dev to begin with. I don't know how they would monetize this sort of work, so it's hard to say. But running game models probably requires a lot more processing power than generating text or static images.