
Is there a reason to expect it'd be significantly more expensive than current-gen LLMs? Reading the "Implementation Details" section, this was done with GPT2-medium, and assuming running it is about as intensive as the original GPT2, it can be run (slowly) on a regular computer, without a graphics card. It seems reasonable to assume future versions will be around GPT-3/4's price.
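
For reference, here's a minimal sketch of what "run it slowly on a regular computer" looks like, assuming the Hugging Face transformers library and its stock "gpt2-medium" checkpoint (~355M parameters); the prompt is just an arbitrary example:

    # CPU-only generation with GPT-2 medium; no GPU required, just slow.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2-medium")
    model = AutoModelForCausalLM.from_pretrained("gpt2-medium")  # loads on CPU by default

    inputs = tokenizer("The implementation details suggest", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

On a typical laptop CPU this produces a few tokens per second: slow, but enough to confirm it works without a graphics card.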


Agreed! There's also no way this is 5 years away from being viable.

I just checked the timestamps on my Dall-E Mini-generated images. They're dated June 2022.

This is what people were doing on commodity hardware back then:

https://cdn-uploads.huggingface.co/production/uploads/165537...

This is what people are doing on commodity hardware now:

https://civitai.com/images/3853761

I'm not even going to try to predict what we'll be able to do in 2 years' time, even accounting for the current GenAI hype/bubble!


Perhaps not, but it raises the question of whether GPT is affordable for a dev to begin with. I don't know how they would monetize this sort of work, so it's hard to say. But generating game models probably requires a lot more processing power than generating text or static images.



