Did you consider other use cases in which people need custom models and inference, beyond just open-source LLMs?


Yes. Click through to the L40S post the article links to (the L40Ses aren't going anywhere).

There are people doing GPU-enabled inference on Fly.io. That particular slice of the market seems fine?



