Needs a big honking datacenter or billions of compute credits, plus 6-12 months of safety work.


Doesn't Alpaca seem to suggest that assumption is no longer true?


Alpaca rides on LLaMA, and LLaMA was pre-trained on 1T tokens over a long run. The fine-tuning takes an hour with low-rank adaptation (LoRA), but pre-training a new base model still takes months.
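
To make the split concrete, here is a minimal sketch of the cheap part, assuming the Hugging Face transformers and peft libraries; the checkpoint path is a placeholder, not a real repo:

    # Minimal LoRA setup sketch (assumes transformers + peft installed).
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import LoraConfig, get_peft_model

    base = "path/to/llama-7b-hf"   # placeholder for a local LLaMA checkpoint
    model = AutoModelForCausalLM.from_pretrained(base)
    tokenizer = AutoTokenizer.from_pretrained(base)

    # Freeze the base weights and attach small trainable low-rank adapters
    # to the attention projections.
    lora_cfg = LoraConfig(
        r=8,                                   # rank of the update matrices
        lora_alpha=16,                         # scaling factor
        target_modules=["q_proj", "v_proj"],   # LLaMA attention projections
        lora_dropout=0.05,
        task_type="CAUSAL_LM",
    )
    model = get_peft_model(model, lora_cfg)
    model.print_trainable_parameters()  # typically well under 1% of the weights

Only those adapter weights get trained on the instruction data, which is why the fine-tune finishes in hours on a single GPU; the 1T-token pre-training that produced the frozen base weights is the part that needs the datacenter.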
