Hacker News

This is an advanced language model that can now run quickly on consumer-grade hardware. You used to need thousands of dollars of GPUs to run a model this sophisticated — now it can be done on a laptop.


Wasn't LLaMA officially meant to run on consumer-grade machines? How does this modify the model to make it work?

All of this is confusing.


Yes, but it wasn't made to run on a Mac. This project ported LLaMA to Apple Silicon, so all the MacBook users can finally play with what the rest of us have had access to for the past couple of weeks.
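For context, a typical workflow with this kind of port looks roughly like the following. This is a sketch assuming the project in question is llama.cpp (its early README followed these steps); the script names, flags, and the assumed `./models/7B/` weights directory may differ from what you have, and the LLaMA weights themselves are not distributed with the project.

```shell
# Sketch: running LLaMA inference on Apple Silicon with an llama.cpp-style port.
# Assumes the original LLaMA 7B weights are already in ./models/7B/ (not included).

git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make   # builds natively on Apple Silicon

# Convert the PyTorch checkpoint to the project's ggml format (f16).
python3 convert-pth-to-ggml.py models/7B/ 1

# Quantize to 4 bits so the 7B model fits in a few GB of RAM.
./quantize ./models/7B/ggml-model-f16.bin ./models/7B/ggml-model-q4_0.bin 2

# Run inference (not training) from a prompt.
./main -m ./models/7B/ggml-model-q4_0.bin -p "Building a website can be done in" -n 128
```

The 4-bit quantization step is what makes laptop-scale RAM workable: the 7B model's f16 weights are roughly 13 GB, while the q4_0 file is around 4 GB.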


Run meaning run inference, not train, right?


Yes




