
This doesn't appear to indicate whether the model is running locally, so I assume it's not. I'll continue to run Ollama locally in my terminal on the rare occasions that I see a use for it.


The model is running locally; you can check by turning off the Wi-Fi.
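
A quicker check than pulling the plug: Ollama serves its API on localhost (port 11434 by default), so a request against that endpoint that still succeeds with networking disabled shows inference is happening on your machine. A minimal sketch in Python, stdlib only; the model name "llama3" is just a placeholder for whatever you've pulled:

    # Query Ollama's default local endpoint. If this works with the
    # Wi-Fi off, the model is running on your machine.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3",   # placeholder; use a model you've pulled
        "prompt": "Say hello in five words.",
        "stream": False,     # return one JSON object, not a stream
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local API
        data=payload,
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["response"])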



