
I’m not throwing in the towel on Ollama yet. They do need dollars to operate, but they still provide excellent software for running models locally, without paying them a dime.


^ this. As a developer, Ollama has been my go-to for serving offline models. I then use Cloudflare tunnels to make them available wherever I need them.
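For anyone curious, a minimal sketch of that setup, assuming Ollama's default port (11434) and a cloudflared "quick tunnel" (no account needed; a named tunnel requires a Cloudflare account and a config file):

```shell
# Start the Ollama server locally (listens on 127.0.0.1:11434 by default)
ollama serve

# In another terminal, open a quick tunnel to it.
# cloudflared prints a public https://<random>.trycloudflare.com URL
cloudflared tunnel --url http://localhost:11434
```

Clients can then talk to Ollama's HTTP API through the printed URL instead of localhost. Note the quick-tunnel URL is public, so anyone who has it can hit your server; for anything beyond personal tinkering you'd want a named tunnel with access controls in front of it.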



