thimabi | 22 days ago | on: Ollama Turbo
I’m not throwing in the towel on Ollama yet. They do need dollars to operate, but they still provide excellent software for running models locally without paying them a dime.
recursivegirth | 22 days ago
^ This. As a developer, I've made Ollama my go-to for serving offline models. I then use Cloudflare Tunnels to make them available wherever I need them.
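Roughly, that setup is just Ollama on its default port with a Cloudflare quick tunnel in front of it. Here's a minimal sketch of the client side in Python; the tunnel URL and model name are placeholders, not my actual config, and it assumes something like `cloudflared tunnel --url http://localhost:11434` is already running:

    import requests

    # Ollama listens on localhost:11434 by default; the Cloudflare tunnel
    # gives that port a public hostname. Swap in your own tunnel URL.
    OLLAMA_URL = "https://my-ollama-tunnel.example.com"  # placeholder

    resp = requests.post(
        f"{OLLAMA_URL}/api/generate",
        json={
            "model": "llama3",     # any model already pulled with `ollama pull`
            "prompt": "Why is the sky blue?",
            "stream": False,       # return one JSON object instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["response"])

If you go this route, it's worth putting Cloudflare Access or at least a token check in front of the tunnel, since a bare Ollama endpoint has no auth of its own.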