Now I'm going to go and write a wrapper around llama.cpp that is fully open source and truly local.
How can I trust Ollama not to sell my data?
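For what it's worth, a truly local wrapper doesn't need much code. Here's a rough Python sketch that just shells out to llama.cpp's CLI binary, so nothing ever leaves the machine. The binary name (`llama-cli`, formerly `main`), the flags, and the model path are assumptions based on a recent llama.cpp build and may differ in yours.

```python
import subprocess

def generate(prompt: str,
             model_path: str = "./models/model.gguf",  # placeholder path to a local GGUF model
             binary: str = "./llama-cli",              # llama.cpp CLI binary; older builds name it ./main
             n_predict: int = 128) -> str:
    """Run llama.cpp entirely locally and return the raw generated text."""
    result = subprocess.run(
        [binary, "-m", model_path, "-p", prompt, "-n", str(n_predict)],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(generate("Explain why local inference matters, in one sentence."))
```

Everything runs as a local subprocess with no network access, which is the whole point; a nicer version would stream tokens or talk to llama.cpp's built-in HTTP server instead.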
You don't need to use Turbo mode; it's just there for people who don't have capable enough GPUs.