
If the superlative LLM can’t handle prompts from another provider, it just isn’t the superlative LLM.

This area by definition has no moats. English is not proprietary.

Use case is everything.



Switching to another LLM isn't always about quality. Hosting something yourself at lower or equal quality might be preferred for cost or other reasons; in that case, there's no assumption that the "new" model will produce comparable outputs from prompts written in another LLM's specific style.

In a lot of cases, you can swap models easily enough, but all the prompt tweaking you did originally will probably have to be redone against the new model's black box. A rough sketch of what the swap itself looks like is below.
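A minimal sketch of that point, assuming the self-hosted backend exposes an OpenAI-compatible /v1/chat/completions endpoint (as llama.cpp's server and vLLM commonly do). The URLs, model names, and prompts are illustrative placeholders, not real endpoints:

    # Assumes an OpenAI-compatible chat completions API on each backend.
    # Backend URLs and model names below are hypothetical placeholders.
    import requests

    BACKENDS = {
        "hosted": {"url": "https://api.example.com/v1/chat/completions", "model": "provider-large"},
        "local":  {"url": "http://localhost:8000/v1/chat/completions",   "model": "local-7b"},
    }

    def complete(backend: str, system_prompt: str, user_prompt: str, api_key: str = "") -> str:
        """Send the same prompt to whichever backend is selected by name."""
        cfg = BACKENDS[backend]
        resp = requests.post(
            cfg["url"],
            headers={"Authorization": f"Bearer {api_key}"} if api_key else {},
            json={
                "model": cfg["model"],
                "messages": [
                    {"role": "system", "content": system_prompt},
                    {"role": "user", "content": user_prompt},
                ],
            },
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

    # Swapping backends is a one-word change:
    # complete("local", "You are a terse assistant.", "Summarize this changelog: ...")

The swap itself is trivial; what isn't is that the system prompt and few-shot examples tuned for one model generally need re-tuning before the other backend behaves comparably.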


Hosting something yourself is also worthwhile for educational reasons, just for experimenting; this is how new applications and technologies get discovered and created.



