Hacker News

Maybe I am too pessimistic about the future, but I am afraid we will see _embedded ads_ in responses generated by ChatGPT and similar bots.

I don't think we will be able to run open-source versions of models that are as good as the proprietary ones, similar to how we can't run Google locally.




Running locally is not necessarily open source. I am quite sure Apple is cooking up something we can run locally, but it won't be open source. And you can run Elasticsearch locally, which does enough of what Google does; it's the index size you cannot host locally cheaply. With LLMs you don't have that issue: for the price of a car, you can run GPT-4-class inference at home. I'd guess that for the same money, you could run Elasticsearch with at least a good portion of the web indexed. It'll get cheaper fast, and the gap in quality between local and remote will get smaller. It has to, if these companies want to make any money instead of burning through billions of investment.
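The "price of a car" claim above can be sanity-checked with a back-of-envelope sketch. All figures here (budget, GPU price, VRAM, parameter count, quantization) are illustrative assumptions, not sourced numbers:

```python
import math

# Illustrative assumptions only -- none of these figures are sourced.
CAR_BUDGET_USD = 50_000   # assumed mid-range car price
GPU_PRICE_USD = 6_000     # assumed price of a 48 GB workstation GPU
GPU_VRAM_GB = 48
MODEL_PARAMS_B = 400      # assumed parameter count for a GPT-4-class model
BYTES_PER_PARAM = 0.5     # assuming 4-bit quantized weights

def gpus_needed(params_b, bytes_per_param, vram_gb, overhead=1.2):
    """GPUs required to hold the quantized weights, with ~20% headroom
    for KV cache and activations (overhead factor is an assumption)."""
    weights_gb = params_b * bytes_per_param * overhead
    return math.ceil(weights_gb / vram_gb)

n = gpus_needed(MODEL_PARAMS_B, BYTES_PER_PARAM, GPU_VRAM_GB)
cost = n * GPU_PRICE_USD
print(f"{n} GPUs, ~${cost:,} -> fits the car budget: {cost <= CAR_BUDGET_USD}")
```

Under these assumptions the GPUs alone land well inside the budget, which is consistent with the comment's claim; swap in your own numbers to stress-test it.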




