
It's impossible for humans to know a lot about everything, but LLMs can. So an LLM that sacrifices all that breadth of knowledge for a specific application is no longer an AI, since its shortcomings would show much more obviously.


They're still very bounded systems (not some galaxy brain) and training them is expensive. Learning tradeoffs have to be made. The tradeoffs are just different than in humans. Note that they're still able to interact via natural language!


The world's shittiest calculator powered by a coin battery is an AI. I think you're being overly narrow, or confusing AI with AGI.




