
The point is that these problems will follow the same growth trajectory as every other tech bug. In other words, they will eventually go away.

But the Rubicon has still been crossed. There is a general-purpose computer system that understands human language and can write real-sounding human language. That's a sea change.




> "understands human language"

I've got some oceanfront property in Wyoming to sell you.


> will follow the same growth trajectory as every other tech bug

What you're referring to isn't a bug. It's inherent to the way LLMs work. It can't "go away" in an LLM because...

> understands human language

...they don't. They are prediction machines. They don't "understand" anything.
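For concreteness, here's a minimal sketch of what "prediction machine" means: a toy bigram counter over a made-up corpus, nothing like a real transformer, but the objective has the same shape: predict the next token, with no semantics represented anywhere.

    # Toy next-token predictor: count which token follows which, then
    # always emit the statistically most common successor. A real LLM
    # replaces the counts with a learned transformer, but it is still
    # trained to do exactly this: predict the next token.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat the cat ate".split()

    following = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        following[prev][nxt] += 1

    def predict_next(token):
        # No meaning anywhere; pure statistics over observed successors.
        return following[token].most_common(1)[0][0]

    print(predict_next("the"))  # -> "cat" (followed "the" twice; "mat" once)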


> What you're referring to isn't a bug. It's inherent to the way LLMs work. It can't "go away" in an LLM because...

The 'bug' presented above is a simple case of the model not understanding correctly. Larger models, mixture-of-experts (MoE) models, models with truth gauges, better selection functions, etc. will make this better in the future. A toy sketch of the MoE idea follows below.
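To be concrete about the MoE point: a gating (selection) function scores the experts for each input and routes it to the top-scoring one. The gate and experts below are made-up toy functions, not taken from any real model.

    import math

    def softmax(xs):
        m = max(xs)
        exps = [math.exp(x - m) for x in xs]
        s = sum(exps)
        return [e / s for e in exps]

    # Toy gate: linear scores for two experts. In a real MoE layer the
    # gate is a learned projection over the token's hidden state.
    def gate(x):
        return softmax([0.8 * x, -0.3 * x + 1.0])

    experts = [lambda x: 2 * x,    # "expert 0"
               lambda x: x * x]    # "expert 1"

    def moe(x):
        # Top-1 routing: only the highest-scoring expert runs, so total
        # capacity grows without every expert firing on every input.
        weights = gate(x)
        best = max(range(len(experts)), key=lambda i: weights[i])
        return experts[best](x)

    print(moe(3.0))  # gate prefers expert 0 for large positive x -> 6.0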

> ...they don't. They are prediction machines. They don't "understand" anything.

Implementation detail.


Prediction without understanding is just extrapolation. I think you're just extrapolating your own prediction about the abilities of future LLM-based prediction machines.


What do you mean by "understand"?



