
> Haven’t you ever had the experience of knowing exactly what you want to say but struggling to find the word for it?

I came here to suggest this exact thing. Humans clearly have some understanding of the meaning behind words and sentences. However, I think it's wrong, an overgeneralization, to suggest that ChatGPT is just statistically predicting the next word. While that may technically be true, I think that buried deep within it are ways of modeling/encoding common concepts and ideas, maybe similar to how a human models concepts and ideas in their mind.

Then there's the whole problem of consciousness, but I won't get into that here.



>I think buried deep within it has ways of modeling/encoding common concepts and ideas, maybe similar to how a human models concepts and ideas in their mind

Right, maybe meaning and concepts are just "things that usually go together," at different levels of distance.

We're comparing our minds, which we know very little about, to LLMs, which are black boxes. That seems like a lot of certainty in the face of such vast ignorance.
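The "things that usually go together" idea is roughly what embedding spaces capture: words are mapped to vectors, and related concepts end up at smaller distances. Here's a minimal sketch with made-up toy vectors (the values are purely illustrative, not from any real model), using cosine similarity as the "level of distance":

```python
import math

# Toy embedding vectors (hypothetical values, for illustration only):
# each word is a point in a shared space, and words that "usually go
# together" end up pointing in similar directions.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.85, 0.75, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

related = cosine_similarity(embeddings["king"], embeddings["queen"])
unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])
print(related > unrelated)  # related concepts sit closer in the space
```

Real models learn these vectors from co-occurrence statistics across huge corpora, so "meaning" falls out of exactly the "usually goes together" signal.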



