
> You are wrong. It does have encoded memory of what it has seen, encoded as a matrix.

Not after it's done generating. For a chatbot, that means at least once per user turn: every time the user sends a reply, the model rereads the entire conversation so far and keeps no internal state around between calls.
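A minimal sketch of what that stateless loop looks like, with a made-up `generate()` standing in for the model call (this is not OpenAI's actual API):

    # Stateless chat loop: the attention state (the "matrix") exists only
    # inside generate() and is discarded when it returns.
    def generate(prompt: str) -> str:
        # Hypothetical stand-in for a forward pass through the model.
        return f"[reply conditioned on {len(prompt)} chars of context]"

    history: list[str] = []

    def chat_turn(user_message: str) -> str:
        history.append(f"User: {user_message}")
        # The *entire* transcript is re-fed every turn; that re-read
        # context is the only "memory" the model has.
        reply = generate("\n".join(history))
        history.append(f"Assistant: {reply}")
        return reply

    print(chat_turn("Hello"))
    print(chat_turn("What did I just say?"))  # works only because it re-reads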

You could build a model that has internal state on the side, and some people have done that to generate longer texts, but GPT doesn't.
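For contrast, a sketch of "internal state on the side", loosely in the spirit of recurrent-memory designs like Transformer-XL; every name here is illustrative, not any real library's API:

    # State persists across calls, unlike the stateless loop above.
    class StatefulModel:
        def __init__(self) -> None:
            self.memory: list[str] = []  # carried over between generations

        def generate(self, new_input: str) -> str:
            # Only the new input is read; earlier context survives in
            # self.memory instead of being re-fed as text each turn.
            self.memory.append(new_input)
            return f"[reply using {len(self.memory)} remembered segments]"

    model = StatefulModel()
    model.generate("chapter 1 ...")
    model.generate("chapter 2 ...")  # still "remembers" chapter 1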



Yes, but within my chat session it behaves like a "one-time clone" that is destroyed when the session ends: it has memory unique to that interaction.

There's nothing stopping OpenAI from using all chat inputs to constantly re-train the network (the way a human constantly learns from its inputs).
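A hypothetical sketch of what such continual re-training could look like; nothing here reflects what OpenAI actually does, and every name is made up for illustration:

    # Online learning loop over a stream of logged conversations.
    def fine_tune(weights: dict, transcript: str) -> dict:
        # Stand-in for a gradient update on one conversation's data.
        weights["updates"] = weights.get("updates", 0) + 1
        return weights

    weights: dict = {}
    for transcript in ["session 1 ...", "session 2 ..."]:  # chat log stream
        weights = fine_tune(weights, transcript)  # the model keeps learning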

The limitation is artificial, a bit like many of the arguments here trying to downplay what's happening and how pivotal these advances are.



