> You are wrong. It does have encoded memory of what it has seen, encoded as a matrix.
Not after it's done generating, and for a chatbot that happens at least every time the user sends a reply back: the model rereads the whole conversation so far and keeps no internal state around between turns.
You could build a model that carries internal state on the side, and some people have done that to generate longer texts, but GPT doesn't.
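A minimal sketch of what that stateless loop looks like, assuming a hypothetical `generate(prompt)` wrapper around the model (not any real API): the only "memory" is the growing text transcript that gets re-fed in full on every turn.

```python
def generate(prompt: str) -> str:
    # Placeholder for the actual model call; a real implementation would
    # tokenize `prompt`, run the transformer over it, and decode a reply.
    return "(model reply)"

def chat() -> None:
    transcript = ""  # plain text; no hidden state survives between calls
    while True:
        user_msg = input("User: ")
        transcript += f"User: {user_msg}\nAssistant: "
        # The model rereads the entire conversation so far each time.
        reply = generate(transcript)
        transcript += reply + "\n"
        print("Assistant:", reply)

if __name__ == "__main__":
    chat()
```

If you dropped the transcript between turns, the model would have no way to recall anything the user said earlier, which is the point: whatever matrices exist during a single generation pass aren't kept afterward.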