
You're making a specific claim here: "it was taught to react in specific way on specific word", and I'm giving you a specific example of an invented word that GPT used to full effect when composing. Can you explain how it was "taught to react" to this word in a way that lets it use the word with a new meaning that wasn't even in its training set?


I answered you: GPT is a pattern extrapolator and was trained to be one, so creating a new word fits pattern extrapolation perfectly.


How about using that invented word consistently afterwards?

It seems to me that the argument you're trying to make can be extended more or less indefinitely, though. At some point, "it's a pattern extrapolator" stops being an explanation and becomes a largely irrelevant piece of trivia.



