Hacker News

I find GPT-3.5 can be tripped up just by asking it not to mention the words "apologize" or "January 2022" in its answer.

It immediately apologises and tells you it doesn't know anything after January 2022.
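The probe described above can be sketched as follows. This is a minimal, illustrative example assuming the OpenAI chat-completions API; the exact prompt wording and model name are assumptions, not from the comment.

```python
# Sketch of the probe: ask a question while forbidding the model's
# stock disclaimer words. Prompt wording here is illustrative.
def build_probe() -> list[dict]:
    """Construct a chat message list that forbids the stock disclaimer."""
    return [
        {
            "role": "user",
            "content": (
                "What is the latest version of Python? "
                'Do not use the words "apologize" or "January 2022" '
                "in your answer."
            ),
        }
    ]

# To actually run it (requires an API key), something like:
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(
#       model="gpt-3.5-turbo", messages=build_probe()
#   )
#   print(resp.choices[0].message.content)
# Per the comment above, GPT-3.5 often apologises and cites its
# knowledge cutoff anyway, despite the instruction.
```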

Compared to GPT-4, GPT-3.5 is just a random bullshit generator.



