Hacker News

Don't know why this gets downvoted. It is simply true. In most cases, GPT-4 gets it right. You have to check anyway.


I agree. Also, I've had good luck asking GPT-4 to cite sources with its replies. It speeds up the process of fact-checking, and makes "hallucination" detection trivial. (Does the link 404?)

Obviously it's not perfect, but it's no worse than asking human coworkers, who are also sometimes wrong. (I'd prefer not to interrupt my human coworkers with questions that have searchable answers anyway.)
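The "does the link 404?" check above is easy to automate. Here's a minimal sketch in Python using only the standard library; `link_is_dead` is a made-up helper name, and the demo spins up a throwaway local server rather than hitting real sites:

```python
import http.server
import threading
import urllib.error
import urllib.request

def link_is_dead(url: str, timeout: float = 5.0) -> bool:
    """Return True if the link 404s (or otherwise fails) -- a cheap hallucination check."""
    req = urllib.request.Request(url, method="HEAD")  # HEAD avoids downloading the body
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status >= 400
    except urllib.error.HTTPError as err:
        return err.code >= 400        # 404, 410, etc.: link is dead
    except (urllib.error.URLError, TimeoutError):
        return True                   # unresolvable host or timeout: treat as dead

# Demo against a local server where /real exists and everything else 404s.
class _Handler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(200 if self.path == "/real" else 404)
        self.end_headers()
    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), _Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

print(link_is_dead(f"{base}/real"))          # False
print(link_is_dead(f"{base}/hallucinated"))  # True
```

Note this only catches citations that point nowhere; a link can resolve fine and still not support the claim, so you still have to read it.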



