I agree. Also, I've had good luck asking GPT-4 to cite sources in its replies. It speeds up fact-checking and makes "hallucination" detection trivial. (Does the link 404?)
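
For what it's worth, the 404 check is easy to script. Here's a rough sketch in Python using only the standard library; the URLs are made-up placeholders standing in for citations pulled out of a reply, and since some sites reject HEAD requests or block bots, treat it as a starting point rather than a definitive checker:

    import urllib.request
    import urllib.error

    def link_is_dead(url: str, timeout: float = 10.0) -> bool:
        """Return True if the cited URL 404s (or can't be reached at all)."""
        request = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(request, timeout=timeout) as response:
                return response.status >= 400
        except urllib.error.HTTPError as err:
            return err.code >= 400
        except urllib.error.URLError:
            return True  # unreachable host counts as a failed citation

    # Hypothetical citations extracted from a model reply
    cited_urls = [
        "https://example.com/real-page",
        "https://example.com/made-up-citation",
    ]

    for url in cited_urls:
        print(("DEAD " if link_is_dead(url) else "OK   ") + url)

A live link obviously doesn't prove the citation supports the claim, but a dead one is a cheap, immediate red flag.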
Obviously it's not perfect, but it's no worse than asking human coworkers, who are also sometimes wrong. (I'd prefer not to interrupt my human coworkers with questions that have searchable answers anyway.)