And when I find a human hallucinating at the job I absolutely
need them to do, I avoid them where possible too!
But honestly, LLMs are here to stay. I don't like them for zero-verification, high-trust requirements, i.e. when the answer HAS to be correct.
But generating viewpoints, ideas, and even code are great uses - for further discussion and work. A good rubber duck. Or like a fellow work colleague who has some funny ideas but is generally helpful.
The problem is that human beings are far more likely to know what they don't know. And we build a lot of our trusting work environments around that feature. An LLM cannot know what it doesn't know by definition.
> The problem is that human beings are far more likely to know what they don't know.
I’ve spent a career dealing with the complete opposite: people with egos who just cannot bear to admit when they don’t know something, and who will instead dribble absolute shit just as confidently as an LLM does, until you challenge them enough that they decide to pretend the conversation never happened.
It’s why I, someone fairly mediocre, have been able to excel: despite not being the smartest person in the room, I can at least sniff out bullshit.
Yeah sure, some people do this. But average humans understand the limits of their knowledge; LLMs cannot. For a role where that knowledge of limitations is necessary, you can find the right person. You can't find an LLM that does that.
I will grant you that at least some of us are capable of this, where you’ll find no LLM that is.
> average humans understand the limit of their knowledge.
We’ll have to agree to disagree here. I’d call it a minority, not the average.
Which is why we live in a world where huge numbers of people think they know significantly more than they do, and why you will find them arguing that they know more than the experts in those fields. IT workers are particularly susceptible to this.