


"useful hallucination" so much AI glazing its crazy

I'm still a fan of the standard term "lying." Intent, or a lack thereof, doesn't matter. It's still a lie.


Intent does matter if you want to classify things as lies.

If someone told you it's Thursday when it's really Wednesday, we would not necessarily say they lied. We would say they were mistaken, if the intent was to tell you the correct day of the week. If they intended to mislead you, then we would say they lied.

So intent does matter. AI isn't lying, it intends to provide you with accurate information.


The AI doesn't intend anything. It produces, without intent, something that would be called lies if it came from a human. It produces the industrial-scale mass-produced equivalent of lies – it's effectively an automated lying machine.

Maybe we should call the output "synthetic lies" to distinguish it from the natural lies produced by humans?


There is actually an acknowledged term of art for this: "bullshit".

Summary from Wikipedia: https://en.m.wikipedia.org/wiki/Bullshit

> statements produced without particular concern for truth, clarity, or meaning, distinguishing "bullshit" from a deliberate, manipulative lie intended to subvert the truth

It's a perfect fit for how LLMs treat "truth": they don't know, so they can't care.


I’m imagining your comment read by George Carlin … if only he were still here to play with this. You know he would.


Elwood: What was I gonna do? Take away your only hope? Take away the very thing that kept you going in there? I took the liberty of bullshitting you.

Jake: You lied to me.

Elwood: Wasn't lies, it was just... bullshit.


AI doesn’t have “intent” at all.


> So intent does matter. AI isn't lying, it intends to provide you with accurate information.

Why are we making excuses for machines?


If intent doesn't matter, is it still lying when the reality happens to coincide with what the machine says?

Because the term in the OP seems way more descriptive and easier to generalize.


So you're saying deliberate deception, mistaken statements and negligent falsehoods should all be considered the same thing, regardless?

Personally, I'd be scared if LLMs were proven to be deliberately deceptive, but I think they currently fall into the latter two camps, if we're doing human analogies.


Have you asked your LLMs if they're capable of lying?

Did the answers strike you as deceptive?


“Glazing” is such performative rhetoric it's hilarious


It's (I'm pretty sure) Gen Z slang. I only started hearing it sometime this year.




