felipefar 4 days ago

Yes, most people who have an incentive to push AI say that hallucinations aren't a problem, since humans aren't correct all the time either.

But in reality, hallucinations either make people using AI lose a lot of time steering the LLMs out of dead ends, or render those tools unusable.

Gormo 3 days ago

> Yes, most people who have an incentive to push AI say that hallucinations aren't a problem, since humans aren't correct all the time either.

Humans often make factual errors, but there's a difference between having a process for validating claims against external reality and occasionally getting it wrong, and having no such process at all, with every output being the product of internal statistical inference.

The LLM is engaging in the same process in all cases. We only call it a "hallucination" when the output isn't consistent with our external expectations. But if we take "hallucination" to mean any situation where the output of a wholly endogenous process is mistaken for externally validated information, then LLMs are only ever hallucinating; they're just designed so that what they hallucinate has a greater-than-chance likelihood of representing some external reality.
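To make that concrete, here's a toy sketch: a bigram sampler over a made-up two-sentence corpus (nothing like a real LLM, just an illustration of the point). Whether the sampled continuation happens to be true ("france") or false ("texas"), the generating procedure is identical, and there is no validation step anywhere in it.

    import random

    corpus = "paris is the capital of france . paris is the capital of texas ."
    tokens = corpus.split()

    # Count bigram frequencies: this is the model's entire "knowledge".
    counts = {}
    for prev, nxt in zip(tokens, tokens[1:]):
        counts.setdefault(prev, {}).setdefault(nxt, 0)
        counts[prev][nxt] += 1

    def sample_next(prev):
        # Sample purely from internal statistics; no truth check
        # exists anywhere in this procedure.
        options = counts[prev]
        words = list(options)
        weights = [options[w] for w in words]
        return random.choices(words, weights=weights)[0]

    # Same sampling process either way; one answer just happens to
    # match external reality and the other doesn't.
    print("paris is the capital of", sample_next("of"))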

jimbokun 3 days ago

> Yes, most people who have an incentive to push AI say that hallucinations aren't a problem, since humans aren't correct all the time either.

We have legal and social mechanisms in place for the way humans are incorrect. LLMs are incorrect in new ways that our legal and social systems are less prepared to handle.

If a support human lies about a change to policy, the human is fired and management communicates about the rogue actor, the unchanged policy, and how the issue has been handled.

How do you address an AI doing the same thing without removing the AI from your support system?

anonzzzies 3 days ago

Still fine the company that uses the AI? Hallucinations can't be prevented with the current state of AI, so you'll need a disclaimer, and, if a user cancels, surface that user's latest support chats in the CRM so you can put a human in the loop.
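For what it's worth, a minimal sketch of that workflow; every name here (fetch_chats, attach_transcripts, flag_for_review, the 30-day window) is a hypothetical stand-in, not any real CRM's API:

    from datetime import datetime, timedelta

    def on_user_cancelled(user_id, chat_store, crm):
        # Pull this user's recent support chats (30-day window is an assumption).
        since = datetime.utcnow() - timedelta(days=30)
        chats = chat_store.fetch_chats(user_id=user_id, since=since)

        # Attach the transcripts to the CRM record and flag it for human
        # review, so someone can check what the AI actually told the user.
        crm.attach_transcripts(user_id, chats)
        crm.flag_for_review(user_id, reason="cancellation: review AI support answers")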