Why Do AI Chatbots Have Such a Hard Time Admitting ‘I Don’t Know’?
Wrong answers [from AI bots] are known as hallucinations because AI apps like ChatGPT and Gemini express them with total confidence. As AI is integrated into our workplaces, schools and personal lives, these hallucinations pose increasing risks for the people who use the technology. Researchers who once dismissed hallucinations as a relatively minor problem are now working on numerous potential fixes.
Why? Because the geeks who design them can't admit they don't know.
Believe me. I've spent many years working around geeks — professional geeks, even geekier than I am (I know that's a stretch, but hear me out) — and almost (note that ungeeky word) all of them have an unshakable belief that they're right about everything (no backing down on that).
You can see that playing out right now on the national stage. Hallucinations everywhere.
You might see it as kind of cute — precocious, even — in a six-year-old.
Until they figure out a way to fix that, it's up to the rest of us to do the knowing.