The use of “hallucinate” is a stroke of true evil genius in the AI world.

In ANY other context we’d just call them errors & the fail rate would be crystal clear.

Instead, “hallucinate” implies genuine sentience & the *absence* of real error.

Aw, this software isn’t shit! Boo’s just dreaming!

@Catvalente Yes, I couldn’t agree more. In terms of mental phenomenology (if you’re going to use human terms to describe these things at all), the AI’s mistakes cannot be hallucinations, because they aren’t grounded in any sensory experience.

“Confabulation” is the more accurate term: confidently inventing plausible details to fill gaps, with no awareness that anything is missing.