The use of “hallucinate” is a stroke of true evil genius in the AI world.

In ANY other context we’d just call them errors & the fail rate would be crystal clear.

Instead, “hallucinate” implies genuine sentience & the *absence* of real error.

Aw, this software isn’t shit! Boo’s just dreaming!

@Catvalente But wouldn't "real error" require an attempt to be right? AI is just a probabilistic tool. You don't consider dice to make "errors" even when a throw produces a different result than you hoped for. (Not that you would claim dice to "hallucinate" either, but as another commenter already pointed out, that word has some prior history in the AI context)

@cazfi @Catvalente

MacKay (2003, _Information Theory, Inference, and Learning Algorithms_, Cambridge University Press, <https://web.archive.org/web/20170610174915/https://www.inference.org.uk/itprnn/book.pdf>) calls them "spurious stable states": does that nicely avoid both sets of pitfalls?