The use of “hallucinate” is a stroke of true evil genius in the AI world.

In ANY other context we’d just call them errors & the fail rate would be crystal clear.

Instead, “hallucinate” implies genuine sentience & the *absence* of real error.

Aw, this software isn’t shit! Boo’s just dreaming!

@Catvalente If a coworker said the things the chatbots were saying, you would have to ask why they were lying to you.
@thedoh @Catvalente Or if a coworker was hallucinating at work, you'd eventually get them fired, jailed, or hospitalized.