I don't like the term "hallucinations" when we talk about AI. Sure, LLMs can get things wrong, but a hallucination is an error in perception, and you can't have an error in perception when there's no one there to perceive. The only hallucinations that are happening are on your side of the keyboard.
@maxleibman then what do you suggest?
@Amoshias Anything less anthropomorphized would be an improvement, but it's a losing battle because "hallucinations" has become the term of art.