I don't like the term "hallucinations" when we talk about AI. Sure, LLMs can get things wrong, but a hallucination is an error in perception, and you can't have an error in perception when there's no one there to perceive. The only hallucinations that are happening are on your side of the keyboard.
@maxleibman @europlus I think the layperson definition is closer to "spontaneously random imaginary vision," and the "error" in perception is directly related to having an expectation of a measured observation of reality. Whereas if you shut your eyes and try, you can hallucinate on purpose; there's no error, but it's still hallucinating.
But the layperson definition might require the experience to be vivid before it could get that label.