I don't like the term "hallucinations" when we talk about AI. Sure, LLMs can get things wrong, but a hallucination is an error in perception, and you can't have an error in perception when there's no one there to perceive. The only hallucinations that are happening are on your side of the keyboard.
@maxleibman That's a great point. What do we call them then? Just "errors"?
@VE3RWJ That I don’t have a good answer to.
@maxleibman Great point about these things not perceiving anything. It's so hard not to anthropomorphize.
@VE3RWJ @maxleibman I like the term “wild extrapolation”.
@masp @maxleibman That's good.