No one should say that a chatbot "hallucinates". Chatbots do not have minds; they manipulate text. Hallucination requires not only consciousness but also a physical brain that falsely perceives a sensation as real. Machine learning models have neither consciousness nor physical form, and they never will.

#AIHype #mathymath

@annedrewhu Yes! It seems to me that everyone is submitting to using the term even if they are not convinced!

https://dair-community.social/@OmaymaS/110017912767830605

Omayma (@[email protected])

I have a feeling that the term "Hallucination" of Large Language Models stuck and many people will regret it for years to come like the term AI.
