@Catvalente I keep telling people that they're just helping the scam when they use terms like "AI" to describe a thing that, by definition, can never become anything remotely close to AI. You're right. Using "hallucination" and any other terms the actual scammers come up with just helps their scam too.
This is why I almost always tell people not to call it "AI." Preferably, refer to the specific tech involved, like "LLM" or "image diffusion." And don't refer to the inevitable errors as "hallucinations," because that term doesn't even make sense in the context of how these systems actually function...
(To the people saying "technically it's not actually an error": it is a logic/correctness error. But even putting that aside, it sure AF is not a "hallucination")