When I make a mistake, I'm "wrong"
When GPT-4 does, it's "hallucinating"
I'm not wrong, just had something extra in my tea this morning 🤷♀️
@ellie
It's a funny joke! But for the sake of those who know less about this stuff, it has to be said that some will read it as a genuine criticism, one that gets passed along by smart people so often it's hard to watch, because it perpetuates the popular misconceptions...
GPT's "hallucinations" are elaborate, fictitious responses to questions about the real world. Calling that merely "being wrong" only works if you squint hard enough; the model is very wrong, and confidently so, without any hint of doubt. That's why it gets a name of its own.