When I make a mistake, I'm "wrong"

When GPT-4 does, it's "hallucinating"

I'm not wrong, just had something extra in my tea this morning 🤷‍♀️

@ellie
It's a funny joke! But I think it has to be said, for the sake of those who know less about this stuff, that some may read it as a real criticism, one that gets repeated by smart people so often it's hard to watch, because it perpetuates the popular misconceptions...

Hallucination, in the GPT sense, refers to elaborate fictitious responses to questions about the real world. Calling it "being wrong" fits only if you squint hard enough: the model is very wrong, yet answers without any hint of doubt. That's why it gets its own term.

@ellie When GPT-4 does something right, it's _also_ hallucinating.
@ellie i will have whatever was in your tea...
@yuvipanda super powerful mind altering compounds 🤯
@ellie That's because GPT-4 doesn't make mistakes, it produces nonsense