Maybe a better analogy for people who still think "chat AI" knows what it's saying is a parrot. Parrots can talk like humans, but they have no idea what they're saying. They're just repeating patterns of sound they've heard us make. If you ask a parrot a question enough times, it can even learn that after the sound pattern of your question comes another pattern: the one that makes up the answer. It still doesn't know what you're asking, or what the answer means. It's just making sound patterns.

These AIs are doing the exact same thing, just with letters instead of sounds. The output doesn't mean anything to them, because they don't understand any of it, because IT'S A FUCKING PARROT, BRENT
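(The pattern-completion idea above can be sketched in a few lines of code. This is my own toy illustration, not anything from the thread: a tiny Markov chain that records which word follows which, then "parrots" back statistically plausible text with zero understanding of it. Real language models are vastly more sophisticated, but the learn-patterns-then-continue-them loop is the same shape.)

```python
import random
from collections import defaultdict

def train(text):
    """Record, for each word, the words seen immediately after it."""
    pairs = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        pairs[a].append(b)
    return pairs

def parrot(pairs, start, length=8):
    """Continue from `start` by repeatedly picking a word that
    followed the previous one in training -- pure pattern echo,
    no notion of what any word means."""
    out = [start]
    for _ in range(length):
        followers = pairs.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return " ".join(out)

corpus = "polly wants a cracker polly wants a treat"
model = train(corpus)
print(parrot(model, "polly"))
```

Run it and you get sentences like "polly wants a treat polly wants a cracker": fluent-looking output from a process that has only ever counted which sound pattern follows which.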

@eniko I'm pretty sure my little cockatiel buddy makes sounds that are meaningful to him, and I just don't understand his context because I'm not a bird.

AI doesn't make statements that are meaningful to it. It doesn't care what it says. Its output is like a weird Rorschach test that you assign meaning to.