Maybe a better analogy for people who still think "chat AI" knows what it's saying is that it's like a parrot. Parrots can talk like humans, but they don't have any idea what they're saying. They're just repeating patterns of sound they've heard us make. If you ask a parrot a question enough times, it can even learn that the sound pattern of your question is followed by another pattern that makes up the answer. It still doesn't know what you're asking it, or what the answer means. It's just making sound patterns.

These AIs are doing the exact same thing with letters. It doesn't mean anything, because it doesn't understand any of it, because IT'S A FUCKING PARROT, BRENT.

Now, parrots are cool! Their ability to seemingly answer human questions is a really neat party trick!

But you wouldn't replace human answers with parrot-provided answers in a serious setting with any kind of stakes. That's exactly what relying on something like ChatGPT is.

@eniko Also it's hard to see how a cool party trick is going to cause some kind of disruptive revolution. At this point it mainly just costs a lot of money.
@Tijn enough hype will do a lot of things, see cryptocurrencies
@mmby I think calling AI "a cool party trick" is a good way to talk about this stuff going forward haha
@Tijn I mean, it will be very disruptive if they start replacing jobs like therapists with party tricks >_>

@Tijn @eniko For some applications, pattern reactions are sufficient: transcription and translation AI, AI art TOOLS (used by an artist), search engine augmentation (well, anything would be better than 2020s Google), and many other places. These are disruptive tools (and you can already see the results).

But the key point is that even the AI industry thinks a true AI is 10-30 years off.

@eniko it's an absolute *novelty*, but as time goes on it becomes clear that its ability to function as a parrot means it's only useful for *boilerplate code*. And even that's kinda overselling it, because sometimes it generates code that just doesn't actually do anything.

@eniko I don’t think it’s fair to compare ChatGPT with parrots.

Parrots are much smarter than ChatGPT.

@bgolus @eniko ye but parrots can't live in the cloud.
@eniko I mostly agree with this, BUT it depends on the parrot/species. There are some parrot species who can learn to understand human words and can use them the same as a human (to an extent, although the exact specifics are up for debate, and likely will be for a long time). There are like 350 species of parrot, so broad generalizations and all that.
@eniko In general tho, yes. Most common parrot species learn words, learn contexts/situations to use specific words that get "rewards" (human interactions, treats, etc), and use them not really caring about the "meaning" of the word.
@eniko Thus the name of the famous paper.
@dalias what paper?
@eniko Stochastic Parrots, the one Google ousted and maligned Timnit Gebru over.
@eniko This actually feels like a great example for explaining why it can get things horribly wrong at times while sounding confident that it makes sense: regardless of whether the parrot wants a cracker, it has heard that it should say it wants one.

@eniko this must be the best analogy I've heard on this topic, or on AI in general, so far.

I'm gonna steal it for using IRL kthx 🦝

@eniko just like how Eliza fooled people almost 60 years ago, chatgpt is fooling people today
@eniko even this does parrots a disservice tbh

@eniko Bad comparison, because that's how humans learn languages too: they repeat (for them) gibberish, until reinforcement by adults creates a structure:

A word like "ma" is a very common term for "mother" in human language, simply because this is one of the first sounds human babies make, and their mother is usually the first person that reacts to it. Soon the baby will learn to associate "ma" with the person providing food.

Whereas an AI is not rewarded for saying the "right" things; it is, if anything, eliminated for getting it wrong.

@eniko The only distinction or clarification is that the AI creates patterns billions of associations long, and interrelates the things it has already 'spoken' recently with the new things it is about to say.

@eniko I'm pretty sure my little cockatiel buddy makes sounds that are meaningful to him, and I just don't understand his context because I'm not a bird.

AI doesn't make statements that are meaningful to it. It doesn't care what it says. It feels like a weird Rorschach test that you assign meaning to.

@eniko what if you give a parrot a Chinese translation dictionary