I was trying to look up the quote from Phineas and Ferb, but Bing’s Copilot did something even funnier

Further proof of what bullshit generators these things are.

I mean, it clearly is regurgitating something, but it does not understand the semantic context at all

These LLMs are so trained to spit out a confident answer that logic is irrelevant, as long as something is said

It’s kind of like how politicians very rarely answer “I don’t know” to questions, because to not give an answer is a sign of weakness

@yassie_j I love this answer from Copilot! We still have to be very critical about what tools like Copilot are feeding us, but this hasn't changed from all the other tools we have used in the past. Copilot is mostly just an extension to the Bing search results. We still, and always, have to be very critical about anything that we read or hear online. Or anywhere else really. I personally use Copilot on a daily basis today, but it's always with this in mind.
@kaspernymand sorry, you love this answer??? It tried to console me about something that is impossible and not only that, but it gave me an incorrect answer — the actual search results were what I wanted. Copilot was worse than useless; it had negative utility here because it provided an answer that makes no sense. I don’t need a computer to console me, I need a computer to give me the answer I need.
@yassie_j I saw it more as a joke. Everybody knows that you can't be born without your parents being involved. Errors like these, I just see as the funny reality of a tool still being in its early phases of development. Or perhaps it was indeed meant as a joke from Copilot. Nobody knows. You know, it's just like a grandma missing the point of a sentence and answering something completely unrelated. It happens. We just have to laugh about it and move on.

@kaspernymand if everybody knows this, then why doesn’t Copilot? If we’re supposed to rely on these things to provide appropriate answers, then most people would expect an answer rather than a joke, wouldn’t they? If I wanted to look for jokes, I would go to a comedy club, with real humans, instead of a computer.

Do you see the limitation here? I wanted an answer to a question, and the answer was already provided in the search results below this stupid chatbox. Copilot did not make any of those results better. It had negative utility because it wasted my time.

@[email protected] @kaspernymand what "joke" lol, where is the joke? Is Copilot itself, in its entirety, or even the concept of an LLM trying to act like a person altogether, the joke? If so I can get behind that!!~
@HaruEb @yassie_j Answered below. :)
@kaspernymand @[email protected] This makes me violently ill
@HaruEb @yassie_j That's fair. We should only use the tools that we ourselves find useful.
@kaspernymand @[email protected] I just wish to god they wouldn't try to pretend they have human experiences, like feeling "sorry" or understanding even in the slightest what "keeping you company" could possibly mean.

I don't doubt that these types of things can be useful, but they seem uniquely designed to upset me with this stuff, they pretend to be human in the most upsetting ways, at least to me
@HaruEb @kaspernymand @[email protected] No, it's not just you.

You don't want your toaster to empathize with you either.

Especially if you know this thing is just faking it because it is utterly incapable of having any thought or emotion and just regurgitates word samples.
It gives off major uncanny valley vibes. Creepy af.

Completely useless, redundant clutter for bored techbros.

I swear, the more these idiots try to make things "more accessible" the less accessible they get. It's like taking "if it ain't broke, don't fix it" as a challenge for some moronic reason.