I was trying to look up the quote from Phineas and Ferb, but Bing’s Copilot did something even funnier

Further proof of what bullshit generators these things are.

I mean, it clearly is regurgitating something, but it does not understand the semantic context at all

These LLMs are so trained to spit out a confident answer that logic is irrelevant, as long as something is said

It’s kind of like how politicians very rarely answer “I don’t know” to questions, because to not give an answer is a sign of weakness

@yassie_j I love this answer from Copilot! We still have to be very critical about what tools like Copilot are feeding us, but this hasn't changed from all the other tools we have used in the past. Copilot is mostly just an extension to the Bing search results. We still, and always, have to be very critical about anything that we read or hear online. Or anywhere else really. I personally use Copilot on a daily basis today, but it's always with this in mind.
@kaspernymand sorry, you love this answer??? It tried to console me about something that is impossible, and not only that, but it gave me an incorrect answer — the actual search results were what I wanted. Copilot was more than useless; it had negative usability here because it provided an answer that makes no sense. I don’t need a computer to console me, I need a computer to give me the answer I need.
@yassie_j I saw it more as a joke. Everybody knows that you can't be born without your parents being involved. Errors like these, I just see as the funny reality of a tool still being in its early phases of development. Or perhaps it was indeed meant as a joke from Copilot. Nobody knows. You know, it's just like a grandma missing the point of a sentence and answering something completely unrelated. It happens. We just have to laugh about it and move on.

@kaspernymand if everybody knows this, then why doesn’t Copilot? If we’re supposed to rely on these things to provide appropriate answers, then most people would expect an answer rather than a joke, wouldn’t they? If I wanted to look for jokes, I would go to a comedy club, with real humans, instead of a computer.

Do you see the limitation here? I wanted an answer to a question, and the answer was already provided in the search results below this stupid chatbox. Copilot would not have made any of those results any better. This has negative utility because it wasted my time.

@yassie_j I can tell you that I just prompted Copilot the exact same words - and today it knows the answer. Again, it is just the reality of a tool in its early phases of development. This just shows that these tools learn very quickly - because today, it understands what you might be referring to. As I said to begin with, we just have to be very critical. Just like when we read or hear anything off of various social media platforms or anywhere else. That will never change.

@kaspernymand I think you misunderstood.

I do NOT want a computer to try to be relatable to me.

I do NOT want a computer to be friendly with me.

I do NOT want a computer to waste my time, screen real estate, and bandwidth with small talk.

The computer serves only one purpose: to deliver information to me.

I do NOT want a micro-essay before I get the answer I want.

That entire discussion response was useless because the search results already had the answer, without any fluff or huff.

There is no way this is an improvement over search.

You techcucks are so eager to chase the newest, shiniest thing, one that does nothing better, that you forget that existing tools already serve the appropriate purpose.

@[email protected] @kaspernymand
I agree (and disagree? idk) with both of you because, while it is true generative AI is getting better, I also think it has no place in search results, or in places where solid information is required or provided. This is due to the nature of generative AI, and no improvement will fix this.
What I'd like to see from our search engines is, perhaps, semantic AI - AI that is trained to process human intention and fix mistakes, as well as provide information that already exists with no chatter, bluffing, or consolation. Basically, semantic AI would be a supercharged Google Search.

I dunno, though. I'm not too big in the AI scene, so I wouldn't care anyway. All I need is for sites to keep generative AI away from informational databases, thank you
@mtr @yassie_j I do see your point there. It is indeed problematic if an AI tool generates an answer based on incomplete information. Or if it picks snippets of information from different sources and draws a relation between two unrelated topics. This has happened to me personally. This is indeed a problem - and it could also cause problems, if people are not critical about what they're served. I learned, though. It's a lesson we all have to learn, I think. Not just about AI, but about anything.
@kaspernymand @mtr @yassie_j bro, why do you write like an AI?