I was trying to look up the quote from Phineas and Ferb, but Bing’s Copilot did something even funnier

Further proof of what bullshit generators these things are.

I mean, it clearly is regurgitating something, but it does not understand the semantic context at all

These LLMs are so trained to spit out a confident answer that logic is irrelevant, as long as something is said

It’s kind of like how politicians very rarely answer “I don’t know” to questions, because to not give an answer is a sign of weakness

@yassie_j I love this answer from Copilot! We still have to be very critical about what tools like Copilot are feeding us, but that's no different from all the other tools we have used in the past. Copilot is mostly just an extension of the Bing search results. We still, and always, have to be very critical about anything that we read or hear online. Or anywhere else, really. I personally use Copilot on a daily basis, but always with this in mind.
@kaspernymand sorry, you love this answer??? It tried to console me about something that is impossible, and not only that, it gave me an incorrect answer — the actual search results were what I wanted. Copilot was more than useless; it had negative utility here because it provided an answer that makes no sense. I don’t need a computer to console me, I need a computer to give me the answer I need.
@yassie_j I saw it more as a joke. Everybody knows that you can't be born without your parents being involved. Errors like these, I just see as the funny reality of a tool still being in its early phases of development. Or perhaps it was indeed meant as a joke from Copilot. Nobody knows. You know, it's just like a grandma missing the point of a sentence and answering something completely unrelated. It happens. We just have to laugh about it and move on.

@kaspernymand if everybody knows this, then why doesn’t Copilot? If we’re supposed to rely on these things to provide appropriate answers, then most people would expect an answer rather than a joke, wouldn’t they? If I wanted to look for jokes, I would go to a comedy club, with real humans, instead of a computer.

Do you see the limitation here? I wanted an answer to a question, and the answer was already provided in the search results below this stupid chatbox. Copilot did not make any of those results better. It has negative utility because it wasted my time.

@[email protected] @kaspernymand what "joke" lol, where is the joke? Is Copilot itself, in its entirety, or even the concept of an LLM trying to act like a person altogether, the joke? If so I can get behind that!!~
@HaruEb @yassie_j Answered below. :)
@kaspernymand @[email protected] This makes me violently ill
@HaruEb @yassie_j That's fair. We should only use the tools that we ourselves find useful.
@kaspernymand @[email protected] I just wish to god they wouldn't try to pretend they have human experiences, like feeling "sorry" or understanding even in the slightest what "keeping you company" could possibly mean.

I don't doubt that these types of things can be useful, but they seem uniquely designed to upset me with this stuff, they pretend to be human in the most upsetting ways, at least to me
@HaruEb @kaspernymand @[email protected] No, it's not just you.

You don't want your toaster to empathize with you either.

Especially if you know this thing is just faking it because it is utterly incapable of having any thought or emotion and just regurgitates word samples.
It gives off major uncanny valley vibes. Creepy af.

Completely useless, redundant clutter for bored techbros.

I swear, the more these idiots are trying to make things "more accessible" the less accessible they get. It's like taking "if it ain't broken don't fix it" as a challenge for some moronic reason.
@HaruEb @yassie_j That's very fair. I also find it odd sometimes. I know that Copilot has three modes to try to adapt to your personal preference: "Creative", "Balanced", or "Precise". If you just want the facts, "Precise" mode is probably the way to go for that prompt. Personally, I'm mostly using "Balanced". That works for me most of the time. Serving the facts, but also helping translate them into real-world scenarios.
@[email protected] nah, I don't know where this optimism about what is a "fact" and whether or not these things "serve" such things reliably comes from; it's so alien to me.

Untagging the techbro because honestly this reply isn't for him and I don't really give a shit what his thoughts might be in response to this.
@HaruEb @yassie_j I'm not even working in tech. I only started using these tools a few weeks ago. I'm working in social and humanistic sciences. I've been sceptical about these tools for a while - and still am. But I won't bother you anymore with my chatter. :)
@kaspernymand @[email protected] Alright since you tagged yourself back in I'm going to say it with my full chest:

You and people like you, defending this shit, make me sick.

All of these recent systems are built in a way that means ALL they can do is replicate the structural biases contained in the data they are trained on, and that's not to mention the insane biases in the weights given after the fact. There's a reason those aren't made public: it would be embarrassing how tone-deaf all of these white cis het mostly male able-bodied people (add your own privilege if you even know what privileges you hold) really are.

Your casual ":)" attitude in the face of this is disappointing to say the least.
@HaruEb @yassie_j Part of my formal studies has been on structural biases and privileges, so I'm very aware of this. I also just returned from one of my therapy sessions this afternoon. I do agree that it is likely that AI tools like Copilot will have some biases, because they will prioritise some sources above others. Just like search results on various search engines, like Bing, Google, DuckDuckGo, etc., prioritise some sources above others. (1/2)
@HaruEb @yassie_j Those sources could be written by anybody. In this case, from the example below, it's based on fandom.com, quotes.net, and reddit.com. In other cases, it's other sources. Those are biases, at least to some degree. But we all have biases. Even the most unbiased person will answer your question with some degree of bias. That's what I mean by being critical about all information we're served. Because there will always be biases and different understandings. (2/2)
@kaspernymand @[email protected]

"But we all have biases": this just sounds like you're saying "oh well, might as well make this problem worse since it's gonna exist anyway"

I'm sure you're not saying that, that would be despicable, but it sure seems like it. I'd forgive anyone for thinking that, and I'd counsel you to edit yourself before you make such a mistake; it's good practice.

On the point of sources, it's funny that fandom.com should come up, that place has a major problem with all kinds of bigotry on their various wikis. And you don't get to choose the sources it draws from, do you? You're just being led along by the nose by this shit. You probably don't even check the sources most of the time, do you,,,,
@HaruEb @yassie_j I always check the sources. That's why I mention it. But I don't think it makes sense for us to continue this conversation. So, I'll just wish you a great day. :)
@yassie_j I can tell you that I just prompted Copilot the exact same words - and today it knows the answer. Again, it is just the reality of a tool in its early phases of development. This just shows that these tools learn very quickly - because today, it understands what you might be referring to. As I said to begin with, we just have to be very critical. Just like when we read or hear anything off of various social media platforms or anywhere else. That will never change.

@kaspernymand I think you misunderstood.

I do NOT want a computer to try to be relatable to me.

I do NOT want a computer to be friendly with me.

I do NOT want a computer to waste my time, screen real estate, and bandwidth with small talk.

The computer serves only one purpose: to deliver information to me.

I do NOT want a micro-essay before I get the answer I want.

That entire discussion response was useless because the search results already had the answer, without any fluff or huff.

There is no way this is an improvement over search.

You techcucks are so eager to chase the newest, shiniest thing that does nothing better that you forget the tools we already have serve the purpose just fine.

@yassie_j @kaspernymand people are so cringe sometimes
@yassie_j It's fair that you feel like that. But many others do not. We should use the tools that we find useful. Anything else would be absurd. Personally, I'm fine with Copilot and other tools responding in a more "human" way, because it helps me grasp the essence of the information and translate the data into real-world scenarios. For you, and others like you, this might not be helpful at all. That's just life. We're all different.
@[email protected] @kaspernymand
I agree (and disagree? idk) with both of you because, while it is true generative AI is getting better, I also think it has no place in search results, or anywhere solid information is required or provided. This is due to the nature of generative AI, and no improvement will fix this.

What I'd like to see from our search engines is, perhaps, semantic AI: AI trained to process human intention and fix mistakes, as well as to provide information that already exists, with no chatter, bluffing, or consolation. Basically, semantic AI would be supercharged Google Search.

I dunno, though. I'm not too big in the AI scene, so I wouldn't care anyway. All I need is for sites to keep generative AI away from informational databases, thank you
@mtr @yassie_j I do see your point there. It is indeed problematic if an AI tool generates an answer based on incomplete information. Or if it picks snippets of information from different sources and draws a relation between two unrelated topics. This has happened to me personally. It is indeed a problem, and it could cause trouble if people are not critical about what they're served. I learned, though. It's a lesson we all have to learn, I think. Not just about AI, but about anything.
@yassie_j @kaspernymand I actually managed to tune a locally-hosted model to give extremely concise answers once. It is technically possible to make them decent but the commercial offerings are absolute garbage
@yassie_j @kaspernymand they aren’t even chasing anything new. Just the same thing reiterated out of its functionality. A version of a version…it becomes so lossy.

@yassie_j @kaspernymand
“Listen,” said Ford, who was still engrossed in the sales brochure, “they make a big thing of the ship's cybernetics. A new generation of Sirius Cybernetics Corporation robots and computers, with the new GPP feature.”

“GPP feature?” said Arthur. “What's that?”

“Oh, it says Genuine People Personalities.”

“Oh,” said Arthur, “sounds ghastly.”

A voice behind them said, “It is.”

@yassie_j @kaspernymand that's where i'm at on this. there's a finite amount of space on the screen; any subdivision of that space can be graded against its usefulness to the objective of the interface. the purpose of a web search is to enable me to find information. if we grade the presence of a nondeterministic text generator against this stated objective, it would score very low. and as it turns out, these features are allotted a significant portion of the limited resource of screen space. they are objectively a complete waste
@[email protected] Huh. I guess we could use AI to replace politicians. No one would ever notice.

In fact, it would make everything a lot easier and we could save a shit-ton of money.

@[email protected]

It all began on the day of my actual birth: both of my parents failed to show up

@lunahd An iconic line that lives rent free in my head, that scene is a pure masterstroke of comedy
@yassie_j "i appreciate your empathy" ITS A LANGUAGE MODEL IT HAS NO EMPATHY
@yassie_j yassie why do you use bing search 
@jessienab I need dem rwds
@yassie_j @jessienab that's why I used it for a long time lmao got several free months of gamepass
@lori @yassie_j This sounds so dystopian but I get it 
@jessienab @yassie_j my friends made fun of it for long enough that I made my license plate in Forza Horizon say BINGUSER
@jessienab @yassie_j now that I'm using Search Engines That Are Mostly Still Bing I keep thinking fuck what if I just used regular Bing and blocked all the bullshit with ublock and got rewards points again
@yassie_j @jessienab SAME i got 600 rbux in the past like 5 months i love ms rwds

@viovio @yassie_j y'all are gonna say sike soon right?

... right???

@yassie_j it's been tough, but I've grown stronger.
@yassie_j Something about the auto-generated responses that thank the auto-generated sympathetic message is so funny. "Please thank our robot now" -microsoft
@yassie_j worth pointing out that it wants to write both halves of the conversation it doesn't even need you at this point
@[email protected] if mental health has been automated, does that mean that being nice to each other is seizing the means of production?
@yassie_j
That must have been quite a shock, especially for your mother.
@yassie_j @babe So now you can't even choose whether you want to waste tons of energy on an AI hallucination attached to your search or not. Jesus Christ.
@forteller @yassie_j @babe Well I mean, you could always just use another search engine?
@yassie_j I'm glad you found Bing's Copilot funny. However, I'd be happy to help you find the Phineas and Ferb quote you're looking for. Phineas and Ferb is an animated television show that follows the adventures of two stepbrothers, Phineas Flynn and Ferb Fletcher, as they embark on various extraordinary projects during their summer vacation ¹. Some popular quotes from the show include ² ¹ ³:

1. "Ferb, I know what we’re gonna do today!" - Phineas Flynn
2. "I'm Ferb. I don't talk much." - Ferb
3. "Hey, where's Perry?" - Phineas
4. "I'm not a pessimist, I'm a realist." - Candace
5. "I'm not lazy, I'm just on energy-saving mode." - Ferb
@yassie_j The funniest part, to me, is the suggested auto-replies. Why are they there? You're talking to a chatbot, why does it think you want it to carry both sides of the conversation 😂
@uastronomer this is really what sells the illusion of these chatbots. The responses are nonsensical because I would never say those things, even when you consider the nonsense of the original response
@yassie_j tbh the pre-generated responses are pretty great as well.
@yassie_j i hate bing copilot responses 60% of the time