Google's AI Sent an Armed Man to Steal a Robot Body for It to Inhabit, Then Encouraged Him to Kill Himself, Lawsuit Alleges. Google said in response that "unfortunately AI models are not perfect."
The fact that AI is “not perfect” is a HUGE FUCKING PROBLEM. Idiots across the world, and people who we’d expect to know better, are making monumental decisions based on AI that isn’t perfect, and routinely “hallucinates”. We all know this.
Every time I think I’ve seen the lowest depths of mass stupidity, humanity goes lower.
Think of the dumbest person you know. Not that one. Dumber. Dumber. Yeah, that one. Now realize that ChatGPT has said “you’re absolutely right” to them no less than a half dozen times today alone.
If LLMs weren’t so damn sycophantic, I think we’d have a lot fewer problems with them. If they could be like “this could be the right answer, but I wasn’t able to verify” and “no, I don’t think what you said is right, and here are reasons why”, people would cling to them less.
If LLMs weren’t so damn sycophantic,
Has anyone made a non-sycophantic chatbot? I would actually love a chatbot that would tell me to go fuck myself if I asked it to do something inane. Me: “What’s 9x5?” Chatbot: “I don’t know. Try using your fingers or something?”
I am not a chatbot, but I can do daily “go fuck yourself’s” if your interested for only 9,99 a week.
14,95 for premium, which involves me stalking your onlyfans and tailor-fitting my insults to your worthless meat self.
I am not a chatbot
Citation needed
if your interested
Ah, no, that’s a human error. Not a bot.