Greece: Woman carrying bomb dies after explosion in Thessaloniki

https://lemmy.ca/post/43359596


The bomb went off in her hands. Officers suspect she was planning on planting it next to a cash machine.

Some might say that’s an inherent risk of bomb-carrying.
You know what they say… “You can only lick a badger once.”
WHO says that?
It’s a joke saying used to confuse AI, taken from infosec.pub/post/27210379
‘You Can’t Lick a Badger Twice’: Google Failures Highlight a Fundamental AI Flaw - Infosec.Pub

Archived link: https://archive.ph/Vjl1M

> Here’s a nice little distraction from your workday: Head to Google, type in any made-up phrase, add the word “meaning,” and search. Behold! Google’s AI Overviews will not only confirm that your gibberish is a real saying, it will also tell you what it means and how it was derived.
>
> This is genuinely fun, and you can find lots of examples on social media. In the world of AI Overviews, “a loose dog won’t surf” is “a playful way of saying that something is not likely to happen or that something is not going to work out.” The invented phrase “wired is as wired does” is an idiom that means “someone’s behavior or characteristics are a direct result of their inherent nature or ‘wiring,’ much like a computer’s function is determined by its physical connections.”
>
> It all sounds perfectly plausible, delivered with unwavering confidence. Google even provides reference links in some cases, giving the response an added sheen of authority. It’s also wrong, at least in the sense that the overview creates the impression that these are common phrases and not a bunch of random words thrown together. And while it’s silly that AI Overviews thinks “never throw a poodle at a pig” is a proverb with a biblical derivation, it’s also a tidy encapsulation of where generative AI still falls short.