once in a lifetime adrenalin rush

https://lemmy.world/post/15763037


we love google (and LLMs)

If you legitimately got this search result - please fucking reach out to your local suicide hotline and make them aware. Google needs to be absolutely sued into the fucking ground for mistakes like these, which are

  • Google trying to make a teensy bit more money, and

  • absolutely going to push at least a few people over the line into committing suicide.

  • We must hold companies responsible for the bullshit their AI produces.

    This seems not to be real (yet), though.
    Is this not real? I’ve done some due-diligence Googling and it’s been inconclusive - I’d really like to know, as there are starry-eyed salespeople who keep pushing hard for integrating customer-facing AI, and I’ve been looking for a concrete example of it fucking up that would leave us really liable. This and the “add glue to cheese” incident are both excellent examples whose veracity I haven’t been able to verify.

    This is from the account that originally spread the image: x.com/ai_for_success/status/1793987884032385097

    Alternate Bluesky link with screencaps (must be logged in): bsky.app/profile/…/3ktarh3vgde2b

    AshutoshShrivastava (@ai_for_success) on X

    Apology Post: About 7-8 hours ago, I shared my views on how Google AI overview might be disabled in the next 15 days, citing some humorous results shared by users on X. I still believe this could happen due to the AI's inaccurate responses. Unfortunately, my first post of 🧵

    Just so others don’t need to click: they found out the screenshot was faked and apologized for spreading fake news.