once in a lifetime adrenalin rush

https://lemmy.world/post/15763037

we love google (and LLMs)

If you legitimately got this search result - please fucking reach out to your local suicide hotline and make them aware. Google needs to be absolutely sued into the fucking ground for mistakes like this, which:

  • come from Google trying to make a teensy bit more money

  • absolutely will push at least a few people over the line into committing suicide.

  • We must hold companies responsible for the bullshit their AI produces.

    This seems to not be real (yet) though.
    Is this not real? I've done some due-diligence Googling and it's been inconclusive - I'd really like to know, as there are starry-eyed salespeople who keep pushing hard for integrating customer-facing AI, and I've been looking for a concrete example of it fucking up that'd leave us really liable. This and the "add glue to cheese" one are both excellent examples whose veracity I haven't been able to verify.

    I gotchu on the cheese

    My comment - it relied on another user's modified prompt to get around Google's incredibly hasty fix

    Google's AI search feature suggested using glue to keep cheese sticking to a pizza - sh.itjust.works