Don't s**t where you eat.

The one adage they forgot to "teach" AI.

https://www.wired.com/story/fast-forward-chatbot-hallucinations-are-poisoning-web-search/
Chatbot Hallucinations Are Poisoning Web Search

Untruths spouted by chatbots ended up on the web—and Microsoft's Bing search engine served them up as facts. Generative AI could make search harder to trust.

WIRED
@axbom
I always thought it was spam that was gonna poison the Web so much that even in Star Trek they had to hand over reports in person, even if they were on a PADD.
But it seems it's due to AI effing everything up.
@axbom The snake is eating its tail.
@alexanderdyas Exactly, it’s the Ouroboros.
@axbom Nice, didn’t know there was a word for that.