<edit>
Achievement unlocked: I did a predictably stupid thing.

I tagged the author of the linked post here, without thinking about how many times he’s already had to explain this same post to others. It was apparently the final straw: he took the post down. Feels bad man.

Original post in the screenshot below.
</edit>

@dcoderlt I don't fully trust the sourcing, but https://www.theguardian.com/technology/2023/sep/01/mushroom-pickers-urged-to-avoid-foraging-books-on-amazon-that-appear-to-be-written-by-ai I have seen a few reports of people getting ill after trusting LLM written mushroom foraging books. Very believable even if not true.
@Argonel
Yeah, I’ve seen plenty of posts about such books. It’s a clusterfuck.
@dcoderlt I also wish I could find better sourcing on LLM costs, but the number I have seen is approximately $100,000 per user, which doesn't seem sustainable even with a 10–100X increase in users.

@dcoderlt I looked up vinyl chloride (placard 1086) in the Emergency Response Guidebook (2020 edition). The entry referred me to Guide 116 - Gases - Flammable (Unstable). An Explosive Polymerization Hazard alert popped up.
Under the EMERGENCY RESPONSE heading, for a Large Fire, Water Spray or Fog was indicated.
If ChatGPT has the ERG in its corpus, it legitimately pulled that recommendation.

This is in no way meant as an endorsement of ChatGPT or LLMs in general.