"What a world we live in: AI hallucinated packages are validated and rubber-stamped by another AI that is too eager to be helpful."
(Feross Aboukhadijeh)
Thomas Claburn in The Register on the horrors of slopsquatting: genAI coding tools hallucinate package names, bad actors then publish malicious packages under those names, and yet other genAI systems go on to recommend them. The mind boggles.
https://www.theregister.com/2025/04/12/ai_code_suggestions_sabotage_supply_chain/