“slopsquatting, a new term for a surprisingly effective type of software supply chain attack that emerges when LLMs “hallucinate” package names that don’t actually exist. If you’ve ever seen an AI recommend a package and thought, “Wait, is that real?”—you’ve already encountered the foundation of the problem.

And now attackers are catching on.”

The Rise of Slopsquatting: How #AI Hallucinations Are Fueling... https://socket.dev/blog/slopsquatting-how-ai-hallucinations-are-fueling-a-new-class-of-supply-chain-attacks #npm #dev #infosec

Edit: more info: https://www.bleepingcomputer.com/news/security/ai-hallucinated-code-dependencies-become-new-supply-chain-risk/

Slopsquatting is a new supply chain threat where AI-assisted code generators recommend hallucinated packages that attackers register and weaponize. (Socket)

@skry @ultranurd Here's the fantastic tool which is going to write all your code.
It's a black box, so you can't tell why it did what it did.
Of course it will be secure, why do you ask?
@skry I feel like this could be mitigated with an MCP agent that runs security scans/heuristics against any new dependencies
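For illustration only, a minimal sketch of that kind of pre-install check might look like the following. It assumes Node 18+ (global fetch) and queries the public npm registry; the 30-day age threshold is an arbitrary example, not part of any existing tool.

```typescript
// Sketch: verify that an AI-suggested dependency actually exists on the npm
// registry, and flag very new packages, before adding it to a project.
async function checkPackage(name: string): Promise<void> {
  const res = await fetch(`https://registry.npmjs.org/${encodeURIComponent(name)}`);
  if (res.status === 404) {
    console.warn(`"${name}" is not on npm -- possibly a hallucinated package name.`);
    return;
  }
  const meta = (await res.json()) as { time?: { created?: string } };
  const created = meta.time?.created ? new Date(meta.time.created) : undefined;
  if (!created) {
    console.warn(`"${name}" exists but has no creation timestamp; review manually.`);
    return;
  }
  const ageDays = (Date.now() - created.getTime()) / 86_400_000;
  if (ageDays < 30) {
    // Arbitrary threshold: recently registered names deserve extra scrutiny,
    // since slopsquatters register hallucinated names and wait for installs.
    console.warn(`"${name}" was published only ${Math.round(ageDays)} days ago -- review before installing.`);
  } else {
    console.log(`"${name}" exists and was first published ${Math.round(ageDays)} days ago.`);
  }
}

checkPackage("left-pad"); // long-established package; prints the "exists" branch
```

An MCP tool wrapping a check like this could surface the warning to the agent before `npm install` ever runs.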
@cllns @skry how about by just not letting a random number generator attached to github write your code?
@skry are attackers catching on? Leaving aside the fact that, yes, this is a vulnerability of the supply chain for people who trust LLM-coding tools, the article mentions no instances of this being done in the wild. Have I missed news around that?
Or maybe it's just that this article might have been partly written by an LLM, but it's hard to tell, and that sentence is there to stoke fear and get people to read it, including the ad…
@skry what a (stupid) time to be alive
@skry What total idiots would ever even consider using LLM-generated slop code?