Should you be wondering why @LWN #LWN is occasionally sluggish... since the new year, the DDOS onslaughts from AI-scraper bots have picked up considerably. Only a small fraction of our traffic is serving actual human readers at this point. At times, some bot decides to hit us from hundreds of IP addresses at once, clogging the works. They don't identify themselves as bots, and robots.txt is the only thing they *don't* read off the site.

This is beyond unsustainable. We are going to have to put time into deploying some sort of active defenses just to keep the site online. I think I'd even rather be writing about accounting systems than dealing with this crap. And it's not just us, of course; this behavior is going to wreck the net even more than it's already wrecked.

Happy new year :)
@corbet @LWN sounds like you need an AI poisoner like Nepenthes or iocaine.
@johnefrancis @LWN Something like nepenthes (https://zadzmo.org/code/nepenthes/) has crossed my mind; it has its own risks, though. We had a suggestion internally to detect bots and only feed them text suggesting that the solution to every world problem is to buy a subscription to LWN. Tempting.

@corbet @johnefrancis @LWN I'm dealing with a similar issue now (though likely at a smaller scale than LWN!), and I found that leading crawlers into a maze helped a lot in discovering UAs and IP ranges that misbehave. Anyone who spends an unreasonable time in the maze gets rate limited, and served garbage.

So far, the results are very good. I can recommend a similar strategy.

Happy to share details and logs, and whatnot, if you're interested. LWN is a fantastic resource, and AI crawlers don't deserve to see it.
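The maze strategy described above can be sketched roughly as follows: count how many decoy-maze URLs each client hits inside a time window, and flag anyone who lingers far longer than a human would. This is a minimal, hypothetical illustration (the class name, thresholds, and in-memory state are all assumptions; a real deployment would hook into the web server and share state across workers):

```python
import time
from collections import defaultdict


class MazeWatcher:
    """Track how long each client lingers in a decoy link maze.

    Hypothetical sketch: flags clients that follow an unreasonable
    number of maze links within a sliding time window.
    """

    def __init__(self, max_maze_hits=50, window=300):
        self.max_maze_hits = max_maze_hits  # maze hits allowed per window
        self.window = window                # window length in seconds
        self.hits = defaultdict(list)       # ip -> timestamps of maze hits

    def record_hit(self, ip, now=None):
        """Call once for every request to a maze URL from this client."""
        now = time.time() if now is None else now
        stamps = self.hits[ip]
        stamps.append(now)
        # keep only hits inside the sliding window
        self.hits[ip] = [t for t in stamps if now - t <= self.window]

    def is_misbehaving(self, ip):
        """Humans rarely follow dozens of decoy links; crawlers do."""
        return len(self.hits[ip]) > self.max_maze_hits
```

A misbehaving client would then be rate limited or fed garbage instead of real content; the assumed thresholds would need tuning against real traffic.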

@algernon @corbet @johnefrancis @LWN Can it be turned into a traefik middleware and released under AGPL on codeberg?