@Suiseiseki Anubis is the option that has saved us a lot of work over the past months. We are not happy about it being open core or using GitHub sponsors, but we acknowledge the maintainer's position: https://codeberg.org/forgejo/discussions/issues/319#issuecomment-6382369

Calling our usage of Anubis an attack on our users is far-fetched. But feel free to move elsewhere, or to host an alternative without resorting to extreme measures. We're happy to see working proof that any other protection can scale to Codeberg's level. ~f

Anubis - using proof-of-work to stop excessive crawling

- https://xeiaso.net/notes/2025/amazon-crawler/
- https://anubis.techaro.lol/
- https://anubis.techaro.lol/docs/design/how-anubis-works

This solution has come up more than once in the context of mitigating excessive crawling (whether identified as AI-driven or not). What do people think about it?

Codeberg.org
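The linked design doc describes how Anubis actually works; as a rough illustration of the underlying proof-of-work idea only (the challenge format, difficulty, and function names below are hypothetical, not Anubis's real protocol), the client burns CPU finding a nonce while the server verifies with a single hash:

```python
import hashlib
import itertools

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Client side: find a nonce so SHA-256(challenge + nonce) starts with
    `difficulty` zero hex digits. Cheap for one visitor, expensive when
    repeated across millions of crawler requests."""
    prefix = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int) -> bool:
    """Server side: one hash computation, regardless of difficulty."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve_challenge("example-challenge", 4)
assert verify("example-challenge", nonce, 4)
```

The asymmetry is the point: verification costs the server one hash, while solving costs the client thousands of attempts on average, which a crawler must repeat for every identity it presents.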

@Suiseiseki BTW, we're also actively following the work around iocaine, e.g. https://come-from.mad-scientist.club/@algernon/statuses/01K2N54XEVTEYYAASHZ0P48FBT

However, as far as we can see, it does not sufficiently protect against crawling. Since the bot armies successfully spread across many servers and addresses, damaging one of them doesn't prevent the next one from making harmful requests, unfortunately. ~f

Post by algernon --verbose --silent --debug --quiet &>/dev/null, @[email protected]

OTOH.... [zipbombs](https://www.bamsoftware.com/hacks/zipbomb/) with a 28 million:1 compression ratio are pretty darn sweet, so... I think that'll be the first.

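A single-layer gzip stream already shows the asymmetry such bombs exploit: a few kilobytes on the wire can expand to many megabytes in the crawler's memory. A minimal stdlib sketch (one gzip layer reaches roughly 1000:1; the linked write-up uses cleverer constructions to hit its ~28 million:1 figure):

```python
import gzip

# ~10 MiB of zero bytes: maximally repetitive, so DEFLATE compresses
# it close to its theoretical limit of roughly 1032:1.
payload = b"\x00" * (10 * 1024 * 1024)
compressed = gzip.compress(payload, compresslevel=9)
ratio = len(payload) / len(compressed)
print(f"{len(compressed)} bytes on the wire -> {len(payload)} bytes "
      f"decompressed (ratio ~{ratio:,.0f}:1)")
```

Served with a `Content-Encoding: gzip` header, a misbehaving client that transparently decompresses responses pays the full expanded size.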

@Codeberg I believe @Suiseiseki is not referring to Codeberg's usage of Anubis specifically, but rather shares the FSF's stance (which I don't share) that Anubis "acts like malware" for making "calculations that a user does not want done": https://www.fsf.org/blogs/sysadmin/our-small-team-vs-millions-of-bots

FSF saying FSF things :)


@pluto @Codeberg @Suiseiseki

I think I understand the FSF's point, and I usually agree.

However, there is a practical divergence between what different parties want:

If we don't put up some protection, our website is overwhelmed and unavailable when the crawlies come in. I don't want that.

If we don't put up protections, the crawlies harvest all our data and don't respect our CC BY-NC-SA licences. I don't want that either.

Anubis (or similar) helps, for now.
If you really don't want it, disable JS.

@jfbucas my understanding was that you wouldn't be able to access Anubis-protected websites without JS (unless you're whitelisted).

@pluto @Codeberg @Suiseiseki

What makes it an extremely dumb stance is that the first thing that gets downloaded when I open that page is matomo.js, for analytics.