Ugh, I might need to put part of my website behind something like Anubis after all
@dressupgeekout
If you haven't tried it yet, there are regularly updated blocklists for many webservers available here:
https://github.com/ai-robots-txt/ai.robots.txt
I have this on my webserver, and I can see from the logs that it's serving a few thousand 403s a day to crawlers and bots, so it's helping at least a bit.
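If you want a rough way to gauge the same thing from your own logs, you can just count the 403s in the access log. A minimal sketch, assuming the default "combined" log format where the status code is the 9th field (the log path and sample lines here are made up for illustration):

```shell
# Hypothetical sample access-log lines in the default "combined" format
cat > /tmp/access.log <<'EOF'
203.0.113.7 - - [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 403 153 "-" "GPTBot/1.0"
203.0.113.8 - - [01/Jan/2025:00:00:02 +0000] "GET /about HTTP/1.1" 200 512 "-" "Mozilla/5.0"
203.0.113.7 - - [01/Jan/2025:00:00:03 +0000] "GET /feed HTTP/1.1" 403 153 "-" "CCBot/2.0"
EOF

# Count responses with status 403; in the combined format the status is field 9
awk '$9 == 403' /tmp/access.log | wc -l
```

On a real server you'd point this at your actual log (e.g. your nginx or Apache access log) instead of the sample file, and could pipe through `sort | uniq -c` on the user-agent field to see which bots are being blocked most.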