Ugh, I might need to put part of my website behind something like Anubis after all 
@dressupgeekout i think we all will, eventually. Is it AI scraper bots? Far from a permanent solution, but so far I've gotten by with blocking user agents and IPs.
@gordoooo_z Yes, it's ClaudeBot et al. super-aggressively scraping my new cgit instance. The host runs NetBSD, so I've also been looking into blacklistd(8)
@dressupgeekout According to at least one source, ClaudeBot respects robots.txt. I'm not prepared to take one random source's word for it, but it's simple enough to implement and find out. Assuming you haven't already, that is.
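(If it helps, the minimal test would be something like this in `/robots.txt` at the site root; whether to disallow everything or just the cgit path is your call, the `/cgit/` prefix below is just a guess at the layout.)

```
# robots.txt: ask ClaudeBot to stay out of the cgit instance
User-agent: ClaudeBot
Disallow: /cgit/

# Optionally block all crawlers from it instead:
# User-agent: *
# Disallow: /cgit/
```

Then watch the access logs: if ClaudeBot keeps hitting paths under the disallowed prefix after fetching robots.txt, that source was wrong and it's on to Anubis or blacklistd.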