If I eventually move my (simple, static) website to a DigitalOcean droplet, do I need to do anything special in the awful modern internet to keep AI scrapers from immediately burning through my bandwidth? I want to host game downloads; that's the big gain of moving off GitHub Pages
@farawaythyer for static content you mostly just need aggressive caching. most webhosts will do that for you automatically or give you an easy way to configure it, and many of them already have comprehensive bot detection, blocking, and rate limiting at the infrastructure level because they need it to run a large-scale network anyway
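as a sketch: if you end up running nginx on the droplet yourself instead of leaning on the host's defaults, aggressive caching plus basic rate limiting might look something like this (domain, paths, and the rate/cache numbers are all placeholders to tune, not recommendations):

```nginx
# goes in the http{} context: track clients by IP, allow ~10 requests/sec each
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 80;
    server_name example.com;   # placeholder domain

    root /var/www/site;

    # long-lived caching for download artifacts that never change in place
    location /downloads/ {
        add_header Cache-Control "public, max-age=31536000, immutable";
    }

    # blunt aggressive scrapers: short bursts allowed, sustained hammering rejected
    limit_req zone=perip burst=20 nodelay;
}
```

the Cache-Control header mostly helps when there's a CDN or caching proxy in front of you; on a bare droplet it's the rate limiting that actually protects your bandwidth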
if you want to be more explicit, you can add your own allow/disallow rules to a robots.txt, but that's basically a formality these days and the truly malicious bots won't respect it anyway
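for what it's worth, a robots.txt along those lines might look like this (GPTBot and CCBot are examples of well-known AI crawler user agents; only well-behaved bots honor any of it):

```
# robots.txt -- polite crawlers only; malicious bots ignore this entirely
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# everyone else: crawl the site, but stay out of the downloads
User-agent: *
Disallow: /downloads/
```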
