git-pages has a sophisticated multilayer cache system which fails to perform well in exactly one case: if someone sends a lot of requests to domains that don't even have valid sites deployed
because i figured that nobody would do this. certainly that nobody would do it regularly and at incredibly high speed
well. fucking scrapers
@whitequark nginx has the amazing limit_req_module that can easily throttle IPs that do nasty shit, like sending a lot of 404 requests. You can just tell it to serve 1 bps to connections that fall in a given zone.
It’ll cost you keeping connections open, but it’s otherwise a cheap way to solve this without adding another layer of caching.
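A minimal sketch of what that could look like, assuming unknown hosts fall through to a catch-all `default_server` block (the zone name, rate, and burst values here are illustrative, not tuned):

```nginx
# hypothetical sketch: rate-limit requests that hit the catch-all server
# block, i.e. Hosts with no deployed site. one request per second per IP.
limit_req_zone $binary_remote_addr zone=unknown_hosts:10m rate=1r/s;

server {
    listen 80 default_server;
    server_name _;

    limit_req zone=unknown_hosts burst=5 nodelay;
    limit_req_status 429;

    # starve whatever still gets through; limit_rate is in bytes/sec
    limit_rate 1;

    return 404;
}
```

Note that limit_req throttles at request time per key (here, per client IP), regardless of response status; restricting it specifically to requests that *would* 404 on a real vhost would take more plumbing. Scoping it to the catch-all server block sidesteps that for the no-deployed-site case.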
But given you have Caddy for TLS provisioning, it’s not immediately obvious how to front it with nginx