grebedoc served its biggest wave of garbage requests yet yesterday, peaking at 150 req/sec
git-pages has a sophisticated multilayer cache system that performs badly in exactly one case: when someone sends a lot of requests to domains that don't even have valid sites deployed
because i figured nobody would do that. certainly not regularly, and certainly not at incredibly high speed
well. fucking scrapers
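the usual fix for this hole is a negative cache: remember "this domain has no site" for a short ttl, so the hot path for garbage requests becomes a map lookup instead of a full deploy lookup every time. here's a minimal sketch in Go, with every name made up (this is not the actual git-pages code):

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// negativeCache remembers domains that turned out to have no site
// deployed, so repeat requests for them skip the expensive lookup.
// everything here is hypothetical; none of it is git-pages internals.
type negativeCache struct {
	mu      sync.Mutex
	entries map[string]time.Time // domain -> entry expiry
	ttl     time.Duration
}

func newNegativeCache(ttl time.Duration) *negativeCache {
	return &negativeCache{entries: make(map[string]time.Time), ttl: ttl}
}

// isKnownInvalid reports whether the domain was recently confirmed
// to have no valid site, without redoing the lookup.
func (c *negativeCache) isKnownInvalid(domain string) bool {
	c.mu.Lock()
	defer c.mu.Unlock()
	expiry, ok := c.entries[domain]
	if !ok {
		return false
	}
	if time.Now().After(expiry) {
		delete(c.entries, domain) // expired; re-check on next request
		return false
	}
	return true
}

// markInvalid records that the domain currently has no site deployed.
func (c *negativeCache) markInvalid(domain string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.entries[domain] = time.Now().Add(c.ttl)
}

func main() {
	nc := newNegativeCache(5 * time.Minute)
	domain := "no-site-here.example"
	if !nc.isKnownInvalid(domain) {
		// the expensive site lookup would happen here and find nothing
		nc.markInvalid(domain)
	}
	fmt.Println(nc.isKnownInvalid(domain)) // true: the miss is now cached
}
```

the short ttl matters: a freshly deployed site becomes visible within minutes instead of getting stuck behind a stale "no site" entry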