git-pages has a sophisticated multilayer cache system which performs poorly in exactly one case: when someone sends a lot of requests to domains that don't even have valid sites deployed, since there's nothing cached to answer those requests from
because i figured nobody would do this, and certainly that nobody would do it regularly and at incredibly high speed
well. fucking scrapers
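
(for illustration only: a minimal sketch of that failure mode and one possible mitigation, negative caching of "no site here" results. all names, types, and TTLs below are made up for the example and are not the actual git-pages code)

```go
package pagescache

import (
	"sync"
	"time"
)

// Site is a stand-in for whatever a hostname resolves to.
type Site struct{ Repo string }

type cacheEntry struct {
	site    *Site // nil means "no site deployed here" (a negative entry)
	expires time.Time
}

// siteCache caches hostname -> site lookups. if only successful lookups
// were stored, every request for a hostname with no deployed site would
// fall through to the slow path, so a scraper hammering bogus domains
// would bypass the cache entirely.
type siteCache struct {
	mu      sync.Mutex
	entries map[string]cacheEntry
}

func newSiteCache() *siteCache {
	return &siteCache{entries: make(map[string]cacheEntry)}
}

// Lookup returns the cached result when fresh, otherwise calls resolve
// and caches the answer -- including "no such site" -- so repeated
// requests for invalid domains stay cheap.
func (c *siteCache) Lookup(host string, resolve func(string) *Site) *Site {
	c.mu.Lock()
	e, ok := c.entries[host]
	c.mu.Unlock()
	if ok && time.Now().Before(e.expires) {
		return e.site // cached hit *or* cached miss
	}

	site := resolve(host) // slow path: storage / git lookup

	// cache misses too, with a shorter TTL so a freshly deployed site
	// still shows up reasonably quickly
	ttl := 10 * time.Minute
	if site == nil {
		ttl = 30 * time.Second
	}
	c.mu.Lock()
	c.entries[host] = cacheEntry{site: site, expires: time.Now().Add(ttl)}
	c.mu.Unlock()
	return site
}
```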
@cinebox oh yeah that's basically how i started grebedoc
one thing git-pages intentionally omits is any sort of "run user-provided code in a container" feature, because i believe most of the solutions here cannot be left unattended if you expect not to be compromised by malware at some point. maybe firecracker vms would work, but that still has a lot of issues, so i just let people use forgejo actions or something if they need processing