grebedoc served its biggest wave of garbage requests yet yesterday (peaking at 150 req/sec)
these waves keep getting bigger, which is somewhat concerning. it's nowhere near the hardware capacity yet, but i'm hitting software bottlenecks that i never thought would be relevant

git-pages has a sophisticated multilayer cache system that fails to perform well in exactly one case: when someone sends a lot of requests to domains that don't even have valid sites deployed

because i figured that nobody would do this. certainly that nobody would do it regularly, and at incredibly high speed
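the fix for this failure mode is usually some form of negative caching: remember, for a short TTL, that a domain has no site deployed, so repeated garbage requests stop falling through every cache layer. a minimal sketch of the idea (class and method names are hypothetical, not git-pages' actual code):

```python
import time

class NegativeCache:
    """Remember domains that resolved to 'no site deployed' for a short TTL,
    so repeated garbage requests don't hit the slower lookup layers."""

    def __init__(self, ttl=60.0):
        self.ttl = ttl
        self.misses = {}  # domain -> expiry timestamp

    def is_known_miss(self, domain, now=None):
        now = time.monotonic() if now is None else now
        expiry = self.misses.get(domain)
        if expiry is None:
            return False
        if now >= expiry:
            # entry expired; forget it and re-check the real store next time
            del self.misses[domain]
            return False
        return True

    def record_miss(self, domain, now=None):
        now = time.monotonic() if now is None else now
        self.misses[domain] = now + self.ttl

# usage: consult the cache before doing the expensive lookup
cache = NegativeCache(ttl=60.0)
cache.record_miss("no-such-site.example", now=0.0)
assert cache.is_known_miss("no-such-site.example", now=1.0)       # still cached
assert not cache.is_known_miss("no-such-site.example", now=120.0) # TTL expired
```

the TTL is the usual trade-off here: too long and a freshly deployed site stays invisible, too short and the scraper wave still gets through.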

well. fucking scrapers

@whitequark now that I actually look at git pages I realize it’s exactly the thing I wanted to make a few months back to replace ReadTheDocs, I’ll have to try it out, thanks
@cinebox oh nice, how much replacing are we talking about? like for your own needs or as a service for others?
@whitequark just for myself and some community projects

@cinebox oh yeah that's basically how i started grebedoc

one thing git-pages intentionally omits is any sort of "run user-provided code in a container" feature, because i believe most of the solutions here cannot be left unattended if you expect to not be compromised by malware at some point. maybe firecracker vms would work, but that still has a lot of issues. so i just let people use forgejo actions or something if they need processing

@whitequark yeah I already have forgejo actions for that. I just needed a solution for deploying the resulting html.
[link: Forgejo Action for uploading a directory to a git-pages site — Codeberg.org]
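a hedged sketch of what a forgejo workflow using such an action might look like. the action path and the input names (`directory`, `token`) are illustrative placeholders, not the real action's interface; check the actual action's README on Codeberg for the correct values:

```yaml
# hypothetical workflow; the action path and inputs below are illustrative only
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: docker
    steps:
      - uses: actions/checkout@v4
      - name: Build the site
        run: make html   # assumed build step producing ./public
      - name: Upload to git-pages
        # placeholder path; substitute the real action from Codeberg
        uses: example/git-pages-upload-action@v1
        with:
          directory: ./public
          token: ${{ secrets.GIT_PAGES_TOKEN }}
```

the appeal of this split is exactly what the thread describes: the actions runner does the arbitrary-code part in its own sandbox, and git-pages only ever receives static html.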