I keep seeing webmasters talking about how to block AI scrapers (through user agents and IP blocks) and not enough webmasters talking about the far better option of rigging their site to return complete gibberish or transgender werewolf erotica* when AI scrapers are detected.

*depending on which one you think is funnier to poison the AI models with

@foone I’ve been returning a 302 to a 10GB binary file from Hetzner’s speedtest page, but honestly... maybe I should?
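A minimal sketch of that 302 trick as a stdlib-only WSGI app. The user-agent substrings and the exact decoy URL are my assumptions (the post only says "Hetzner's speedtest page"), not domi's actual setup:

```python
# Sketch: redirect suspected AI crawlers to a huge decoy file.
# UA substrings below are known AI-crawler agents, but the list is
# illustrative; the decoy URL is an assumed Hetzner speedtest path.
AI_UA_SUBSTRINGS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")
DECOY = "https://speed.hetzner.de/10GB.bin"  # assumption

def is_ai_scraper(user_agent: str) -> bool:
    """Crude substring match against known AI-crawler user agents."""
    ua = (user_agent or "").lower()
    return any(s.lower() in ua for s in AI_UA_SUBSTRINGS)

def app(environ, start_response):
    if is_ai_scraper(environ.get("HTTP_USER_AGENT", "")):
        # Send the scraper off to download 10GB of nothing useful.
        start_response("302 Found", [("Location", DECOY)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [b"<p>normal page content</p>"]
```

Real crawlers can spoof or rotate user agents, so substring matching only catches the polite ones — which is also why the thread moves on to payload-level tricks.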

all of my pages already contain prompt injection (in multiple places, even)

@domi @foone My approach uses the wonderful "100GB of zeros compressed into 10MB and served with transport compression headers" which usually makes most poorly-written bots fuck off in short order when they OOM...
@becomethewaifu @foone @domi This makes me wonder if Firefox actually has any safeguards against that kind of thing.
@lispi314 it does not. ask me how i know 🙃
@arisunz Disappointing but unsurprising.
@lispi314 i mean tbf neither does chromium
@arisunz I did somewhat expect that too.