AI companies are violating a basic social contract of the web and ignoring robots.txt
Put a path in robots.txt that nothing is supposed to request and that is hard for a non-robot to stumble into. Log and ban every IP that hits it.
Imperfect, but I can't think of a better solution.
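The trap described above can be sketched in a few lines. This is a minimal illustration, not a production setup: the decoy path "/secret-trap/" and the in-memory ban set are hypothetical stand-ins for a real disallowed URL and a firewall/blocklist.

```python
# Honeypot sketch: a path is Disallowed in robots.txt and linked
# nowhere a human would find it, so only crawlers that ignore
# robots.txt will ever request it. Any IP that does gets banned.

ROBOTS_TXT = """\
User-agent: *
Disallow: /secret-trap/
"""

banned_ips: set[str] = set()  # stand-in for a real blocklist/firewall


def handle_request(ip: str, path: str) -> int:
    """Return an HTTP status code for a request from `ip` to `path`."""
    if ip in banned_ips:
        return 403  # previously caught by the trap
    if path.startswith("/secret-trap/"):
        banned_ips.add(ip)  # crawler ignored robots.txt: log and ban
        return 403
    return 200
```

A well-behaved crawler fetches robots.txt, sees the Disallow rule, and never touches the trap; a scraper that ignores robots.txt bans itself on first contact.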