Based on this graph, and this graph alone, guess at what time I completely blocked OpenAI crawlers

https://jlai.lu/post/33101881

It’s best to use either Cloudflare (best IMO) or Anubis.

  • If you don’t want any AI bots, you can set up Anubis (open source; requires JavaScript to be enabled by the end user): https://github.com/TecharoHQ/anubis

  • Cloudflare automatically sets up a robots.txt file to block “AI crawlers” (but you can configure it to allow “AI search” for better SEO). E.g.: https://blog.cloudflare.com/control-content-use-for-ai-training/#putting-up-a-guardrail-with-cloudflares-managed-robots-txt

  • Cloudflare also has an “AI Labyrinth” option that serves a maze of fake data to AI bots that don’t respect the robots.txt file.
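For anyone who wants the robots.txt route without Cloudflare’s managed file, a minimal hand-written version in the same spirit might look like this (the user-agent strings below are ones the vendors publicly document, but check each vendor’s current docs before relying on them):

```
# Block common AI training crawlers (names as publicly documented;
# verify against each vendor's current documentation)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else is still welcome
User-agent: *
Allow: /
```

Of course, this only works against crawlers that choose to honor robots.txt, which is exactly what the replies below are arguing about.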
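To make the Anubis bullet above concrete: Anubis works by making the visitor’s browser solve a SHA-256 proof-of-work puzzle before serving the page, which is why it requires JavaScript. A toy sketch of the idea in Python (illustrative only, not Anubis’s actual protocol or code):

```python
# Proof-of-work sketch: cheap for one real browser to solve once,
# expensive for a crawler hitting thousands of pages.
import hashlib
from itertools import count

def solve(challenge: str, difficulty: int = 2) -> int:
    """Find a nonce such that SHA-256(challenge + nonce) starts with
    `difficulty` zero hex digits. The client does this work."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int = 2) -> bool:
    """The server checks the answer with a single cheap hash."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

nonce = solve("example-challenge")
print(verify("example-challenge", nonce))  # True
```

The asymmetry is the point: verification is one hash, solving takes many, so the cost lands on whoever is making lots of requests.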

    Pretty sure I’ve repeatedly heard about the crawlers completely ignoring robots.txt, so does Cloudflare really do that much?

    Like a lock on a door, it stops the vast majority but can’t do shit about the actual professional bad guys.

    Cloudflare definitely can and does stop the vast majority of actual professional bad guys.

    Yes, Cloudflare blocks agents completely if they ignore its restrictions. The key is scale: Cloudflare has a bird’s-eye view of traffic patterns across millions of sites and can do statistical analysis to determine who is a bot.
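A toy illustration of that kind of cross-site statistical signal (every name and threshold here is hypothetical, not Cloudflare’s actual heuristics): a client that hammers many sites in one time window stands out sharply against ordinary visitors.

```python
# Hypothetical sketch: flag clients whose request volume across sites
# far exceeds the average for the window. Real systems use far richer
# signals (TLS fingerprints, timing, IP reputation, etc.).
from collections import Counter

def flag_likely_bots(requests, rate_threshold=3.0):
    """requests: list of (client_ip, site) pairs seen in one window.
    Returns IPs whose request count exceeds rate_threshold x the mean."""
    counts = Counter(ip for ip, _ in requests)
    mean = sum(counts.values()) / len(counts)
    return {ip for ip, n in counts.items() if n > rate_threshold * mean}

# One crawler hitting 200 pages vs. 20 normal one-request visitors:
window = [("10.0.0.9", f"site{i}") for i in range(200)] + \
         [(f"192.0.2.{i}", "blog") for i in range(20)]
print(flag_likely_bots(window))  # {'10.0.0.9'}
```

No single site could compute this, because each one only sees its own slice of the crawler’s traffic; that’s the “birds-eye view” advantage being described.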

    I hate the necessity but it works