This is fun (to me):

1) Pop a JSON file on your website which lists all the websites you know are written by a human and not AI

2) Install the browser addon and see, as you browse, which websites you trust and who vouches for whom.

🔗 https://codeberg.org/robida/human.json
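To give a feel for the idea, here's a rough sketch of reading such a file. The field names (`version`, `author`, `human`) and the placeholder domains are my own illustrative assumptions, not the project's actual schema — check the repo above for the real format.

```python
import json

# Hypothetical example of a human.json file. The real schema is defined
# by the project at codeberg.org/robida/human.json; the fields below are
# illustrative assumptions only.
sample = """
{
  "version": 1,
  "author": "https://my.example",
  "human": [
    "https://alice.example",
    "https://bob.example"
  ]
}
"""

def vouched_sites(raw):
    """Return the list of sites this file vouches for as human-written."""
    data = json.loads(raw)
    return data.get("human", [])

print(vouched_sites(sample))
```

A browser addon would presumably fetch this file from each site you visit and build the web of trust from those vouch lists.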

#NoAI #NO2AI #HumanJSON #Internet #WebDev

I just set it up on coxy.co - my blog.

I see the following people have each set it up on their sites, and blogged about it too:

@neatnik
@sethmlarson
@hamatti
@foosel
@gedankenstuecke

😃

shellsharks
@matt 3) AI crawlers look for the file to find fresh, unadulterated sources of human content to scrape, to try to stave off model collapse

@patrick_h_lauke I'm pretty sure they are scraping these sources already. 😆

(I actually set my blog up on Cloudflare a long, long time ago, and I see they have some AI-scraping protection. Wonder if it actually works...)

@matt I'm not sure if this will help, but I've been using the free version of Known Agents on my WordPress sites and it seems to:

https://knownagents.com

@rickscully Interesting - thanks for the link! I'll check it out!