In one sense, I like the idea of human.json creating a "web of humanity" ( https://codeberg.org/robida/human.json )

But the reason I won't be joining in is:

- AI scrapers are ruthless and well known for ignoring robots.txt restrictions.

- The highest-value web content is human-written content, which AI companies need in order to avoid model collapse (the degradation that comes from training LLMs on LLM-generated output).

- So a curated network of verified human content would be gold dust for AI companies. I bet they're rubbing their hands with glee.

human.json: A lightweight protocol for humans to assert authorship of their website content and vouch for the humanity of others.

@peter Everything you say is already true, and will continue to be true until the #AI training bubble bursts. With or without human.json, or Libravatar, or LibreTree, or rel=me (already used by @Mastodon for link verification), or any other tech that might make the Social Web (i.e. the human-generated web) more navigable for humans.
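For anyone unfamiliar with the rel=me verification mentioned above: it works by putting a reciprocal link on your own site, which Mastodon then checks against the links listed in your profile. A minimal sketch (the URLs here are placeholders, not real accounts):

```html
<!-- On your website: link back to your Mastodon profile.
     rel="me" asserts that both pages belong to the same person. -->
<a rel="me" href="https://mastodon.example/@yourhandle">Mastodon</a>

<!-- A <link> element in <head> works too, if you don't want a visible anchor: -->
<link rel="me" href="https://mastodon.example/@yourhandle">
```

When Mastodon fetches the page listed in your profile metadata and finds a rel=me link pointing back at your profile, it marks that link as verified.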

Refusing to use them 'because AI' is 'destroy the village to save it' logic.

@markhughes