I compiled a short list of anti-AI tools. If you know of others, please add them

[Please see https://codeberg.org/wimvanderbauwhede/low-carbon-computing/src/branch/master/anti-AI-tools.md]

[added ArtShield, Anti-DreamBooth and Mist on 2025-04-01]

Anti-AI tools

Glaze
https://glaze.cs.uchicago.edu
Glaze is a system designed to protect human artists by disrupting style mimicry. At a high level, Glaze works by understanding the AI models that train on human art and, using machine learning algorithms, computing a set of minimal changes to artworks such that they appear unchanged to human eyes but appear to AI models as a dramatically different art style.
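
For readers who want a feel for the mechanism, here is a minimal sketch of the general idea (an adversarial feature-space perturbation), not Glaze's actual algorithm: push the image's features toward a different style while capping how much any pixel may change. The file paths, the VGG16 feature extractor, the step count and the 0.03 budget are all illustrative assumptions.

```python
# Minimal sketch of feature-space "cloaking" (illustrative, NOT Glaze's algorithm).
# Assumes torch and torchvision are installed; paths and parameters are made up.
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
feat = models.vgg16(weights=models.VGG16_Weights.DEFAULT).features[:16].to(device).eval()
for p in feat.parameters():
    p.requires_grad_(False)  # only the perturbation is optimised, not the network

def load(path):
    img = Image.open(path).convert("RGB").resize((512, 512))
    return TF.to_tensor(img).unsqueeze(0).to(device)

art = load("my_artwork.png")       # image to protect (hypothetical path)
target = load("target_style.png")  # image in a very different style (hypothetical path)

with torch.no_grad():
    target_feat = feat(target)

delta = torch.zeros_like(art, requires_grad=True)
opt = torch.optim.Adam([delta], lr=1e-2)
budget = 0.03  # max per-pixel change, keeps the edit close to invisible

for step in range(200):
    opt.zero_grad()
    perturbed = (art + delta).clamp(0, 1)
    # pull the perturbed image's features towards the target style's features
    loss = torch.nn.functional.mse_loss(feat(perturbed), target_feat)
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-budget, budget)  # keep the perturbation imperceptible

protected = (art + delta).clamp(0, 1).detach()  # looks the same to humans
```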

Nightshade
https://nightshade.cs.uchicago.edu/
Nightshade is a tool that turns any image into a data sample that is unsuitable for model training.

ArtShield
https://artshield.io
ArtShield embeds a well-camouflaged watermark into your images that helps prevent AI models from training on your data. This watermark is the same one that models such as Stable Diffusion use to mark the images they generate, so that they avoid training on data they have produced themselves.
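
As a rough illustration of that watermark idea (and only an illustration; ArtShield's own scheme may differ), the open-source invisible-watermark package is what the public Stable Diffusion release scripts use to tag their outputs with the string "StableDiffusionV1". Embedding the same kind of mark yourself looks roughly like this; the file names are made up, and whether any given training pipeline actually filters on the watermark is not something you can verify.

```python
# Sketch only: embed the kind of invisible watermark Stable Diffusion's release
# scripts add to their own outputs. Assumes `pip install invisible-watermark opencv-python`.
import cv2
from imwatermark import WatermarkEncoder

bgr = cv2.imread("my_artwork.png")                  # hypothetical input file
encoder = WatermarkEncoder()
encoder.set_watermark("bytes", "StableDiffusionV1".encode("utf-8"))
marked = encoder.encode(bgr, "dwtDct")              # frequency-domain, near-invisible
cv2.imwrite("my_artwork_marked.png", marked)
```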

Anti-DreamBooth
https://github.com/VinAIResearch/Anti-DreamBooth
The system aims to add subtle noise perturbation to each user's image before publishing in order to disrupt the generation quality of any DreamBooth model trained on these perturbed images.

Mist
https://github.com/psyker-team/mist-v2
Mist is a powerful image preprocessing tool designed for the purpose of protecting the style and content of images from being mimicked by state-of-the-art AI-for-Art applications.

HarmonyCloak
https://mosis.eecs.utk.edu/harmonycloak.html
HarmonyCloak is designed to protect musicians from the unauthorized exploitation of their work by generative AI models. At its core, HarmonyCloak functions by introducing imperceptible, error-minimizing noise into musical compositions.

Kudurru
https://kudurru.ai
Actively block AI scrapers from your website with Spawning's defense network

Nepenthes
https://zadzmo.org/code/nepenthes/
This is a tarpit intended to catch web crawlers. Specifically, it's targeting crawlers that scrape data for LLMs - but really, like the plants it is named after, it'll eat just about anything that finds its way inside.
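
To give a flavour of what a tarpit like this does (a toy sketch, not Nepenthes' code): every URL under a chosen prefix drip-feeds nonsense text and links to yet more tarpit pages, so a crawler that ignores robots.txt wastes time and bandwidth. Flask, the word list and the timings are all assumptions.

```python
# Toy crawler tarpit: endless, slowly streamed garbage pages that only link to
# more garbage. Illustrative sketch, not Nepenthes. Assumes Flask is installed.
import random, time
from flask import Flask, Response

app = Flask(__name__)
WORDS = ["lorem", "ipsum", "quantum", "turnip", "syzygy", "marmalade", "fjord"]

def babble(n):
    return " ".join(random.choice(WORDS) for _ in range(n))

@app.route("/maze/<path:slug>")
def maze(slug):
    def generate():
        yield f"<html><body><h1>{babble(4)}</h1>\n"
        for _ in range(20):
            time.sleep(1)  # drip-feed the response to waste the crawler's time
            link = f"/maze/{random.randrange(10**9)}"
            yield f"<p>{babble(40)} <a href='{link}'>{babble(2)}</a></p>\n"
        yield "</body></html>"
    return Response(generate(), mimetype="text/html")

if __name__ == "__main__":
    app.run(port=8080)
```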

AI Labyrinth
https://blog.cloudflare.com/ai-labyrinth/
Today, we’re excited to announce AI Labyrinth, a new mitigation approach that uses AI-generated content to slow down, confuse, and waste the resources of AI Crawlers and other bots that don’t respect “no crawl” directives.

More tools, suggested by comments on this post:

Anubis
https://xeiaso.net/blog/2025/anubis/
Anubis is a reverse proxy that requires browsers and bots to solve a proof-of-work challenge before they can access your site.

Iocaine
https://iocaine.madhouse-project.org
The goal of iocaine is to generate a stable, infinite maze of garbage.
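
The 'stable' part is the interesting bit: the same URL should always yield the same garbage, so re-fetching a page doesn't expose the maze, while fresh links keep it infinite. Here is a minimal sketch of that idea (not iocaine's implementation; the word list and link scheme are invented), which unlike the tarpit sketch above focuses on determinism rather than slowness.

```python
# Deterministic "infinite maze" page generator: the content of each path is
# derived from a hash of the path itself, so it is stable across visits.
# Illustrative sketch only, not iocaine's code.
import hashlib, random

WORDS = ["gorse", "plinth", "umbra", "tureen", "skerry", "lacuna", "murmur"]

def page_for(path, n_links=5, n_words=200):
    seed = int.from_bytes(hashlib.sha256(path.encode()).digest()[:8], "big")
    rng = random.Random(seed)  # same path -> same page, every time
    text = " ".join(rng.choice(WORDS) for _ in range(n_words))
    links = [f"{path.rstrip('/')}/{rng.randrange(10**6)}" for _ in range(n_links)]
    return text, links

body, outlinks = page_for("/maze/start")  # outlinks lead ever deeper into the maze
```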

#NoToAI #AI

My take on Glaze and similar solutions, by YanK l'Innommé (Notion)
@YanK None of these tools are a solution. They are more a form of protest, and that, I think, has its own value.
@wim_v12e But, as I said in my article, I'm afraid that these tools bring a kind of resignation: a distorted sense of protection, when the fight to free AI from capitalism's stranglehold must be at the heart of the battle to make technology beneficial for all.
@wim_v12e In the list, under the section 'Poisoning AI while using static websites,' there is a link error in the 'Quixotic on GitHub Pages, using Hugo.' The correct link is 'https://algorithmic-sabotage.github.io/asrg/trapping-ai/'
Trapping AI

This is a methodically structured poisoning mechanism designed to feed nonsensical data to persistent bots and aggressive “AI” scrapers that circumvent robots.txt directives.

@rostro thanks for letting me know, I have fixed it.
@wim_v12e There's https://xeiaso.net/blog/2025/anubis/ , but it should be used judiciously imo, since it requires javascript and very modern browser features to be enabled
Block AI scrapers with Anubis

I got tired with all the AI scrapers that were bullying my git server, so I made a tool to stop them for good.

@wim_v12e

iocaine
https://iocaine.madhouse-project.org
The deadliest poison known to AI. Let's make AI poisoning the norm. If we all do it, they won't have anything to crawl.


@wim_v12e

Anubis
https://github.com/TecharoHQ/anubis

> Anubis weighs the soul of your connection using a sha256 proof-of-work challenge in order to protect upstream resources from scraper bots.
> Installing and using this will likely result in your website not being indexed by some search engines. This is considered a feature of Anubis, not a bug.
> This is a bit of a nuclear response, but AI scraper bots scraping so aggressively have forced my hand.
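
For anyone unfamiliar with the mechanism, here is a minimal sketch of a sha256 proof-of-work challenge in general (illustrative only; Anubis's actual protocol, parameters and client-side JavaScript differ): the server hands out a random challenge and a difficulty, the client grinds counters until the hash has enough leading zeros, and the server verifies the result with a single hash.

```python
# Generic sha256 proof-of-work sketch (not Anubis's implementation).
import hashlib, secrets

def make_challenge(difficulty=4):
    return secrets.token_hex(16), difficulty  # random challenge + required leading zeros

def solve(challenge, difficulty):
    prefix = "0" * difficulty
    counter = 0
    while True:  # the expensive part: the client (normally in-browser JS) does this
        digest = hashlib.sha256(f"{challenge}:{counter}".encode()).hexdigest()
        if digest.startswith(prefix):
            return counter
        counter += 1

def verify(challenge, difficulty, counter):
    digest = hashlib.sha256(f"{challenge}:{counter}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)  # cheap for the server

challenge, difficulty = make_challenge()
answer = solve(challenge, difficulty)
assert verify(challenge, difficulty, answer)
```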


@wim_v12e

A curated list titled 'Sabot in the Age of AI' by @asrg: Offensive Methods and Strategic Approaches for Facilitating (Algorithmic) Sabotage, Framework Disruption, and Intentional Data Poisoning.

https://tldr.nettime.org/@asrg/113867412641585520

https://algorithmic-sabotage.github.io/asrg/posts/sabot-in-the-age-of-ai/

'Trapping AI': A methodically structured poisoning mechanism designed to feed nonsensical #data to persistent #bots and aggressive “AI” scrapers that circumvent robots.txt directives.

https://algorithmic-sabotage.github.io/asrg/trapping-ai/

#AI #Sabotage #AbolishAI

ASRG (@asrg@tldr.nettime.org): Sabot in the Age of AI. Here is a curated list of strategies, offensive methods, and tactics for (algorithmic) sabotage, disruption, and deliberate poisoning.

iocaine
https://git.madhouse-project.org/algernon/iocaine
The deadliest AI poison: iocaine generates garbage rather than slowing crawlers.

Nepenthes (@aaron@zadzmo.org)
https://zadzmo.org/code/nepenthes/
A tarpit designed to catch web crawlers, especially those scraping for LLMs. It devours anything that gets too close.

Quixotic (@marcusb@mastodon.sdf.org)
https://marcusb.org/hacks/quixotic.html
Feeds fake content to bots and robots.txt-ignoring LLM scrapers.

Poison the WeLLMs (@mike@mikecoats.social)
https://codeberg.org/MikeCoats/poison-the-wellms
A reverse proxy that serves dissociated-press style reimaginings of your upstream pages, poisoning any LLMs that scrape your content.

Django-llm-poison (@Fingel@indieweb.social)
https://github.com/Fingel/django-llm-poison
A Django app that poisons content when served to AI bots.

KonterfAI
https://codeberg.org/konterfai/konterfai
A model poisoner that generates nonsense content to degenerate LLMs.

@wim_v12e Sadly I believe Glaze & Nightshade are both still dubiously effective. https://huggingface.co/blog/parsee-mizuhashi/glaze-and-anti-ai-methods

The anti-crawler stuff I can personally attest to: Nepenthes, Anubis, AI Labyrinth.


@wim_v12e Maybe maintain a list of those tools at Codeberg etc. so it doesn't get lost in the Fediverse?

@wim_v12e ... it may not be a tool on its own, but I created a blocklist project several months ago for use in Sinkholes / DNStraps... https://mastodon.social/@lumiworx/112690566913362512

The project page is here: https://codeberg.org/lumiworx/HPT-AI-Blocklist

HPT-AI-Blocklist

A collection of blocklists for sites that generate or contain AI-generated content, formatted for use with HOSTS/PiHole/Technitium (HPT).


@wim_v12e

@silverface was asking earlier about anti-AI tools for video. Any suggestions?

@Sh4d0w_H34rt @silverface I don't know of any, I read somewhere the makers of Glaze are working on one.
@wim_v12e Many many thanks.
@TomasHradcky You are welcome. I will update it soon, and maybe put it online somewhere.

@wim_v12e

If you are using Gmail or Office365, you are training AI.