"Nightshade" is a tool that allows artists to subtly alter their digital artwork's pixels to disrupt #AI training sets. It causes AI model malfunctions, leading to errors like misidentifying objects. Alongside "Glaze," which masks artists' styles from AI scrapers, these tools address artists' rights and intellectual property issues in the AI industry. https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/
This new data poisoning tool lets artists fight back against generative AI

The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models. 

MIT Technology Review
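For anyone curious what "subtly altering pixels" means in practice: the core idea is a perturbation so small it stays below the visual-noise threshold. This is only a toy sketch of that budget idea using bounded random noise — Nightshade's actual method computes an *optimized* adversarial perturbation against a target model, which this does not do. The function name and the 4/255 budget are illustrative assumptions, not anything from the tool.

```python
import numpy as np

def perturb_image(img, epsilon=4 / 255, seed=0):
    """Toy illustration of an imperceptible pixel perturbation.

    Adds noise bounded by +/- epsilon to an image whose values lie in
    [0, 1]. Real poisoning tools optimize this perturbation to mislead
    a model; random noise here only demonstrates the size budget.
    """
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=img.shape)
    # Clip so the result is still a valid image.
    return np.clip(img + noise, 0.0, 1.0)

# A flat gray 8x8 "artwork" stands in for a real painting.
img = np.full((8, 8, 3), 0.5)
poisoned = perturb_image(img)

# No pixel moves by more than epsilon (~1.6% of the value range),
# which is why the change is invisible to a human viewer.
print(np.abs(poisoned - img).max())
```

The point of the tiny budget is that the edit survives normal viewing and sharing while still shifting what a scraped training set teaches a model.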
@arselectronica do they dilute the pixels with a ratio of 10 to 1 with water, or 100 to 1?
@arselectronica now we need it for photos of people. We are already screwed but we can protect our kids from having their likeness used by AI.
@Beeks @arselectronica Should work for photos too. Any images
@arselectronica >inb4 they find out about Gaussian blur...