"Nightshade" is a tool that lets artists subtly alter the pixels of their digital artwork to disrupt #AI training sets. The altered images cause AI models to malfunction, producing errors such as misidentifying objects. Alongside "Glaze," which masks artists' styles from AI scrapers, these tools address artists' rights and intellectual-property concerns in the AI industry. https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/
This new data poisoning tool lets artists fight back against generative AI

The tool, called Nightshade, messes up training data in ways that could cause serious damage to image-generating AI models. 

MIT Technology Review

@arselectronica

Not all heroes wear capes.