Meet Nightshade and Glaze: free tools created at the University of Chicago to help creators protect their work from unwanted AI training and AI mimicry.

#nightshade #glaze #ai #aitraining

Glaze is a tool that disrupts art-style mimicry. It exploits how AI models perceive images, computing minimal changes that are invisible to humans but look drastically different to a model. Best of all, these changes cannot be easily removed from the artwork with ordinary image manipulation.
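The principle can be sketched with a toy model. This is not Glaze's actual algorithm; the "style detector" weights and pixel values below are made up purely to illustrate how a perturbation far smaller than what a human would notice can flip what a model sees:

```python
# Toy illustration of the adversarial-perturbation idea behind tools
# like Glaze (NOT Glaze's real method): tiny pixel changes, bounded by
# an invisible budget, flip a model's decision.

# A trivial linear "style detector": score > 0 means "this is style A".
# Weights and pixels are invented for the demo.
weights = [3.0, -2.5, 2.0, -1.5, 1.0, -0.5]
pixels  = [0.46, 0.55, 0.45, 0.60, 0.40, 0.70]  # normalized 0..1 values

def score(px):
    return sum(w * p for w, p in zip(weights, px))

# Per-pixel budget of 2/255 -- well below what a viewer can perceive.
eps = 2 / 255

# FGSM-style step: nudge each pixel against the sign of its weight.
perturbed = [p - eps * (1 if w > 0 else -1)
             for w, p in zip(weights, pixels)]

assert score(pixels) > 0      # original image: attributed to style A
assert score(perturbed) < 0   # perturbed image: attribution flips
assert all(abs(a - b) <= eps + 1e-9 for a, b in zip(pixels, perturbed))
```

Real tools solve a much harder optimization over deep feature extractors, but the core trade is the same: a perceptually negligible change that is maximally disruptive to the model.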

Credit: @zemotion on X

Nightshade is a similar tool that "poisons" the AI, so models trained on shaded images produce distorted results when they try to recreate the work. Its goal is not to break models, but to raise the cost of training on unlicensed data, making proper licensing of images a more viable alternative.

Using tools like Nightshade and Glaze is simple enough, and it is a clever way to push back against unethical image scraping. However, you often need another line of defense. What can you do?

Ways to protect yourself

✅ Add copyright notice
✅ Make sure copyright-related metadata is included when content is published/shared and not stripped away
✅ Add license and terms
✅ Don't upload high-quality originals directly to platforms
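To show the metadata point concretely, here is a minimal sketch of embedding a copyright notice directly into a PNG file as a tEXt chunk, using only the Python standard library. The helpers `minimal_png` and `embed_copyright` are invented for this illustration (they are not Macula's or any library's API); in practice you would use a proper imaging tool such as exiftool:

```python
import struct
import zlib

def chunk(ctype: bytes, data: bytes) -> bytes:
    """Build one PNG chunk: 4-byte length, type, data, CRC over type+data."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def minimal_png() -> bytes:
    """A valid 1x1 grayscale PNG, built from scratch for the demo."""
    ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)
    idat = zlib.compress(b"\x00\x80")  # filter byte + one pixel
    return (b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
            + chunk(b"IDAT", idat) + chunk(b"IEND", b""))

def embed_copyright(png: bytes, notice: str) -> bytes:
    """Insert a tEXt chunk with the 'Copyright' keyword right after IHDR."""
    ihdr_len = struct.unpack(">I", png[8:12])[0]
    cut = 8 + 4 + 4 + ihdr_len + 4  # signature + full IHDR chunk
    text = chunk(b"tEXt", b"Copyright\x00" + notice.encode("latin-1"))
    return png[:cut] + text + png[cut:]

tagged = embed_copyright(minimal_png(),
                         "(c) 2024 Example Artist. All rights reserved.")
assert tagged.startswith(b"\x89PNG\r\n\x1a\n")
assert b"Copyright\x00(c) 2024" in tagged
```

The catch, as the checklist notes, is that many platforms strip such metadata on upload, which is why it needs to be verified (or re-applied) wherever the image is actually shared.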

Automate all of this and more with Macula

©️ Macula takes care of the relevant metadata, including AI-related statements
👁️ Share images anywhere without giving away the originals
📈 Gain exposure and grow on your terms

Learn more and sign up today: https://macula.link/

#ai #copyright #macula #onyourterms