Nightshade was just released. Poison your art before you post it so that AI scrapers choke on it https://nightshade.cs.uchicago.edu/

(Further edit to stress that I did not make this product and have nothing to do with it, so please don't assume I did)

For the avoidance of doubt, I did not make this product or website
@psychicteeth Not an artist, but please please please let it be widely adopted and work as intended.
@psychicteeth this sounds really awesome!
@Yotenotes The idea of using AI to defeat AI has been around for a while, along with other techniques like adversarial masks to dupe surveillance AI and an earlier iteration of this art poisoner called Glaze; I guess it goes back to steganographic watermarks and hiding data in your images. Very cool ideas floating around
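For the curious, the steganographic-watermark idea mentioned above can be as simple as hiding payload bits in each pixel's least significant bit. A minimal illustrative sketch (not any particular tool's method, just the textbook LSB trick):

```python
import numpy as np

# Hide one payload bit per pixel of an 8-bit grayscale image by
# overwriting each pixel's least significant bit (LSB).
def embed(pixels: np.ndarray, bits: np.ndarray) -> np.ndarray:
    return (pixels & 0xFE) | bits      # clear the low bit, write the payload

def extract(pixels: np.ndarray) -> np.ndarray:
    return pixels & 1                  # read the low bit back out

img = np.array([120, 33, 254, 7], dtype=np.uint8)   # tiny "image"
msg = np.array([1, 0, 1, 1], dtype=np.uint8)        # payload bits
stego = embed(img, msg)

print(extract(stego))                                # [1 0 1 1]
print(np.max(np.abs(stego.astype(int) - img.astype(int))))  # 1
```

Each pixel changes by at most 1 out of 255, so the watermark is invisible to the eye but trivially recoverable (and also trivially destroyed by recompression, which is why real watermarking schemes are far more elaborate).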
@psychicteeth 2.6 GB download, the server isn't up to it, and no torrent link?
I'm hesitant to boost this because you can't download it right now anyway.
@kolya fair! I haven't looked into setting it up, at all, just saw the team posting about it and it seemed cool
@psychicteeth well if I ever get it I'll gladly set up a torrent, but A) that's going to take ~15h and B) the team should do it and link it on their page, otherwise it's of limited use
@psychicteeth oh well, I sent them a mail, we'll see how it goes
@kolya I agree with you, for something of that size a torrent would make perfect sense.

@psychicteeth Nothing for Linux? I guess I can download it and run it in a Windows 10 VM... this seems like a workflow that I won't continue after three days, though.

*Edit*: emailed the developers. Maybe they have a Linux version in the back room or something.

@guyjantic Feels to me like something that could be run as a service
"Nightshade transforms images into "poison" samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space." @psychicteeth
@sciencewrighter for those wondering, these are not my words
GitHub - RichardAragon/NightshadeAntidote: An 'antidote' to the recently released AI poison pill project known as Nightshade.

@psychicteeth But what if I _want_ my art to become part of the training data sets for AI? What if I want image generators to become so good that their results are indistinguishable from human made art, just to see if it can be done?
@psychicteeth
This is very interesting. Part of my job as a CyberSecurity Architect is to consider how to protect data and systems from interference, and poisoning of AI models is part of that.
Model poisoning is usually seen in the CyberSecurity world as a negative act, but I have to say I do like this concept of increasing the cost of training models, though I would prefer that cost to take the form of a payment to the creator.
@StevieP the security of things that destroy culture is not a concern of mine
@psychicteeth
The security of things that help people to protect, support and care for other people is a concern of mine