😐

hoping AI companies remove your work from their datasets (after already including it without consent) is not going to be enough

https://decrypt.co/150575/greg-rutkowski-removed-from-stable-diffusion-but-brought-back-by-ai-artists

Greg Rutkowski Was Removed From Stable Diffusion, But AI Artists Brought Him Back

More popular than Picasso and Leonardo Da Vinci among AI artists, Greg Rutkowski opted out of the Stable Diffusion training set. The community just created a LoRA to mimic his style.

Decrypt
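For anyone unfamiliar with the term in the article: a LoRA (low-rank adaptation) is a tiny trainable correction bolted onto a frozen model, which is why a community can cheaply train one to mimic a single artist's style. A minimal, illustrative plain-Python sketch of the idea (not Stable Diffusion's actual code; the matrices and sizes here are made up):

```python
# Conceptual LoRA sketch: the full weight matrix W stays frozen;
# training only touches two small matrices A and B, whose product B·A
# is a low-rank update added on top of W.

def matmul(M, N):
    """Plain-Python matrix multiply."""
    return [[sum(M[i][k] * N[k][j] for k in range(len(N)))
             for j in range(len(N[0]))] for i in range(len(M))]

def matadd(M, N):
    """Elementwise matrix addition."""
    return [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(M, N)]

d, rank = 4, 1                       # rank << d is what makes a LoRA cheap

W = [[float(i == j) for j in range(d)] for i in range(d)]  # frozen base weights
A = [[1.0] * d]                      # trainable down-projection (rank x d)
B = [[0.0] for _ in range(d)]        # trainable up-projection (d x rank), zero-init

def effective_weights():
    """W plus the low-rank update B·A: what the adapted model actually uses."""
    return matadd(W, matmul(B, A))

# Zero-initialised B means the adapter starts out as a no-op:
assert effective_weights() == W

# Parameter budget: d*d frozen weights vs only 2*d*rank trainable ones.
print(d * d, "frozen vs", 2 * d * rank, "trainable")  # prints: 16 frozen vs 8 trainable
```

The point relevant to the thread below: because only the small adapter is trained, reproducing one artist's style takes a handful of their images and modest compute, which is exactly why opting out of the base training set didn't stop it.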

@molly0xfff here's the thing - if I copy a painting in Greg's style, AI or not, who is publishing that painting? Images are published by *people*, not robots or AIs; saying "AIs are copying my art" is like saying "Pencils can copy my art!"

It doesn't matter *how* they copy it, if a person copies work and publishes it, the *person* is at fault. It's on people publishing AI art to prove they haven't copied anything so overtly that it crosses the line into plagiarism

@molly0xfff that being said, it's 1000% clear that the randos in /r/stablediffusion, and likely people at bigger companies as well, give zero fucks about provenance (tracing which source images contributed to an output), which is deeply disappointing and frustrating :-/
@anizocani @molly0xfff What you said is a variant of the "guns don't kill people" argument. Machine learning systems aren't pencils. A thing has been made that both massively simplifies and obscures acts of piracy; laying it all on the user lets the ML creators off in the same way gun pushers are let off.
@bjn @molly0xfff uh, no? ML is not used solely for plagiarism and you could make an identical argument 20 years ago about Photoshop. It is a Tool, just like Photoshop is.
@anizocani @molly0xfff ML isn't an image editing program; it's trained on millions of other people's work. Photoshop (well, until recently) isn't. Just like a pencil isn't. Saying they are 'just a tool' ignores the fact that the tool was fundamentally built on copying. So when it produces copies (sometimes verbatim in LLMs) you can't say "ooh, it's the user's fault", because that's what the thing was designed to do.

@bjn @molly0xfff wait until you find out what the Selection tool does in Photoshop - it literally copies others' work!

Maybe you weren't around for this, but this was *actually* the argument, and people banned digital art for exactly these kinds of ideas, only even more literal

It didn't kill art then, it actually mostly made it better, and I don't think this will kill art either

@anizocani @molly0xfff I've been writing software to manipulate images since the 80s, so I'm well aware of what Photoshop did then and does now. There was no ML in Photoshop until recently.

Where did I say that ML will kill art? I'm saying it's built to copy art at best and will steal art at worst. I'm also saying that the people who make it know their tool will be used for that, so putting infringement all on the user of the ML system, rather than on the creators of the system, is wrong.

@bjn @molly0xfff it sounds like we just disagree. Have a good day!
@anizocani @molly0xfff theft is theft, even if you get a computer to hide the fact from you.
@anizocani @bjn @molly0xfff art can't be killed. The huge issue is that artists who charge money for their art will see their incomes affected, because users will get for free what they could previously only get by paying the artist. Those artists didn't agree to have their art used for this purpose. At least before ML, copying from someone required practice and talent, and that produced derivative work which included the new artist's own style. Now it's just plain and simple theft...
@ChristianRVHM @anizocani @molly0xfff ML will still need to be paid for, but the money will go to the companies that scraped other people’s IP rather than the people who made it.
@bjn @anizocani @molly0xfff indeed, there is no way that a single penny will go to the artists. It will never be profitable to share with the artists.