AIs aren't sentient. They can't "steal."

Programmers and institutions select the data with which to train the model. They take art and writing from artists and authors without credit or payment. The software then remixes and mimics what it is given.

Displacing agency by attributing intent to the AI is exactly how people and institutions erase human action in the creation of technology. It also leads to further perceptions of technology as acultural, unbiased, and, in essence, magical.

@Manigarm This is an interesting point, and certainly correct.

It's also exactly how humans learn to become artists and writers - by studying, mimicking, and eventually adding to the existing body of work. We don't generally consider that theft, unless the copying is exact or deceptive.

Yet AI feels somehow different, much more like plagiarism. Perhaps it's that the ONLY input an ML system has is others' art, with no real-world human experience of its own to contribute.

@Manigarm I think part of it is that we expect art and literature to have a creator, an actual person whose work expresses a human point of view, one that encompasses something beyond the literal work itself. By lacking an author who stands behind it, is AI-generated art somehow inherently fraudulent? Maybe.
@Manigarm Is the person who runs an AI-based art generator and selects which ones are "good" any less an artist than Duchamp with his readymades?
@mattblaze
I think the labor and value-accrual/ownership analyses end up being much more useful than debating whether the outputs are art.
@Manigarm @mattblaze
@dymaxion @Manigarm They're related. Consider the two main ways (visual) artists are employed: as illustrators-for-hire (e.g., by publications) and as fine artists collected by rich investors. For the former, AI systems (currently) produce nice-looking illustrations, but they require extensive selection before yielding one that's an exact fit; it's probably cheaper and faster to just hire an artist. But for the latter, "Is it art?" is a central issue for collector value.

@mattblaze
Honestly, a lot more artists do small-scale sales in middle-class contexts than do sales to the rich, and in that context their work is bought for a more utilitarian understanding of decorativeness plus meaning, much closer to the illustration case. But what I mean about labor and value-accrual is on the other end. None of these systems work without the training data, and their outputs are derivative works that contain compressed versions of that training data, and yet the value accrues entirely to an intermediary. Artists have the right to set their own rates for derivative-works licensing. I'm not usually an IP maximalist, but there's a meaningful distinction between access to culture on an individual basis and the creation of a system intended to evolve to a point where the work of the people from whom its creators are stealing is entirely replaced.

Something entirely based on theft from other people cannot be art. An original tuned prompt absolutely can have artistic merit, but the artistic merit and "artness" of the result rests almost entirely on a) the work of the ML engineering team, and b), much more heavily, on the source material. Now, there's of course a long tradition of artists working with general-purpose software and having that software considered a tool, or at most (e.g., with reactive projection-mapping installations and TouchDesigner or Max) a medium, so we can set aside the first. The second, however, is and always will be an intrinsic component, massively more important than anything involved in prompt engineering. The prompt engineering has contributed almost none of the art and should receive almost none of the value.

It would be perfectly possible, if any of these companies cared, to create a fair licensing structure and to build systems that could compute a proportional attribution distribution across the source material behind each prompt's output, paid out in accordance with derivative-work prices determined by the rights holders. Until they do this, it's nothing but theft.
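The licensing scheme described above can be sketched in a few lines. Everything below is hypothetical: no current image generator exposes per-source attribution scores, so the `scores`, `rates`, and function names are illustrative assumptions, not any real system's API.

```python
# Hypothetical sketch: pay each rights holder their own derivative-work
# price, scaled by the share of attribution their work contributed to a
# single generated output. Attribution scores are assumed inputs.

def attribution_payouts(scores, rates):
    """Split one generation's licensing fee across its source works.

    scores: {work_id: attribution weight the model assigns this work}
    rates:  {work_id: derivative-work price set by the rights holder}
    """
    total = sum(scores.values())
    return {work: (score / total) * rates[work]
            for work, score in scores.items()}

def generation_fee(scores, rates):
    """Total licensing fee owed for one generated output."""
    return sum(attribution_payouts(scores, rates).values())
```

Under this toy scheme, a work contributing three quarters of the attribution for an output, with a rights-holder price of 100, would receive 75 for that generation; the intermediary collects and forwards the summed fee rather than keeping the value.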

@mattblaze
And yes, it's unclear whether IP law, created and maintained as a tool to make money flow towards capital and as often as not weaponized against individual artists these days, will support this understanding. Certainly, the ML image generators are operating in the fine Valley tradition of "ignore the law until we're big enough to buy the laws we need." Pretending that this is reasonable conduct, and evaluating the product as though it did not occur in a context of theft by the ML data-collection teams, is unreasonable.
@dymaxion @mattblaze the law is certainly not set up for this, but it's doubtful that it's infringement in the US, though it might run into moral rights elsewhere. I don't think theft is the right framework to think about AI datasets at all. The question is one of either displacement (artists losing employment) or of French-style moral rights, which isn't a great place to be. Displacement is the obvious first case, though what is being displaced? Is there material harm?
@dymaxion @mattblaze does that displacement harm the creative world, or does the creative world react and adapt? Is there loss or gain of money, prestige, etc.?
@dymaxion @mattblaze Then there’s moral rights, but if we say an artist should be able to control how a machine looks at and learns from art, how to we distinguish that from a human? And in what setting? With what tools? How many digital artifacts between a human and a piece do art are ok? How do we define that socially, much less legally?

@dymaxion @mattblaze If we somehow said that we didn't want to let any copyrighted art be consumed by AI, and AI art became a successful influence on culture anyway, are we not just washing out all innovation from the last 90 years? Will artists have to chase that fashion to stay relevant? If AI replaces artists, does that just push them more into the fine-art world of catering to the rich?

Can they be pushed more that way? They're already just about entirely there.

@dymaxion @mattblaze IP law doesn't stop being utterly problematic in the cases where you like it, nor does the neoliberal logic about art markets suddenly become desirable when a machine made the pictures. The art market and IP are hopelessly corrupt, in a very master's-tools way.
@dymaxion @mattblaze The political/economic system we live in makes your friends poor, Jaron, not filesharing, or now, AI.