Seeing more programmers take the stance of "the plagiarism machine works, so I guess we have to accept it".

Contrast that with art and creative industries, where theft and plagiarism aren't new phenomena. Unions keep these industries alive. And it's laughable to imagine telling artists that they have to accept a plagiarist in their communities.

Easy to say "let's all be nice to each other" when you're well paid and haven't lost your job yet. Some tough lessons will be learned soon.

You know how you fight corporations? With unions. Picket lines. New contracts blackballing plagiarists. Older workers walking out to support their (financially insecure) younger peers. That's what past generations did to maintain the world we enjoy today.

You do not fight by saying "plagiarists are people too", not picketing, and not walking out.

When you take this stance, you're trying to sound nice, but you're naive -- you're collecting your paycheck and condemning the next generation.

And as an aside: who cares if the plagiarism machines are getting better? I should hope a plagiarist can produce a final product.

It's not a grand observation that theft and plagiarism will help you accomplish a task faster. It's not new information that stealing things is cheaper than making them.

I do not care if Claude produces perfect code. It doesn't, and it won't, but even if it did I would not use it. Because I'm not a fucking plagiarist. And you shouldn't be either.

The action item you can take away from this rant: call tools like Claude what they are: plagiarism machines.

That's not a dig -- that's a more accurate description than "artificial intelligence".

When your coworker argues that it's merely automation, be the nerd who corrects them with "automated plagiarism".

Normalize describing gen AI accurately.

I strongly believe if more people understood how it works they would not use it.

@protowlf Do you have a good source (ironically) that sets out the reasoning for why we should regard the code coming out of one of these models as "plagiarism"?

They absolutely will spit out whole chunks of code that one can point to and say "that was copied from such-and-such without following the license".

But they also don't *always* do that. Are they *always* plagiarizing, or only sometimes?

@protowlf I looked and found https://lawreview.uchicago.edu/online-archive/plagiarism-copyright-and-ai but that is set in the academic and legal sphere, where a whole idea or concept being spit out without the citation to its originator constitutes "plagiarism".

I assume I'm not guilty of plagiarism in the computing field every time I apply a facade or visitor pattern without citing the Gang of Four.

Plagiarism, Copyright, and AI | The University of Chicago Law Review

Critics of generative AI often describe it as a “plagiarism machine.” They may be right, though not in the sense they mean. With rare exceptions, generative AI doesn’t just copy someone else’s creative expression, producing outputs that infringe copyright. But it does get its ideas from somewhere. And it’s quite bad at identifying the source of those ideas. That means that students (and professors, and lawyers, and journalists) who use AI to produce their work generally aren’t engaged in copyright infringement. But they are often passing someone else’s work off as their own, whether or not they know it. While plagiarism is a problem in academic work generally, AI makes it much worse because authors who use AI may be unknowingly taking the ideas and words of someone else. Disclosing that the authors used AI isn’t a sufficient solution to the problem because the people whose ideas are being used don’t get credit for those ideas. Whether or not a declaration that “AI came up with my ideas” is plagiarism, failing to make a good-faith effort to find the underlying sources is a bad academic practice. We argue that AI plagiarism isn’t—and shouldn’t be—illegal. But it is still a problem in many contexts, particularly academic work, where proper credit is an essential part of the ecosystem. We suggest best practices to align academic and other writing with good scholarly norms in the AI environment.

@protowlf I can see this making sense by a sort of "conservation of thought" argument: if those big matrices are definitionally devoid of the spark of original thought, then anything that comes out is definitionally derivative, copied from somewhere even if the sources are so scattered and numerous that they are permanently unidentifiable.