There’s a meme going around that an Open Source project “can’t” prevent LLM use by contributors because there’s no technical means to enforce this. This is idiotic and shows just how disingenuous slopmongers will be when told they can’t just submit slop.

Did you know there’s also no technical means to enforce that you didn’t copy some code you’re contributing from a proprietary codebase and say it’s original work? Somehow we haven’t given up on that!

#opensource #ai #llm #slop

The enforcement mechanism is exactly the same: There’s no *technical means* to prevent someone from being a filthy fucking liar. But there are *social means* to prevent them from contributing: You make sure that if they’re caught, they’re held publicly accountable for all of the rework and mess that resulted from their lies.

This has worked pretty well for decades in Open Source, and won’t stop working just because slopmongers wish really hard. Fucking scrubs.

#opensource #ai #llm #slop

@eschaton This is what I dislike most about these bros: they always resort to "you're powerless to stop it". It really says a lot about their character.
@doombloomart @eschaton and a hell of a lot about their attitudes towards consent.
@doombloomart @eschaton Didn’t they (and it was mostly the same people) say the same about crypto no more than two years ago? Look what became of that.
@gavin57 I'm fairly convinced that these people aren't even serious about predictions like that, but say them so confidently to make critics doubt themselves and maybe to jerk off to it, idk
@doombloomart @eschaton I think they’re just incredibly naive. Don’t get me wrong, I also went through that phase (I even bought a PhysX card for my PC), but eventually they’ll learn the hard way that most of what comes out of the Silicon Valley bubble is…how do I put this politely…a solution looking for a problem.
@doombloomart @eschaton I literally said this the other day, FOSS has had to deal with this since day one, hell proprietary software developers have had to deal with this since day one. Any submission could be legally risky, this is not a new problem.
@reflex @doombloomart Right?! It’s just the same thing as always, that LLMs are involved does not make the situation novel!
@eschaton I mean this sincerely: we need to bring back shame. We need to bring back social consequences for being an antisocial fuck.

@eschaton ‘technically’ of course is the best kind of correct ;-)

These are the same folks who think the law is deterministic, so I wouldn’t put much stock in what they say.

@eschaton as usual, there's an xkcd on this exact point https://xkcd.com/1494/

@eschaton
Agreed about them, but there is a point: although there can be concrete evidence that someone has plagiarized from a published source, it’s harder to prove when the source is unpublished. So it’s easier for the offender to brazenly lie.

That said, there is no reason not to punish them if they admit it, privately or publicly.

@eschaton Yeah. It’s good to be realistic about compliance but not fatalistic. We may have to figure out how to encourage people to live up to these ideals in some cases, but we shouldn’t just give up on them.

I feel like there was a similar thing with masking. We know it works and we know people don’t always comply. Some people were eager to point out the lack of compliance as a reason to give up on masks entirely.

@eschaton One surprising thing I have learned from this is that some or even many projects feel obligated to consider big pull requests from random new contributors. I guess I am enough of a control freak that I would reject those out of hand regardless of how they came to be. Open-sourcing my work would never be an invitation to the world to assign me code reviews. That’s not help; that’s inserting that random new contributor’s priorities ahead of my own. Or maybe I just don’t understand open source.
@eschaton My point here is that it wouldn’t bother me at all to start rejecting big PRs from random LLMs. I would feel much worse telling a random human “I’m sorry you spent all that time writing code that I was never going to accept, but the project policy is…”

@eschaton Just enforce code quality. That'll pretty much eliminate slopomatic submissions on its own.

Better yet, start enforcing formal proofs and safe languages to go with it.

@etchedpixels I don’t want LLM submissions wasting my time to begin with. If it’s not written by a person, and they can’t or won’t attest that it’s their own work, they can fuck off.
@eschaton very Air Bud energy: "ain't no rule says a dog can't play basketball."
@eschaton "first time submissions will need to add a new file with the count of the number of 'r's in strawberry"
@eschaton You forgot the first rule of VC techbroism. If you tack "with a computer" onto something that's illegal or otherwise forbidden, it's suddenly no longer forbidden.

@eschaton As a FOSS maintainer, I'm very aware that the work being given away for free in these projects has been the result of humans burning their finite life to give the result to others.

Finally they are getting some help (not from the people posting this kind of wailing) and the complaint is "no, I only want the result of your human work, your actual life force in your code, or I feel cheated".

@hopeless If you don’t want to write code, then don’t. If you want to fill your own projects with slop, that’s up to you. But don’t tell people they have to accept slop and the liability it brings in *their* projects.

@eschaton

Current AI coding agents write code that passes the same static analysis as human-written code does. And I have literally been given many far worse patches by humans that needed significant work before they could be included.

1) If you don't understand what the Mar 2026 coding agents produce, why are you giving advice about it?

2) I sometimes wonder if FOSS has created a terrible moral hazard in the users, who thoughtlessly consume the work for free. It seems to create entitled monsters.