LLM-Driven Large Code Rewrites With Relicensing Are The Latest AI Concern

https://libretechni.ca/post/1009978

>The newest open-source concern around AI that is seeing a lot of interest this weekend is when large language models / AI code generators may rewrite large parts of a codebase and then the “developers” claiming an alternative license incompatible with the original source license. This became a real concern this week with a popular Python project experiencing an AI-driven code rewrite and now published under an alternative license that its original author does not agree with and incompatible with the original code.
>
>Chardet as a Python character encoding detector with its v7.0 release last week was a “ground-up, MIT-licensed rewrite of chardet.” This rewrite was largely driven via AI/LLM and claims to be up to 41x faster and offer an array of new features. But with this AI-driven rewrite, the license shifted from the LGPL to MIT.

I’m confused. Why is a licensing change needed? In this particular example they changed to MIT. Is this considered a first step toward paid features and other stuff?

My confusion is about why the licensing change was done, not whether it is valid. Since the original is LGPL, the relicensing is not valid, no doubt about that. But what the intention behind this change is, I don’t understand. Can someone explain?

LGPL is unenforceable with AI-generated code. LGPL puts certain constraints on how the code can be used, but if someone were to use AI-generated code in a way that violates its LGPL license, all that person has to say is that it’s AI-generated code, so it’s in the public domain and they’re free to do with it whatever they want, and they would legally be right.

What do you base this “all AI code is public domain by legal definition” claim on?

There has already been a ruling in the US that AI-generated art cannot be copyrighted because it lacks human authorship, so it stands to reason that the same is true for code. Even copyleft is ultimately dependent on copyright to be legally enforceable.

And even if all of the rest of the world were to decide otherwise about whether AI-generated works can be copyrighted (which I very much doubt would happen), given how much software development happens in the US, it would still make the license pretty toothless.

AI-generated art not being copyrightable doesn’t necessarily mean AI-generated art can’t violate original copyright, though.

This is not about AI-generated code being relicensed to different AI-generated code. It’s about the original licensed code being relicensed or otherwise violated through AI-generated code.

You’re not wrong, but I don’t see how it’s relevant to what I’m trying to say. Whether or not they’re legally allowed to change the license has nothing to do with why they might want to change the license.