The Supreme Court has just turned down a petition to hear an appeal in a case that held that AI-generated works can't be copyrighted. By declining the appeal, the Supreme Court took a massively consequential step to protect creative workers' interests:

https://www.theverge.com/policy/887678/supreme-court-ai-art-copyright

--

If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:

https://pluralistic.net/2026/03/03/its-a-trap/#inheres-at-the-moment-of-fixation

1/

At the core of the dispute is a bedrock principle of copyright law: copyright is for humans, and humans alone. In legal/technical terms, "copyright inheres at the moment of fixation of a work of human creativity." Most people - even people who work with copyright every day - have not heard it put in those terms. Nevertheless, it is the foundation of international copyright law, and of copyright in the USA.

2/

Here's what it means, in plain English:

a) When a human being,

b) does something creative; and

c) that creative act results in a physical record; then

d) a new copyright springs into existence.

For d) to happen, a), b) and c) all have to happen first. Each of these three steps has been hotly contested over the years.

3/

Remember the "monkey selfie," in which a photographer argued that he was entitled to the copyright after a monkey pointed a camera at itself and pressed the shutter button? That image was *not* copyrightable, because the monkey was a monkey, not a human, and copyright is only for humans:

https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute

4/

Then there's b), "doing something creative." Copyright only applies to *creative* work, not work itself. It doesn't matter how hard you labor over a piece of "IP" - if that work isn't creative, there's no copyright. For example, you can spend a fortune creating a phone directory, and you will get no copyright in the resulting work, meaning anyone can copy and sell it:

https://en.wikipedia.org/wiki/Feist_Publications,_Inc._v._Rural_Telephone_Service_Co.

5/

If you mix a *little* creative labor with the hard work, you can get a *little* copyright. A directory of "all the phone numbers for cool people" can get a "thin" copyright over the *arrangement* of facts, but such a copyright still leaves space for competitors to make many uses of that work without your permission:

https://pluralistic.net/2021/08/14/angels-and-demons/#owning-culture

6/

Finally, there's c): copyright is for *tangible* things, not intangibles. Part of the reason choreographers created a notation system for dance moves is that the moves themselves aren't copyrightable:

https://en.wikipedia.org/wiki/Dance_notation

The non-copyrightability of movement is (partly) why the noted sex-pest and millionaire grifter Bikram Choudhury was blocked from claiming copyright on ancient yoga poses (the other reason is that they are ancient!):

https://en.wikipedia.org/wiki/Copyright_claims_on_Bikram_Yoga

7/

Now, AI-generated works are certainly tangible (any AI output is necessarily fixed as physical traces on digital storage media). The *prompts* for an AI output can be creative and thus copyrightable (in the same way that notes to a writers' room or from an art director are). But the *output* from the AI *cannot* be copyrighted, because it is not a work of human authorship.

8/

This has been the position of the US Copyright Office from the start, when AI prompters started sending in AI-generated works and seeking to register copyrights in them. Stephen Thaler, a computer scientist who had prompted an image generator to produce a bitmap, kept appealing the Copyright Office's decision, seemingly without regard to the plain facts of the case and the well-established limits of copyright.

9/

By attempting to appeal his case all the way to the Supreme Court, Thaler has done every human artist a huge favor: his weak, ill-conceived case was easy for the Supreme Court to reject, and in so doing, the court has cemented the non-copyrightability of AI works in America.

10/

You may have heard that "Hard cases make bad law." Sometimes, there are edge-cases where following the law would result in a bad outcome (think of a Fourth Amendment challenge to an illegal search that lets a murderer go free). In these cases, judges are tempted to interpret the law in ways that distort its principles, and in so doing, create a bad precedent (the evidence from a bad search is permitted, and so cops stop bothering to get a warrant before searching people).

11/

This is one of the rare instances in which a bad case made *good* law. Thaler's case wasn't even close - it was an absolute loser from the jump. Normally, plaintiffs give up after being shot down by an agency like the Copyright Office or by a lower court. But not Thaler - he stuck with it all the way to the highest court in the land, bringing clarity to an issue that might have otherwise remained blurry and ill-defined for years.

12/

This is *wonderful* news for creative workers. It means that our bosses must pay humans to do work if they want to be granted copyright on the things they want to sell. The more that humans are involved in the creation of a work, the stronger the copyright on that work becomes - which means that the *less* a human contributes to a creative work, the harder it will be to prevent others from simply taking it and selling it or giving it away.

13/

This is so important. Our bosses *do not want to pay us*. When our bosses sue AI companies, it's not because they want to make sure we get paid.

The many pending lawsuits - from news organizations like the *New York Times*, wholesalers like Getty Images, and entertainment empires like Disney - all seek to establish that training an AI model is a copyright infringement.

14/

This is wrong as a technical matter: copyright clearly permits making transient copies of published works for the purpose of factual analysis (otherwise every search engine would be illegal). Copyright also permits performing mathematical analysis on those transient copies. Finally, copyright permits the publication of literary works (including software programs) that embed facts about copyrighted works - even billions of works:

https://pluralistic.net/2023/09/17/how-to-think-about-scraping/

15/

Sure, you can infringe copyright *with* an AI model - say, by prompting it to produce infringing images. But the mere fact that a technology can be used to infringe copyright doesn't make the technology itself infringing (otherwise every printing press, camera, and computer would be illegal):

https://en.wikipedia.org/wiki/Sony_Corp._of_America_v._Universal_City_Studios,_Inc.

Of course, the fact that copyright *currently* permits training models doesn't mean that it *must*.

16/

Copyright didn't come down from a mountain on two stone tablets. It's just a law, and laws can be amended. I think that amending copyright to ban training a model would inflict substantial collateral damage on everything from search engines to scholarship, but perhaps you disagree. Maybe you think that you could wordsmith a new copyright law that bans training without whacking a bunch of socially beneficial activities.

Even if that's so, *it still wouldn't help artists*.

17/

To understand why, consider Universal and Disney's lawsuit against Midjourney. The day that lawsuit dropped, I got a press release from the RIAA, signed by its CEO, Mitch Glazier. Here's how it began:

> There is a clear path forward through partnerships that both further AI innovation and foster human artistry. Unfortunately, some bad actors – like Midjourney – see only a zero-sum, winner-take-all game.

18/

The RIAA represents record labels, not film studios, but thanks to vertical integration, the big film studios are *also* the big record labels. That's why the RIAA alerted the press to its position on this suit.

There are two important things to note about the RIAA press release: how it opened, and how it closed. It opens by stating that the companies involved want "partnerships" with AI companies.

19/

In other words, if they establish that they have the right to control training on their archives, they *won't* use that right to prevent the creation of AI models that compete with creative workers. Rather, they will use that right to *get paid* when those models are created.

Expanding copyright to cover models isn't about *preventing* generative AI technologies - it's about ensuring that these technologies are licensed by incumbent media companies.

20/

This licensure would ensure that media companies would get paid for training, but it would also let them set the terms on which the resulting models were used. The studios could demand that AI companies put "guardrails" on the resulting models to stop them from being used to output things that might compete with the studios' own products.

21/

That's what the opening of this press-release signifies, but to really understand its true meaning, you have to look at the *closing* of the release: the signature at the bottom of it, "Mitch Glazier, CEO, RIAA."

Who is Mitch Glazier? Well, he *used* to be a Congressional staffer. He was the guy responsible for sneaking a clause into an unrelated bill that repealed "termination of transfer" for musicians.

22/

"Termination" is a part of copyright law that lets creators take back their rights after 35 years, even if they originally signed a contract for a "perpetual license."

Under termination, all kinds of creative workers who got royally screwed at the start of their careers were able to get their copyrights back and re-sell them. The primary beneficiaries of termination are musicians, who signed notoriously shitty contracts in the 1950s-1980s:

https://pluralistic.net/2021/09/26/take-it-back/

23/

When Mitch Glazier snuck a termination-destroying clause into legislation, he set the stage for the poorest, most abused, most admired musicians in recording history to lose access to money that let them buy a couple bags of groceries and make the rent. He condemned these beloved musicians to poverty.

What happened next is something of a Smurfs Family Christmas miracle.

24/

Musicians were so outraged by this ripoff, and their fans were so outraged on their behalf, that Congress convened a special session solely to repeal the clause that Mitch Glazier tricked them into voting for. Shortly thereafter, Glazier was out of Congress:

https://en.wikipedia.org/wiki/Mitch_Glazier

25/

But this story has a happy ending for Glazier, too - he might have been out of his government job, but he had a new gig, as CEO of the Recording Industry Association of America, where he earns more than $1.3 million/year to carry on the work he did in Congress - serving the interests of the record labels:

https://projects.propublica.org/nonprofits/organizations/131669037

26/

Mitch Glazier serves the interests of the *labels*, not musicians. He *can't* serve both interests, because every dime a musician takes home is a dime that the labels don't get to realize as profits. Labels and musicians are class enemies. The fact that many musicians are on the labels' side when they sue AI companies *does not* mean that the labels are on the musicians' side.

27/

What will the media companies do if they win their suits? Glazier gives us the answer in the opening of his release: they will create "partnerships" with AI companies to train models on the work we produce.

This is the lesson of the past 40 years of copyright expansion. For 40 years, we have expanded copyright in every way: copyright lasts longer, covers more works, prohibits more uses without licenses, establishes higher penalties, and makes it easier to win those penalties.

28/

Today, the media industry is larger and more profitable than at any time in its history, *and* the share of those profits that artists take home is smaller than ever.

How has the expansion of copyright led to media companies getting richer and artists getting poorer? That's the question that Rebecca Giblin and I answer in our 2022 book *Chokepoint Capitalism*.

29/

In a nutshell: in a world of five publishers, four studios, three labels, two app companies and one company that controls all ebooks and audiobooks, giving a creative worker more copyright is like giving your bullied kid extra lunch money. It doesn't matter how much lunch money you give that kid - the bullies will take it all, and the kid will go hungry:

https://pluralistic.net/2022/08/21/what-is-chokepoint-capitalism/

30/

Indeed, if you keep giving that kid more lunch money, the bullies will eventually have enough dough that they'll hire a fancy ad-agency to blitz the world with a campaign insisting that our schoolkids are all going hungry and need *even more* lunch money (they'll take that money, too).

31/

When Mitch Glazier - who got a $1m+/year job with the labels after attempting to pauperize musicians - writes on behalf of Disney in support of a copyright suit to establish that copyright prevents training a model without a license, he's not defending creative workers.

32/

Disney is the company that takes the position that if it buys a company like Lucasfilm or Fox, it only acquires the *right* to use the works we made for those companies, but not the *obligation* to pay us when they do:

https://pluralistic.net/2021/04/29/writers-must-be-paid/#pay-the-writer

If a new, unambiguous copyright over model training comes into existence - whether through a court precedent or a new law - then all our contracts will be amended to non-negotiably require us to assign that right to our bosses.

33/
