The Supreme Court has just declined to hear an appeal in a case holding that AI-generated works can't be copyrighted. By turning down the appeal, the Supreme Court took a massively consequential step to protect creative workers' interests:

https://www.theverge.com/policy/887678/supreme-court-ai-art-copyright

--

If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:

https://pluralistic.net/2026/03/03/its-a-trap/#inheres-at-the-moment-of-fixation

1/

At the core of the dispute is a bedrock of copyright law: that copyright is for humans, and humans alone. In legal/technical terms, "copyright inheres at the moment of fixation of a work of human creativity." Most people - even people who work with copyright every day - have not heard it put in those terms. Nevertheless, it is the foundation of international copyright law, and copyright in the USA.

2/

Here's what it means, in plain English:

a) When a human being,

b) does something creative; and

c) that creative act results in a physical record; then

d) a new copyright springs into existence.

For d) to happen, a), b) and c) all have to happen first. Each of these three requirements has been hotly contested over the years.

3/

Remember the "monkey selfie," in which a photographer argued that he was entitled to the copyright after a monkey pointed a camera at itself and pressed the shutter button? That image was *not* copyrightable, because the monkey was a monkey, not a human, and copyright is only for humans:

https://en.wikipedia.org/wiki/Monkey_selfie_copyright_dispute

4/

Then there's b), "doing something creative." Copyright only applies to *creative* work - mere "sweat of the brow" doesn't qualify. It doesn't matter how hard you labor over a piece of "IP" - if that work isn't creative, there's no copyright. For example, you can spend a fortune creating a phone directory, and you will get no copyright in the resulting work, meaning anyone can copy and sell it:

https://en.wikipedia.org/wiki/Feist_Publications,_Inc._v._Rural_Telephone_Service_Co.

5/

If you mix a *little* creative labor with the hard work, you can get a *little* copyright. A directory of "all the phone numbers for cool people" can get a "thin" copyright over the *arrangement* of facts, but such a copyright still leaves space for competitors to make many uses of that work without your permission:

https://pluralistic.net/2021/08/14/angels-and-demons/#owning-culture

6/

Finally, there's c): copyright is for works fixed in *tangible* form, not for intangibles. Part of the reason choreographers created notation systems for dance is that unfixed moves aren't copyrightable - writing them down is what *fixes* them:

https://en.wikipedia.org/wiki/Dance_notation

The non-copyrightability of movement is (partly) why the noted sex-pest and millionaire grifter Bikram Choudhury was blocked from claiming copyright on ancient yoga poses (the other reason is that they are ancient!):

https://en.wikipedia.org/wiki/Copyright_claims_on_Bikram_Yoga

7/

Now, AI-generated works are certainly tangible (any work by an AI *must* involve physical traces on digital storage media). The *prompts* for an AI output can be creative and thus copyrightable (in the same way that notes to a writers' room or from an art director are). But the *output* from the AI *cannot* be copyrighted, because it is not a work of human authorship.

8/

This has been the position of the US Copyright Office from the start, when AI prompters started sending in AI-generated works and seeking to register copyrights in them. Stephen Thaler, a computer scientist who had prompted an image generator to produce a bitmap, kept appealing the Copyright Office's decision, seemingly without regard to the plain facts of the case and the well-established limits of copyright.

9/

By attempting to appeal his case all the way to the Supreme Court, Thaler has done every human artist a huge favor: his weak, ill-conceived case was easy for the Supreme Court to reject, and in so doing, the court has cemented the non-copyrightability of AI works in America.

10/

You may have heard that "Hard cases make bad law." Sometimes, there are edge-cases where following the law would result in a bad outcome (think of a Fourth Amendment challenge to an illegal search that lets a murderer go free). In these cases, judges are tempted to interpret the law in ways that distort its principles, and in so doing, create a bad precedent (the evidence from a bad search is permitted, and so cops stop bothering to get a warrant before searching people).

11/

This is one of the rare instances in which a bad case made *good* law. Thaler's case wasn't even close - it was an absolute loser from the jump. Normally, plaintiffs give up after being shot down by an agency like the Copyright Office or by a lower court. But not Thaler - he stuck with it all the way to the highest court in the land, bringing clarity to an issue that might have otherwise remained blurry and ill-defined for years.

12/

This is *wonderful* news for creative workers. It means that our bosses must pay humans to do work if they want to be granted copyright on the things they want to sell. The more that humans are involved in the creation of a work, the stronger the copyright on that work becomes - which means that the *less* a human contributes to a creative work, the harder it will be to prevent others from simply taking it and selling it or giving it away.

13/

This is so important. Our bosses *do not want to pay us*. When our bosses sue AI companies, it's not because they want to make sure we get paid.

The many pending lawsuits - from news organizations like the *New York Times*, wholesalers like Getty Images, and entertainment empires like Disney - all seek to establish that training an AI model is a copyright infringement.

14/

This is wrong as a technical matter: copyright clearly permits making transient copies of published works for the purpose of factual analysis (otherwise every search engine would be illegal). Copyright also permits performing mathematical analysis on those transient copies. Finally, copyright permits the publication of literary works (including software programs) that embed facts about copyrighted works - even billions of works:

https://pluralistic.net/2023/09/17/how-to-think-about-scraping/

15/

Sure, you can infringe copyright *with* an AI model - say, by prompting it to produce infringing images. But the mere fact that a technology can be used to infringe copyright doesn't make the technology itself infringing (otherwise every printing press, camera, and computer would be illegal):

https://en.wikipedia.org/wiki/Sony_Corp._of_America_v._Universal_City_Studios,_Inc.

Of course, the fact that copyright *currently* permits training models doesn't mean that it *must*.

16/

Copyright didn't come down from a mountain on two stone tablets. It's just a law, and laws can be amended. I think that amending copyright to ban training a model would inflict substantial collateral damage on everything from search engines to scholarship, but perhaps you disagree. Maybe you think that you could wordsmith a new copyright law that bans training without whacking a bunch of socially beneficial activities.

Even if that's so, *it still wouldn't help artists*.

17/

To understand why, consider Universal and Disney's lawsuit against Midjourney. The day that lawsuit dropped, I got a press release from the RIAA, signed by its CEO, Mitch Glazier. Here's how it began:

> There is a clear path forward through partnerships that both further AI innovation and foster human artistry. Unfortunately, some bad actors – like Midjourney – see only a zero-sum, winner-take-all game.

18/

The RIAA represents record labels, not film studios, but thanks to vertical integration, the big film studios are *also* the big record labels. That's why the RIAA alerted the press to its position on this suit.

There are two important things to note about the RIAA press release: how it opened, and how it closed. It opens by stating that the companies involved want "partnerships" with AI companies.

19/

In other words, if they establish that they have the right to control training on their archives, they *won't* use that right to prevent the creation of AI models that compete with creative workers. Rather, they will use that right to *get paid* when those models are created.

Expanding copyright to cover models isn't about *preventing* generative AI technologies - it's about ensuring that these technologies are licensed by incumbent media companies.

20/

This licensing would ensure that media companies get paid for training, but it would also let them set the terms on which the resulting models were used. The studios could demand that AI companies put "guardrails" on the resulting models to stop them from being used to output things that might compete with the studios' own products.

21/

That's what the opening of this press release signifies, but to really understand its true meaning, you have to look at the *closing* of the release: the signature at the bottom, "Mitch Glazier, CEO, RIAA."

Who is Mitch Glazier? Well, he *used* to be a Congressional staffer. He was the guy responsible for sneaking a clause into an unrelated bill that repealed "termination of transfer" for musicians.

22/

"Termination" is a part of copyright law that lets creators take back their rights after 35 years, even if they originally signed a contract for a "perpetual license."

Under termination, all kinds of creative workers who got royally screwed at the start of their careers were able to get their copyrights back and re-sell them. The primary beneficiaries of termination are musicians, who signed notoriously shitty contracts in the 1950s-1980s:

https://pluralistic.net/2021/09/26/take-it-back/

23/

When Mitch Glazier snuck a termination-destroying clause into legislation, he set the stage for the poorest, most abused, most admired musicians in recording history to lose access to money that let them buy a couple bags of groceries and make the rent. He condemned these beloved musicians to poverty.

What happened next is something of a Smurfs Family Christmas miracle.

24/

Musicians were so outraged by this ripoff, and their fans were so outraged on their behalf, that Congress convened a special session solely to repeal the clause that Mitch Glazier tricked them into voting for. Shortly thereafter, Glazier was out of Congress:

https://en.wikipedia.org/wiki/Mitch_Glazier

25/

But this story has a happy ending for Glazier, too - he might have been out of his government job, but he had a new gig, as CEO of the Recording Industry Association of America, where he earns more than $1.3 million/year to carry on the work he did in Congress - serving the interests of the record labels:

https://projects.propublica.org/nonprofits/organizations/131669037

26/

Mitch Glazier serves the interests of the *labels*, not musicians. He *can't* serve both interests, because every dime a musician takes home is a dime that the labels don't get to realize as profits. Labels and musicians are class enemies. The fact that many musicians are on the labels' side when they sue AI companies *does not* mean that the labels are on the musicians' side.

27/

What will the media companies do if they win their suits? Glazier gives us the answer in the opening of his release: they will create "partnerships" with AI companies to train models on the work we produce.

This is the lesson of the past 40 years of copyright expansion. For 40 years, we have expanded copyright in every way: copyright lasts longer, covers more works, prohibits more uses without licenses, establishes higher penalties, and makes it easier to win those penalties.

28/

Today, the media industry is larger and more profitable than at any time, *and* the share of those profits that artists take home is smaller than ever.

How has the expansion of copyright led to media companies getting richer and artists getting poorer? That's the question that Rebecca Giblin and I answer in our 2022 book *Chokepoint Capitalism*.

29/

In a nutshell: in a world of five publishers, four studios, three labels, two app companies and one company that controls all ebooks and audiobooks, giving a creative worker more copyright is like giving your bullied kid extra lunch money. It doesn't matter how much lunch money you give that kid - the bullies will take it all, and the kid will go hungry:

https://pluralistic.net/2022/08/21/what-is-chokepoint-capitalism/

30/

Indeed, if you keep giving that kid more lunch money, the bullies will eventually have enough dough that they'll hire a fancy ad-agency to blitz the world with a campaign insisting that our schoolkids are all going hungry and need *even more* lunch money (they'll take that money, too).

31/

When Mitch Glazier - who got a $1m+/year job with the labels after attempting to pauperize musicians - writes on behalf of Disney in support of a copyright suit to establish that copyright prevents training a model without a license, he's not defending creative workers.

32/

Disney is the company that takes the position that if it buys a company like Lucasfilm or Fox, it only acquires the *right* to use the works we made for those companies, but not the *obligation* to pay us when they do:

https://pluralistic.net/2021/04/29/writers-must-be-paid/#pay-the-writer

If a new, unambiguous copyright over model training comes into existence - whether through a court precedent or a new law - then all our contracts will be amended to non-negotiably require us to assign that right to our bosses.

33/

And our bosses will enter into "partnerships" to train models on our works. And those models will exist for one purpose: to let them create works without paying us.

The market concentration that lets our bosses dictate terms to us is getting *much* worse, and it's only speeding up. Getty Images - who sued Stability AI over image generation - is merging with Shutterstock:

https://globalcompetitionreview.com/gcr-usa/article/photographers-alarmed-gettyshutterstock-merger

And Paramount is merging with Warners:

https://pluralistic.net/2026/02/28/golden-mean/#reality-based-community

34/

This is where this new SCOTUS action comes in. A new copyright that covers training is just one more thing these increasingly powerful members of this increasingly incestuous cartel can force us to sign away. That new copyright isn't something for us to bargain *with*, it's something we'll bargain *away*.

But the fact that the works that a model produces are automatically in the public domain is something we *can't* bargain away. It's a legal *fact*, not a legal *right*.

35/

It means the more humans there are involved in the creation of a work, the more copyrightable the work is.

Media bosses love AI because it dangles a tantalizing possibility of running a firm without ego-shattering confrontations with creative workers who know how to do things. It's the solipsistic fantasy of a world without workers, in which a media boss conceives of a "product," prompts a sycophantic AI, and receives an item that's ready for sale:

https://pluralistic.net/2026/01/05/fisher-price-steering-wheel/#billionaire-solipsism

36/

Many bosses know this isn't within reach. They imagine that they'll get the AI to shit out a script and then pay a writer on the cheap to "polish" it. They think they'll get an AI to shit out a motion sequence, a still, or a 3D model and then pay a human artist pennies to put the "final touches" on it.

37/

But the Copyright Office's position is that *only* those human contributions are eligible for a copyright: a few editorial changes, a few pixels or vectors rearranged. Everything else is in the public domain.

Here's the cool part: the only thing our bosses hate more than paying us is when other people take their stuff without paying for it. To achieve the kind of control they demand, they will have to pay *us* to make creative works.

38/

What's more, the fact that AI-generated works are in the public domain leaves a lot of uses that *don't* harm creative workers intact. You can amuse yourself and your friends with all the AI slop you can generate; the fact that it's not copyrightable doesn't matter to that use. I happen to think AI "art" is shit, but you do you:

https://pluralistic.net/2024/05/13/spooky-action-at-a-close-up/#invisible-hand

39/

This also means that if you're a writer who likes to brainstorm with a chatbot as you develop an idea, that's fine, so long as the AI's words don't end up in the final product. Creative workers already assemble "mood boards" and clippings for inspiration, and those are likewise fine so long as they aren't incorporated into the final work.

That's just what the Hollywood writers bargained for in their historic strike over AI.

40/

They retained the right to use AI *if they wanted to*, but their bosses couldn't *force* them to:

https://pluralistic.net/2023/10/01/how-the-writers-guild-sunk-ais-ship/

The Writers Guild were able to bargain with the heavily concentrated studios because they are organized in a union. Not just any union, either: the Writers Guild (along with the other Hollywood unions) are able to undertake "sectoral bargaining" - that's when a union can negotiate a contract with *all* the employers in a sector at once.

41/

Sectoral bargaining was once the standard for labor relations, but it was outlawed by the 1947 Taft-Hartley Act, which clawed back many of the important labor rights established by the New Deal's National Labor Relations Act. To get Taft-Hartley through Congress, its authors had to compromise by grandfathering in the powerful Hollywood unions, which retained their right to sectoral bargaining. More than 75 years later, that sectoral bargaining right *still* protects those workers.

42/