"AI psychosis" is one of those terms that is incredibly useful and also almost certainly going to be deprecated in smart circles in short order because it is: a) useful; b) easily colloquialized to describe related phenomena; and c) adjacent to medical issues.There's a group of people who feel very strongly any metaphor that implicates human health is intrinsically stigmatizing and must be replaced with an awkward, lengthy phrase that no one can remember and only insiders understand.

1/

If you'd like an essay-formatted version of this thread to read or share, here's a link to it on pluralistic.net, my surveillance-free, ad-free, tracker-free blog:

https://pluralistic.net/2026/03/12/normal-technology/#bubble-exceptionalism

2/


So while we still can, let us revel in this useful term to talk about some very real pathologies in our world.

Formally, "AI psychosis" describes people who have delusions that are possibly induced, and definitely reinforced and magnified, by a chatbot. AI psychosis is clearly alarming for people whose loved ones fall prey to it, and it has been the subject of much press and popular attention, especially in the extreme cases where it has resulted in injury or death.

3/

It's possible for AI psychosis to be both a new and alarming phenomenon and also to be on a continuum with existing phenomena. Paranoid delusions aren't new, of course. Take "Morgellons Disease," a psychosomatic belief that you have wires growing in your body, which causes sufferers to pick at their skin to the point of creating suppurating wounds.

4/

Morgellons emerged in the 2000s, but the name refers to a 17th-century case report of a patient who suffered from a similar delusion:

https://en.wikipedia.org/wiki/A_Letter_to_a_Friend

Morgellons is *both* a 400-year-old phenomenon and an internet pathology. How can that be? Because the internet makes it easier for people with sparsely distributed traits to locate one another.

5/


That is why the internet era is characterized by the coherence of people with formerly fringe characteristics into organized blocs, for better (gender minorities, #MeToo) and worse (Nazis).

Morgellons is rare, but if you suffer from it, it's easy for you to locate virtually *every* other person in the world with the same delusion and for all of you to reinforce and egg on your delusional beliefs.

6/

Morgellons isn't the only delusion that the internet reinforces, of course. "Gang stalking delusion" is a belief in a shadowy gang of sadistic tormentors who hide messages in song lyrics, public signage, and innuendo in overheard snatches of other people's conversations. It is an incredibly damaging delusion that ruins people's lives.

Gang stalking delusion isn't new, either - as with Morgellons, there are historical accounts of it going back centuries.

7/

But the internet supercharged gang stalking delusion by making it easy for GSD sufferers to find one another and reinforce one another's beliefs, helping each other spin elaborate explanations for why the relatives, therapists, and friends who try to help them are actually in on the conspiracy. The result is that GSD sufferers end up ever more isolated from people who are trying mightily to save them, and more connected to people who drive them to self-harm.

8/

Enter chatbots. Ready access to eager-to-please LLMs at every hour of the day or night means that you don't even have to find a forum full of people with the same delusion as you, nor do you have to wait for a reply to your anguished message. The LLM is always there, ready to fire back a "yes-and" improv-style response that drives you deeper and deeper into delusion:

https://pluralistic.net/2025/09/17/automating-gang-stalking-delusion/

9/


It's possible that there are delusions that are even more rare than GSD or Morgellons that AI is surfacing. Imagine if you were prone to fleeting delusional beliefs (and whomst amongst us hasn't experienced the bedrock certainty that we put something down *right here*, only to find it somewhere else and not have any idea how that happened?). Under normal circumstances, these cognitive misfires might be fleeting moments of discomfort, quickly forgotten.

10/

But if you are already habituated to asking a chatbot to explain things you don't understand, it might well yes-and you into an internally consistent, entirely wrong belief - that is, a delusion.

Think of how often you noticed "42" after reading *Hitchhiker's Guide to the Galaxy*, or how many times "6-7" crops up once you've experienced a baseline of exposure to adolescents.

11/

Now imagine that an obsequious tale-spinner was sitting at your elbow, helpfully noting these coincidences and fitting them into a folie-a-deux mystery play that projected a grand, paranoid narrative onto the world. Every bit of confirming evidence is lovingly cataloged, all disconfirming evidence is discounted or ignored. It's fully automated luxury QAnon - a self-baking conspiracy that harnesses an AI in service of driving you deeper and deeper into madness.

12/

That's the original "AI psychosis" that the term was coined to describe. As Sam Cole notes in her excellent "How to Talk to Someone Experiencing 'AI Psychosis,'" mental health practitioners are not entirely comfortable with the "psychosis" label:

https://www.404media.co/ai-psychosis-help-gemini-chatgpt-claude-chatbot-delusions/

13/


"Psychosis" here is best understood as an *analogy*, not a diagnosis, and, as already noted, there is a large cohort of very persistent people who make it their business to eradicate analogies that make reference to medical or health-related phenomena. But these analogies are very hard to kill, because they do useful work in connecting unfamiliar, novel phenomena with things we already understand.

It's true that these analogies *can* be stigmatizing, but they *needn't* be.

14/

As someone with an autoimmune disorder, I am not bothered by people who describe ICE as an autoimmune disorder in which antibodies attack the host, threatening its very life. I am capable of understanding "autoimmune disorder" as referring to both a literal, medical phenomenon; *and* a figurative, political one. I have never found myself confusing one for the other.

15/

"AI psychosis" is one of those very useful analogies, and you can tell, because "AI psychosis" has found even *more* metaphorical uses, describing *other* bad beliefs about AI. Today, I want to talk about three of these AI psychoses, and how they relate to one another: the investor AI delusion, the boss AI delusion, and the critic AI delusion.

16/

Let's start with the investors' delusion. AI started as an investment project from the usual suspects: venture capitalists, private wealth funds, and tech monopolists with large cash reserves and ready access to loans during the cheap credit bubble. These entities are accustomed to making large, long-shot bets, and they were extremely motivated to find new markets to grow into and take over.

17/

Growing companies *need* to keep growing, but not because they have "the ideology of a tumor." Growing companies' imperative to keep growing isn't ideological at all - it's material. Growth companies' stock trades at a high price-to-earnings (P/E) multiple, which means that they can use their stock like money when buying other companies and hiring key employees.

18/

But once those companies' growth slows down, investors revalue those shares at a much lower P/E multiple, which makes individual executives at the company (who are primarily paid in stock) *personally* much poorer, prompting their departure. It simultaneously kneecaps the company's ability to grow through acquisition and hiring, because a company with a falling share price has to buy things with cash, not stock.
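To make the arithmetic concrete, here's a minimal sketch with made-up numbers (no real company's figures) showing how multiple compression destroys paper wealth even when earnings don't change at all:

```python
# Illustrative only: same earnings, different P/E multiple.
annual_earnings = 2_000_000_000  # $2B in profit, identical in both scenarios

growth_pe = 40  # multiple investors pay while they expect growth
mature_pe = 15  # multiple after growth slows

market_cap_growth = annual_earnings * growth_pe  # $80B valuation
market_cap_mature = annual_earnings * mature_pe  # $30B valuation

# A hypothetical executive holding 0.1% of the company in stock:
exec_stake = 0.001
wealth_before = market_cap_growth * exec_stake  # $80M on paper
wealth_after = market_cap_mature * exec_stake   # $30M on paper

print(f"Exec's stake before: ${wealth_before:,.0f}")
print(f"Exec's stake after:  ${wealth_after:,.0f}")
```

The company earned exactly as much money in both cases; the only thing that changed was the story investors believe about future growth.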

19/

Companies can make more of their own stock on demand, simply by typing zeroes into a spreadsheet - but they can only get cash by convincing a customer, creditor or investor to part with some of their own:

https://pluralistic.net/2025/03/06/privacy-last/#exceptionally-american

Tech companies have absurdly large market shares - think of Google's 90% search dominance - and so they've spent 15+ years coming up with increasingly absurd gambits to convince investors that they will continue to grow by capturing *other* markets.

20/


At first, these companies claimed that they were on the verge of eating one another's lunches (Google would destroy Facebook with G+; Facebook would do the same to YouTube with the "pivot to video").

This has a real advantage in that one need not speculate about the potential value of Facebook's market - you only have to look at Facebook's quarterly reports.

21/

But the downside is that Facebook has its own ideas about whether Google is going to absorb its market, and they are prone to forcefully make the case that this won't happen.

After a few tumultuous years, tech giants switched to promoting growth via speculative new markets - metaverse, web3, crypto, blockchain, etc. Speculative new markets are *speculative*, and the weakness of that is that no one can say how big those markets might be.

22/

@pluralistic I don't mind using an analogy, but there's a difference—analogously, too!—between having a delusion and experiencing full-blown psychosis.

While your examples below sure seem like delusions, the psychosis term feels like a bad analogy that muddies the waters more than it fosters understanding.

@pluralistic
An entire school's faculty accusing a bullied kid, behind their back, of suffering from gang stalking delusion in order to abdicate their responsibility to grease the squeaky wheel can ruin lives too. Or it can just lead to the bullies concluding they can get away with anything and escalating to violence, while the teachers ignore visible wounds until the parents, who had gone along with the grown-up conspiracy until then, yell at them.

Ask me how I know...