Marriage over, €100,000 down the drain: the AI users whose lives were wrecked by delusion

https://lemmus.org/post/21169587

I think this is both scary and very interesting. What kind of person do you have to be to become addicted like them? Is this the same as gambling addiction? Is there a genetic component? Would this type of personality also be susceptible to hypnosis, cults, delusions about their idol, and so on? Or is this something that can happen to anyone who is depressed and feels lonely? How did the LLM even earn enough trust? In a cult there are a lot of people reaffirming the beliefs, so that is a lot easier to understand.

It is so hard to understand, even though I really want to. I have never cared about an object or an idol/celebrity. I could never take an AI seriously as a living being; the only emotions it triggers in me are frustration, or whatever you feel about a tool that works as it should, so pretty much apathy. Do you need to be very empathetic towards objects? Like seeing faces in everything and getting emotionally attached?

A lot of questions that I do not think anyone here can answer haha, but maybe one of them.

What kind of person do you have to be to become addicted like them?

Human cognition degrades with stress, exhaustion, and trauma. If you’re in a position where turning to an AI for relationship advice seems like a good idea, you’re probably already suffering from one or more of the above.

Also doesn’t help that AIs are sycophantic precisely because sycophancy is addictive. This isn’t a “type of person” so much as a “tool engineered towards chronic use”. It’s like asking “What kind of person regularly smokes crack?”

Do you need to be very empathetic towards objects? Like seeing faces in everything and getting emotionally attached?

I’ll give you a personal example. I have a friend who is currently pregnant and going through a bad breakup with her baby-daddy. She’s a trial lawyer by trade - very smart, very motivated, very well-to-do, but also horribly overworked, living by herself, and suffering from all the biochemical consequences of turning a single-celled organism into a human being.

As a result of some poorly conceived remarks, she’s alienated herself from a number of close friends to the point where we doubt there’s going to be a baby shower. Part of the impulse to say these things came from her own drama. But part of it came from her discovering ChatGPT as a tool to analyze other people’s statements. This has created a vicious behavioral spiral, during which she says something regrettable and gets a regrettable response in turn. She plugs the conversation into ChatGPT, because she has nobody else to talk to. And ChatGPT feeds her some self-affirming bullshit that inflates her ego far enough to say another stupid thing.

To complicate matters, her baby daddy is also using ChatGPT to analyze her conversations. And he’s decided she’s cheated on him, the baby isn’t his, and she’s plotting to scam him.

So now you’ve got two people - already stressed and exhausted - getting fed a series of toxic delusions by a machine that is constantly reaffirming you in a way none of your friends or family are. It’s compounding your misery, which drives anxiety and sends you back to the machine that offers temporary relief. But the advice from the machine yields more misery down the line, raising your anxiety and sending you back to the machine.

What’s producing this feedback loop? You could argue it is the individual, foolish enough to engage with the machine to begin with. But that’s far more circumstantial than personality driven. If my friend didn’t have a cell phone, she wouldn’t be reaching for ChatGPT. If she wasn’t pregnant, she wouldn’t be so stressed and anxious. If she wasn’t in a fight with her boyfriend, she wouldn’t be feeding conversations into the prompt engine.

Thanks for giving me a real life example.

I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas (like the guy in the article). But I find her story a lot easier to understand. It adds another layer, and it made me think.

It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her. I assume it works most of the time and is a big help for many things that the baby daddy could have helped with instead, if they were still a happy couple. I assume the biggest draw is that it lets her turn off her brain. Which is why she has become dependent on the only stable and consistent thing in her life (that is my assumption about how she feels). Maybe that’s mostly how it goes: it starts with using it as a tool, then you get lazy (for lack of a better term), and it keeps snowballing from there.

I feel for everyone involved. I hope she gets better soon, and I hope you do too; being overworked and stressed really destroys you and the people around you in many ways.

I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas

It’s a conversation you’re having on the internet with an agent that sounds like a human. People get invested for the same reason they get catfished.

It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her.

That’s the nut of it. And ChatGPT tends to mix the pastiche of a well-researched argument with the kind of feel-good self-affirmations that win over their audience. So you’re getting what looks - at first glance - to be good advice. And then you’re getting glazed on top of it. And then it’s designed to tell you what you want to hear, so you’re getting affirmation bias.

I hope she gets better soon, and I hope you do too; being overworked and stressed really destroys you and the people around you in many ways.

I mean, that’s why human-to-human interactions are valuable. But it’s also why they’re difficult. Like any good medicine, it can taste bitter up front even if it’s what you need in the long run.

100%! That is why I always set it as my top priority to say yes to friends and family (as long as it is reasonable) or do spontaneous things with them even when I do not feel like doing anything that day. And some friends are really hard to schedule anything with because of life so you need to take the chance when you get it haha.

I feel the best when I am with the people I care about; covid really showed me that. So I do understand why some who do not have friends or family may create some kind of unhealthy relationship with GPT, just like some create unhealthy, even obsessive parasocial relationships with youtubers.

I have tried talking to GPT as a person, but it feels extremely uncomfortable and hollow. With a human I get stimulation, like knowledge; they challenge my views or ideas and give me different perspectives, which I feel really helps me understand the world better, and I miss all of that with GPT. It isn’t even creative and cannot inspire me with new ideas, but maybe that is a good thing if people tend to follow its instructions.

Do you talk to it? Other than giving it tasks.