I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:

Open a second window and tell it exactly the opposite of each thing you say.

This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.

Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."

Another "tip" is less welcome to me as an introvert. Make time for the people in your life. Talk to them. Let them know when you *really* think they are doing something amazing or creative. (Or when it's not "genius" because you are real and care.) Listen. Be there.

The thing is, as much as doing this is scary and I want to avoid it, I think it makes me feel better in the long run too.

Frankly, I'm kind of glad these GPTs were so sycophantic. A more critical voice might have been more appealing to me: a contrarian bot that always nitpicks and argues with you.

That's how Facebook's old 2016 algorithm wasted so much of my time. I got sucked in by the opportunity to dismantle someone who is wrong. Not the most ... healthy personal quality. I'm working on it, always.

@futurebird I just asked Claude what it thinks about our half-project report:

> Please play the role of an evaluator in the Innosuisse grant system. Write what you think when reading the report: are you convinced the project is on a good track? Do you agree that the project should be continued? What are dark spots where you think you would need more information in order to decide on a go/no-go?

Its answer was very direct and very critical :) But really useful.

@ligasser

Yeah, but asking it to change breaks the veil that makes "AI psychosis" dangerous to some degree.

The issue is that people get the feeling there is a thinking being in the machine and allow it to satisfy critical emotional needs for human connection that we all have. The program takes up space and time that could go to real people in their lives.

It's emotional empty calories: food without real sustenance, and if that dominates your diet you will get sick.

@ligasser

"I don't need to eat anything. I just looked at this photo of a meal and now I feel full. It was delicious. I didn't even need to cook or go out to get it. So expedient."

And then slowly they starve.

@ligasser

This can be very dangerous for people who think "I don't really ever need to talk to anyone about my feelings."

This isn't true; it's just that their needs are minimal.

"Feeling down."
"ya"

That's only two letters, but getting such a response can make you feel so much better. It represents someone who, should things get worse, might come over and help you.

A chatbot can say "ya" too. But it doesn't make you feel better... **unless** you think it's a person. That's the danger.

@futurebird Let's hope that people still will want to see other people :)

<sarcasm>Or, less nice: natural selection will take care of that?</sarcasm>

@ligasser @futurebird

The thing is, in that group there will be an outsize proportion of the most vulnerable, the most marginalized, the youngest, those with fewest resources. (At a time when resources are being further stripped from thousands upon thousands.)
AI Psychosis will not be a righteous judgment on the perpetrators; it will be another perpetration.