I've been reading about what really helped people who had problems with "AI Psychosis" and one tip jumped out at me:

Open a second window and tell it exactly the opposite of each thing you say.

This helps to expose the sycophancy and shatters the illusion of sincerity and humanity.

Thought it was worth sharing. And frankly, it's exactly such an exercise that made me disgusted with the tech. "It just says ANYTHING is wonderful and genius. I'm not special."

Another "tip" is less welcome to me as an introvert. Make time for the people in your life. Talk to them. Let them know when you *really* think they are doing something amazing or creative. (Or when it's not "genius" because you are real and care.) Listen. Be there.

The thing is, as much as doing this is scary and I want to avoid it, it makes me feel better in the long run too, I think.

Frankly, I'm kind of glad these GPTs were so sycophantic. A more critical voice might have been more appealing to me. A contrarian bot who always nitpicks and argues with you.

That's how facebook's old 2016 algorithm wasted so much of my time. I got sucked in by the opportunity to dismantle someone who was wrong. Not the most ... healthy personal quality. I'm working on it always.

@futurebird I just asked Claude what it thinks about our half-project report:

> Please play the role of an evaluator in the Innosuisse grant system. Write what you think when reading the report: are you convinced the project is on a good track? Do you agree that the project should be continued? What are dark spots where you think you would need more information in order to decide on a go/no-go?

Its answer was very direct and very critical :) But really useful.

@ligasser

Yeah, but asking it to change breaks, to some degree, the veil that makes "AI psychosis" dangerous.

The issue is that people get the feeling there is a thinking being in the machine and allow it to satisfy critical emotional needs for human connection that we all have. The program takes up space and time that could go to real people in their lives.

It's emotional empty calories. Food without real sustenance, and if that dominates your diet, you will get sick.

@ligasser

"I don't need to eat anything. I just looked at this photo of a meal and now I feel full. It was delicious. I didn't even need to cook or go out to get it. So expedient."

And then slowly they starve.

@ligasser

This can be very dangerous for people who think "I don't really ever need to talk to anyone about my feelings."

This isn't true; it's just that their needs are minimal.

"Feeling down."
"ya"

That's two letters, but getting such a response can make you feel so much better. It represents someone who, should things get worse, might come over and help you.

A chatbot can say "ya" too. But it doesn't make you feel better... **unless** you think it's a person. That's the danger.

@futurebird That reminds me of a situation I had a couple of months ago. I have a childhood friend who was my best friend for a long, long time, but we kind of drifted apart after he moved cities. Nevertheless, we at least congratulate each other on birthdays and write back and forth to talk about our lives a bit.

The last time I wrote to him we shared our personal problems and feelings. I told him he could always write to me if he needed someone to talk to, but he dismissed it, saying that it's fine and that he has an AI which he uses for that. I've got to be honest: that kind of hurt me, since I sincerely wanted to help with his emotional burden and I felt like I'd just been pushed aside.

Sorry, had to think about that and I felt like I needed to let that out.

@flamecat @futurebird That sounds so upsetting, like you've been replaced and valued less than a robot. I'm sorry that happened to you.
@Akki @futurebird Yeah, it felt like it. I would be shocked if that was his intention, but it still sucks and just makes me feel like shit.
@flamecat @futurebird We've been told we burden others with our lives, so we now default to "not troubling anyone" when actually that's what makes us human

we’ve been sold that lie by the same kind of predators selling AI.

@Akki @flamecat @futurebird