I sometimes wonder how people could cope with the fact that they actually enslaved people, but then I remember that they merely treated them as "tools". "Just a tool," they would say when talking about them.

I wonder how big, from a neurological point of view, the difference is between enslaving a "just a tool" AI "agent" and an actual human "slave".

We also build fake relationships with random 2D characters, so it's not like it's entirely absurd, is it?

@karolherbst It's neurological in the sense that there has always been this huge gap between technically literate and non-literate people.

People who used to blindly believe whatever they read on the internet could still be reached in some sane way in the pre-LLM era. But this has gotten worse with chatbots, because they are built to converse easily and can readily feed into the fantasies of the delusional. I think a lot of us in helpdesk can relate to being confronted by stubborn users just because "ChatGPT said...". And that's just the tip of the iceberg.

As painful as it is, my take is that the technically literate ones should help, one person at a time and when they can, as they always have. 404 Media happened to post on the same topic yesterday; it's a good read (though you will need to sign up to read the piece).

https://www.404media.co/ai-psychosis-help-gemini-chatgpt-claude-chatbot-delusions/

How to Talk to Someone Experiencing 'AI Psychosis'

Mental health experts say identifying when someone is in need of help is the first step — and approaching them with careful compassion is the hardest, most essential part that follows.

404 Media