I sometimes wonder how people could cope with the fact that they actually enslaved people, but then I remember that they merely treated them as "tools". "Just a tool", they'd say when talking about them.

I wonder how big the difference is, from a neurological pov, between enslaving a "just a tool" AI "agent" and an actual human "slave".

We also build fake relationships with random 2D characters, so it's not like it's entirely absurd, is it?

Dunno... maybe it's just me, but I find it interesting to wonder what it does to people.

Like, people actually got emotionally shattered when their AI "friend" all of a sudden changed character due to a model update.

And I'm curious how much sincere emotion an AI agent could trigger in humans. Like, I don't believe for a second that people "fall" for AI agents because they think they have AGI in front of them, but rather because it feels human enough to the brain.

@karolherbst I don’t think people “fall in love” with LLMs because they’re so much like a human, but because they act nothing like a human and these people are just that emotionally stunted. They do just want a slave. They want to always be validated and coddled and catered to without having to give anything back, or introspect, or consider they might be wrong, or do anything it takes to form a genuine connection with a real person
@karolherbst it’s the same as people who have children because they’re lonely and want a best friend, or because they want a legacy or whatever. They want to own another person. They don’t want a genuine connection