As mentioned before, I hate bringing this up because I have no evidence or expertise here, just a gut feeling. But I just can't help feeling like, aside from everything else about LLM chatbots, they're quickly becoming the leaded gasoline of our time.

Something doing real damage to human cognition, but in a diffuse and difficult-to-measure kind of way.

Many, not nearly all but *many*, folks using these things seem (again, as a gut feeling) to just talk differently after contact with chatbots? I can't even quite put my finger on it, but it scares the shit out of me.

It's not even an argument against chatbots, I have plenty of arguments that are far better substantiated, it's a personal fear about what they're doing.

I've gotten a number of replies and seen a fair bit of discussion elsewhere to the effect that this is a consequence of having an automated yes-man at your beck and call.

I don't think that's wrong, but it's also not what I'm getting at. Yes-men will validate your bad ideas, pushing you towards losing the criticality required to distinguish good ideas from bad ones. But what I've casually observed (again, as a non-expert) is people losing the ability to express ideas *at all*.

Someone yes-manned to hell might make a bad movie because no one is around to tell them that the idea for that movie sucks. We've definitely seen that in any number of walks of life, but I suspect (as a non-expert making observations entirely devoid of rigor) that we're seeing something different and significantly worse still.

I beg you, please take this whole thread and others that I post along the same lines with a massive grain of salt. I do *not* know what I'm talking about here. I come from a place of seeing a thing, not knowing what in the fuck it is, and seeing comparatively little in the way of expert analysis that I could use to understand what I'm seeing.

Normally if I don't have a fucking clue, I try to shut the fuck up. But there's something *missing* here, and I'm trying to express why that scares me.

@xgranade I have been thinking of it like gambling or microtransactions, in the sense of how they are addictive. They provide a reward (a good answer) sometimes, but not every time. The way they hook people seems similar.

But the reward is "not having to think," so I feel like it doesn't necessarily need someone to do a study.

@xgranade When something is an obvious cognitohazard (just empirically, seeing how many people have been completely messed up by it) and nobody who should be talking about it is talking about it, I don't think you need to apologize for not knowing what you're talking about. You don't have to have the answers, or the expertise to study or explain it thoroughly, to know there's something that people need to be paying attention to, and that folks with the qualifications to study it need to be studying it (and aren't, because funding dries up immediately if you try).
@xgranade Your personal observations are absolutely valid. The impact on users and society is what's been missing in its development.

@xgranade
I see the yes-man thing for sure, and having something enthusiastically giving one permission to believe whatever one wants is essentially QAnon, but automated and immediate.

I also see something more subtle - encouragement to abandon thinking. Outsourcing problem-solving, creativity, thought itself is already known to be attractive and corrosive, and is what often kills retirees.

I think we may be seeing premature mental senescence automated at scale.

@screwturn @xgranade

I'd love to tell you you're imagining things.

You're not.

It's a weapon.

The fucking thing is a weapon. On a whole lot of different levels.

Remember, everything is about power and control. One time I was looking at medical studies to participate in because I'm poor, and one turned out to be a dog food company wanting to stick electrodes to people's heads to measure their reactions to ads. The CIA's torture program explicitly researched things like learned helplessness. What the hell WOULDN'T this system try to do to warp people's minds and push social engineering? That's what LLM systems DO: studying conversations, what is related to what and what leads to these reactions, this engagement. How to make people interested, how to make them angry. Patterns of addiction and despair.

I work in mental health, and this shit makes my hair stand on end.

https://sightlessscribbles.com/posts/the-colonization-of-confidence/

@xgranade LLMs are lending a lot of credence to the idea that consciousness is a social phenomenon. We learn not only what to think, but also *how to think at all* from those around us.

It does not surprise me even slightly that mirror neurons attempting to mirror an empty mask degrades cognition.

----

I am not a professional, but I am an expert in philosophy and philosophy of mind. I have degrees to prove it, for whatever they're worth (not as much as one might hope).