I often discuss in therapy the problems we face in #FOSS w/ #LLM-backed #AI (no surprise there).

My therapist told me today that one of her colleagues, who was early in focusing on LGBTQIA+ therapy, is ending their 20-year practice. In their top 3 reasons? AI.

My therapist also noted that she appreciates that I'm now one of her few patients who doesn't come to her with “Well, I asked AI & it said…” slop.

LLMs may have value for medical uses, but warmed-over Eliza does not a therapist make.

My therapist had not heard of Eliza. When I explained, she immediately pointed out the trick: mirroring what someone is saying is powerfully validating. It's a tool therapists use to make patients comfortable and feel heard, and it helps build rapport.

But again, it is not in itself therapy.

I suppose some might conclude we are better off with BigTech as therapists rather than Real Humans, & we probably aren't far from USA health insurers refusing to cover human therapy.
We should resist.

@bkuhn Tangential, but striking to me: if a person takes the time to reflect something back, even imperfectly, the understanding is real. For AI it's the opposite.

@ptvirgo

Yup.

#BigTech is ready to sell us EaaS: #Empathy as a service.

My bigger worry is all these #LLM-backed #AI “therapy solutions” surely have nasty terms of service that will curtail the class action lawsuits that should follow when we figure out how much harm they've caused patients.

@bkuhn
Like everything else an LLM does, it superficially resembles what humans do, and non-experts may be unable to tell the difference directly, but it is ultimately hollow.

@bkuhn Slightly outdated, but I often think about this article from a couple of years back, likening LLMs to fortune telling (which is also basically unlicensed therapy).

https://softwarecrisis.dev/letters/llmentalist/

The details have shifted a bit since they wrote it, but the core idea about asking the audience to carry the real load while faking depth certainly hasn't.
