I often discuss in therapy the problems we face in #FOSS w/ #LLM-backed #AI (no surprise there).

My therapist told me today that one of her colleagues, who was early in focusing on LGBTQIA+ therapy, is ending their 20-year practice. In their top 3 reasons? AI.

My therapist also noted that she appreciates that I'm now one of her few patients who doesn't come to her with “Well, I asked AI & it said…” slop.

LLMs may have value for medical uses, but warmed-over Eliza does not a therapist make.

@bkuhn
> My therapist told me today that one of her colleagues, who was early in focusing on LGBTQIA+ therapy, is ending their 20-year practice. In their top 3 reasons? #AI.

Can you (or did your therapist) elaborate on the sequence there? I can imagine various ways "AI" might be blamed for ending a therapist's career, but what was the connection in this case?

@bignose

See my next reply in the thread for more.

As I understood it: basically, people are choosing to talk to bots and reporting that it helps, so they are going to human therapy less.

But it isn't therapy; it's a trick that LLMs pull on us all the time.

That's horrifying @bkuhn, given what we know of how LLMs operate and the absence of a mind there.

It's bad when programmers abdicate responsibility. It's worse when an LLM looks good only by comparison with how awful Google search has become. But seeking therapy from a random-sentence generator? I would not have imagined it.

That it has become so prevalent that a therapist decides their career may as well end? We are losing the institutions that can actually help people.