TIL that saying "holy shit don't use ChatGPT for medical advice" is a "purity test". I didn't know that before. In fact, I still don't.

@davidgerard this one hit close to my heart because I've had two family members die in large part because their caretaker ignored medical advice and used awful alternative medicine information from the internet to try to treat them.

an LLM can't do critique. as you've said, truth is not a data type in an LLM. all of these models suck in every form of medical crankery available on the internet, mix it with words from authentic medical sources, and present the whole blend as credible.

@davidgerard I know that alternative medicine has a body count; I’ve seen it in the flesh. I know what some of the horseshit on the Internet can do if you’re very desperate or very trusting.

the LLM lowers the trust barrier because the crank information no longer comes out crank-flavored, but it's still dangerous as fuck to follow the advice.

I keep seeing LLMs presented as better than nothing, and that's wrong. I wish the people who needed help could get it, but the LLM is worse than nothing.

@zzt @davidgerard I'm pleased to inform you the body counters at http://whatstheharm.net are still online

edit: wow though, no https ... now that's what I call web 1.0

What's The Harm?

This is a list of topics in which we have found stories where a lack of critical thinking has caused unnecessary harm, death, injury, hospitalizations, major financial loss or other damages.