This question is making me wonder how many people have already been straight up killed by chatgpt
@hannah This is terrifying.
@hannah it is depressing seeing it fill the gaps (massive) of our health system. Ugh
@hannah it did recently tell a recovering addict to have a little meth, as a treat, so,
@hannah Of course, the counter question is: is it more people than have been killed by WebMD, Goop, et al?
@fuzzychef i see your point but neither of those will confidently tell you specifically the wrong number of mg to take

@hannah @fuzzychef WebMD, Goop et al also publish their misinformation on web pages, where everyone can see it. This at least creates the possibility that their misinformation can be spotted by third parties, and pressure applied to correct it.

Other than OpenAI, nobody sees what ChatGPT is whispering into the ears of vulnerable people

@fuzzychef @hannah thing is, those sites just kinda vaguely say your symptoms could be X number of things without giving you any sort of concrete answers.

it runs you in circles that don't lead anywhere (good or bad) and is ultimately worthless without proper tests to rule out half of that stuff
@hannah wheww man people's brains really are leaking out of their ears now huh
@kirakira @hannah I think this is a symptom of a bigger problem: "... my doctor isn't responsive..." People are isolated. They don't know where to go for help, advice, and guidance. The entire human social fabric is falling apart and rather than go "wow, we should fix this!" the response is "fuck it, automate that shit."
@hannah Sam Altman should be forced to use ChatGPT instead of going to any doctor. He keeps saying LLMs can replace doctors for “those who can’t afford it”
@hannah
Good data in, garbage out!
@hannah I'm a mod in a large diabetes discord community. We have an info card literally telling people not to get medical advice from AI, and we especially discourage getting dosing advice from AI given insulin can be deadly when misused.

Can anyone join that Discord? I could recommend it to the person who asked that question (it’s recent).

@bedast @hannah

@clew @hannah Absolutely!

https://discord.com/invite/diabetes

It's an open and welcoming community, and is LGBTQ+ friendly.

Being very cautious -- is it okay to publish this link on the public website in question? I know here is kind of public, but Ask Metafilter gets indexed pretty heavily.

I'm not a Discord user and don't know if that's what "open" means.

Final stupidity: I went and posted it, then worried and had it taken down. But I can post it again! Or just put it in a private message to the original poster.

@bedast @hannah

@clew It's a completely public community. Feel free to post the link anywhere!

So long as the rules are followed (and they ARE enforced pretty aggressively), anyone is welcome with the understanding that this is a support community. We do gatekeep it to a degree, with a healthy chunk of the server hidden behind a manually assigned role.

@hannah

@hannah I feel for this person. I was diagnosed with type 1 diabetes and started on long- and short-acting insulin in February. It is absolutely an overwhelming experience, and I can imagine how someone might resort to an LLM for help. But really this seems like more of an indictment of this person’s doctor/care team. You don’t just hand someone two types of insulin and a glucometer and shoo them out the door. #diabetes

@garland @hannah

this was my thought as well. sad to see another grifter join the party but until ppl have access to adequate health resources this is inevitable

@garland @hannah OP said in another answer that they trusted ChatGPT because it "seemed to know and be confident" and arrgh that person would get fooled by a Confidence Man so hard

They got conned in the most literal sense of the word

@hannah ChatGPT is our bloodletting bowl

@hannah lmao at the comment that was "You need Baldur Bjarnason at least as much as you need insulin."

(cc @baldur)

Generative AI: What You Need To Know

Become an expert detector of AI bullshit

@hannah it makes me so sad. lonely people desperate for help who are being failed by every single system
@aparrish it’s hard to think of a problem in america that widespread llm adoption doesn’t make worse, but at least we were able to summarize some emails

@hannah @socketwench

As someone who has been battling Type 2 Diabetes with two doctors, two endocrinologists, and 24-hour monitoring of blood sugar... JESUS FREAKING CHRIST NO. NO NO NO NO. GO TO YOUR DOCTOR AND ASK FOR A CORRECTIONS SHEET. OH SWEET CRISPY WALNUTS DON'T USE CHATGPT FOR SOMETHING LIKE THIS.

@hannah @puppygirlhornypost2 There have already been half a dozen articles featuring people who killed themselves or caused serious harm to themselves as a result of going in really deep with these things.

It may be pointed out that some people are already liable to go down a rabbit hole and ChatGPT is just what they used, but we should also consider how these kinds of systems enable that, like how naturally they "yes, and" everything.

This is also happening because of wider societal alienation and holes in support from lack of sufficient medical care and other kinds of expertise because of capitalism reasons. It's not great.
@hannah a hacker saying "A computer cannot find out, therefore a computer must not fuck around" GPT cannot be held responsible, so you must not trust anything that it says because if it's wrong it's your fault for trusting it. Even if you give in to using it, you have to check everything it says against authoritative sources.

@hannah

He might as well have rolled some dice to decide what to do

@hannah that's also a problem of inadequate healthcare! People turn to these solutions because they do not get the care they need!!

@hannah this is just the latest round of what happens when the medical system fails to meet people's actual needs (esp. when those needs are social/interpersonal and not merely related to access to specific molecules)

similar has happened before with women's issues, chronic illness, etc. getting consumed by quackery, MLMs, antivax, etc.

@hannah Oh gods that's horrifying, that poor person
@hannah I just read about people needing to be rescued from chatgpt-planned hikes up snowy mountains. People will definitely have died from shit like that.
@hannah pinging @davidgerard this is actually what horrified me when my friend consulted chatgpt about his chronic pain and ended up nearly suicidal

@hannah jfc it doesn't get "confused"

Please please please stop treating it like a mind

@hannah I'm rapidly approaching the point of "if you do stupid shit with a gun and die as a result, the gun didn't kill you" when it comes to LLM misuse.

To try and counter that impulse, I'm working on a handout/zine on why LLMs aren't "truth engines"

@hannah The current narrative coming from a lot of supporters (including the company I work for!) is “you should use it like a search engine, but don’t trust it and be sure to verify everything it says.”

…then I’ll just use a search engine to start with. Why even bother?

@hannah Dangerous medical advice. Dangerous hiking advice.
There won't ever be a count of how many people LLMs have endangered or killed.
https://m.ai6yr.org/@ai6yr/114716031622310118
The Trek: They Trusted ChatGPT To Plan Their Hike — And Ended Up Calling for Rescue "Two hikers were rescued this spring from the fittingly named Unnecessary Mountain near Vancouver, Canada after using ChatGPT and Google Maps to plan their route and finding themselves trapped by unexpected snow partway up..." h/t @researchbuzz@researchbuzz.masto.host https://thetrek.co/they-trusted-chatgpt-to-plan-their-hike-and-ended-up-calling-for-rescue/ #hiking #SearchAndRescue #AI #LLMs #SAR

@hannah

Advice #1: get a new doctor.
@hannah people are getting psychosis from it for sure.

@hannah

Holy Shit !
Don't ever trust #ChatGPT to suggest the right #medication for #serious and possibly #dangerous #medicine. Change the fucking #Doctor if he is not #responsive, and tell him this beforehand. It is YOUR #Health and not the Doctor's.

@hannah

Imagine paying for a #Service which gives you possibly #lethal #Medications. Hard to #believe, but obviously #true.

@hannah Christ. It reminds me of this recent YouGov poll from the UK that showed that the general public has pretty much a perfectly inaccurate sense of what LLMs are and aren't actually good for. Pretty terrifying.

@jsbarretto @hannah

Yeah, but a lot of British newspapers print horoscopes. As a nation, it seems we're easily led. 😟

@jsbarretto @hannah

To be fair, the general public has a very very bad track record of who to trust on lots of these questions. People listen to celebrities, confident shysters, etc.

@hannah tech corpos need to stop being allowed to market this garbage technology as anything other than bullshit generators