i went to a medical appointment the other day

the doctor asked if he could use AI during the appointment

i said 'no'

he said i was the only one who had refused

pro tip: if what you say to AI is shared with an insurance company, if what you said is not what you said because AI made stuff up, if what you said is revealed in a breach

guess what, your insurance company will use that against you forever and ever

just say no to AI

@samiamsam I have talked to a LOT of people IRL that think AI is some kind of great invention.

I talked to one person about its flaws and he was convinced that it will only get better. The idea being that software always gets better I guess…

I asked him “Is Microsoft Word better now than it was 20 years ago?”

@mls14 @samiamsam

Who can ever forget 'clippy' 😂

@maddad @mls14 @samiamsam It looks like you're trying to use AI. You won't need my help stuffing up.

@maddad @mls14 @samiamsam Clippy has been gone for more than 20 years…
(also, I really don't get the hate against it – it was easy to disable, and it seemed like a genuinely useful feature for new users)

@jernej__s @mls14 @samiamsam

I will never forget clippy,
it was always there when I needed it,
it comforted me when things were bad,
it boosted my spirits when I was sad,
it was a good friend to all.
it will live in my heart forever...😂

@mls14 @samiamsam Unreal how easily people forget the bad software from the previous 40+ years, like QuarkXPress...

@julescelt01 @mls14 @samiamsam

bad software from last week 😫

@julescelt01 @mls14 @samiamsam

I mean at this point it's just freaking constant

@cavyherd @mls14 @samiamsam oh and that's just so sad. It shouldn't be that way.

@julescelt01 @mls14 @samiamsam

Not to mention blindingly frustrating

@julescelt01 @mls14 @samiamsam

awh Quark Xpress awhh... I used to spend days dequarking documents

@mls14 @samiamsam

In fairness, MS Word from 30 years ago was great (on the Mac, at least). AND you could get a real live human on the phone for support.

That you could even talk to consistently enough to be on a first name basis.

@mls14 @samiamsam

Lots of people think AI is an invention (or even a thing) because so much is being spent on unrelenting PR; I'd imagine the lion's share of VC money is being spent on PR. The accountancy and management consultants are all promoting it as the next big thing because they just want their snouts in the trough. The whole "it's just a man behind a curtain, like the Wizard of Oz" thing is lost on people who just want to read forms and go home.

@samiamsam Exactly. However, most people either don’t understand, don’t want to be seen to be behind the times, or don’t want to say no. I’ve seen it time and time again with family members, old and young. It’s hard to know how to protect them from any of this.

When a medical practitioner asks me this question, the reply will be the most resounding HELL NO anyone in any medical centre has ever heard!

@samiamsam Might be time to ask if you even need to see the doctor, because you can just use Google AI if you want crap medical advice?

@samiamsam I had to refuse this as well, just the other day. It's so 'fun' being the doctor-cop all the time - Yes, I need you to mask, as I myself am masked. No, I do not wish my information shared with whomever provided the largest gift basket to the people who subscribed you to this AI service. I heard a funny story the other day about a Law that protects Health Information, but apparently it was just a delirious rumor.

@samiamsam watch closely, because one medical group near me doesn't ask; they simply put it in their terms of use that they may use it when they damn please.

@samiamsam I just said no to this yesterday. Doc asked to record the whole visit for ingestion into Epic.

Told me not to worry because “it’s protected under HIPAA.”

LOL. No.

@toddz @samiamsam Fundamentally, the whole thing is laughable, as well-established and protective copyright law has been absolutely futile thus far in fighting off rampant AI-based theft of original works.

Medical record data will absolutely be abused.

@toddz @samiamsam HIPAA isn’t what people think it is. It doesn’t guarantee privacy, just hands you a G-string of minimal coverage.

@samiamsam
Just wondering two things:

1. Would doctors here in Portugal use AI?

2. Would they ask your permission to use it? Is that an obligation for them where you live?

@samiamsam a doctor told me she was going to use AI for taking her notes. I told her I was uncomfortable with that and she smirked and used it anyway 😕

@samiamsam
I suggest being a bit more proactive.

At the last two medical appointments I went to, I asked at the beginning whether this office/this provider uses AI. In both cases the answer was no.

I then followed up with "I recommend that this office/this provider never use AI due to the inherent problems and unethical situations it creates."

Speak out. Don't let them define the agenda. Define it for them.

#Healthcare #Privacy #AI

@shansterable @samiamsam While this is a great approach, it's worth keeping in mind that there are very valid cases of "AI" usage in medicine - not chatbots, mind you, but, for example, machine-learning-based recognition of cancer in photos and X-ray images.

@shansterable @samiamsam I worked in healthcare for almost 30 years (USA). I helped implement HIPAA across a group of hospitals. A.I. may be useful in diagnosis, but the potential for leaking personal medical information is huge. I am glad I am out of the industry now. A.I. is going to change the industry, and much of the change is NOT going to be for the better.

@samiamsam If my doctor asked me that, I would find another doctor

@andrew_deridder

they all use it now in my city

@andrew_deridder

p.s. they used to use human scribes and that was fine, but this AI bullshit is bullshit

@andrew_deridder @samiamsam It’s usually not the doctor’s fault, though. Often it’s the medical organization that contracts with the scribe services.

@drahardja @andrew_deridder

yes, the doctor i saw works for a hospital system

@samiamsam

"Our records show you've had a hysterectomy, Mr Sam, so your request for a vasectomy is denied."

"pro tip: if what you say to AI is shared with an insurance company, if what you said is not what you said because AI made stuff up, if what you said is revealed in a breach

"guess what, your insurance company will use that against you forever and ever"

@samiamsam physician here. I use an AI scribe (which is HIPAA compliant). Refusing is totally fine, but don't complain about the progress note getting posted late, or orders put in getting scheduled later (people do that all the time) [edit for the concerned: no, these don't impact patient care].
@P__X @samiamsam Are you surprised people are so negative about it? My computer science lecturer back in the ‘80s said “rubbish in, rubbish out”, and that’s what I think of AI when you consider what they have scraped into it. I wouldn’t trust it but, sadly, I think people do.

@sarahejfraser @P__X

yep, in the US we said GIGO

garbage in garbage out

it still works, but especially for AI

@sarahejfraser @samiamsam

The input is definitely not garbage. The output has to be edited by the provider (just as a human scribe's entries would be), but it's generally OK for the HPI; I have to yell at it to record the exam, and the assessment/plan section is just a vague reminder that usually requires a full rewrite.

It is incredibly helpful though when something urgent hits and the provider gets swamped with the notes.

@P__X @samiamsam I have a question. Before AI, PP recently spent a lot of time entering notes during the visit. What did they do before? More support staff?

@gabbywheels @P__X

when i went through chemo 5 years ago my oncologist used 'scribes'

people who are in the room typing the conversation

@samiamsam @gabbywheels i've never worked in a clinic that used scribes. not a huge fan of the idea. some other departments used to have them. they had more money...

@gabbywheels @samiamsam

1/ Documentation burden was much lower. I've seen the evolution of this over the past 10-15 years. When we had paper charting and prescription pads, documentation was dangerously poor, but we could finish all paperwork before the end of the day. Now, say I saw 12 patients in a day and spent 8 hours with them; I'd need ~10-20 more minutes per patient to finish the note, get the electronic orders in, and do care coordination. And that's an extremely smooth day.

@gabbywheels @samiamsam

2/ A day when none of the 12 patients happens to be acutely sick (e.g., sent to the ER) and no outside interruption happens (like a patient not on the schedule getting sick) is extraordinarily rare. With the AI scribe, I can do a better exam, care-coordinate, place orders in real time, and finish the encounter with a summary for the patient. In other countries (with not necessarily covetable healthcare systems), medical assistants do just this, but not in the practices I've worked at in the US.

@gabbywheels @samiamsam

3/ What also changed is that notes are now not only available to patients by request, but are immediately released to their phones. Some complain that they have to wait a long time for it. Some complain that something's inaccurate (which often is correct IRL...).

The shift towards electronic health records has been shown to be a primary driver of physician burnout rates, which are the highest ever: https://pmc.ncbi.nlm.nih.gov/articles/PMC10134123/

Burnout Related to Electronic Health Record Use in Primary Care
@P__X @samiamsam Thx so much for your thoughtful reply. That makes sense.

@P__X @samiamsam HIPAA compliant - an AI? Unless it's running locally on your computer, it is almost certainly sending and storing the query with a third-party provider that the patient has not consented to share their medical data with.

@nf3xn @P__X

i work in HIM and HIPAA is only as good as the persons doing ROI

and AI is not a person and all that data goes somewhere

@nf3xn @samiamsam I share some of the skepticism, but yes, for AI to get approval to be used in healthcare it has to meet HIPAA requirements for storage, data security, access, etc., and unlike people's ChatGPT or Grok queries, it has actual legal consequences.