Just discovered part of my HRT has been discontinued due to an AI note taking error. Gynae did not say discontinue, and I have letters to prove it. Now I may have to be without testosterone until this is proven, which may mean another 6 month wait to resee a gynaecologist via NHS. Hopefully my letters will suffice, but currently unclear.

I am beyond furious. Also, wow. Fuck living in this dystopia.

Keep every medical record you get, you're going to need it in this new terrifying timeline.

@Thayer JFC I am *so* far past AI replacing people at this stage. Smash the fucking looms.

@rgarner  

This further strengthens my resolve to never say yes to AI note taking. They really pressure you at this point at the doctors I've gone to. My mom let them do it, and dutifully checked the notes later, where she saw that the AI notes diagnosed her with something that the doctor explicitly said she did NOT have.

FFS

@Thayer I'm sorry that's happening to you. What a shitshow. :(

@alisynthesis @rgarner I didn't consent to it. GPs just use it these days. This is an AI summary of a letter that the gynae sent to both me and the doctor. Thankfully I keep all my records in hard copy due to deep medical mistrust, and also for my own reference. Which seems to have worked out on this occasion!
@Thayer @alisynthesis I'm going to start saying "I'm recording this" the moment AI gets involved at this stage
@rgarner @Thayer I'm just going to get myself seven "I DO NOT CONSENT TO BEING RECORDED" shirts and wear them everywhere at all times
@alisynthesis @rgarner sounds like a plan honestly
@rgarner @Thayer @alisynthesis with how puritanical AI models are, going shirtless would likely cause them to refuse to process imagery of you.
@tedmielczarek @rgarner @alisynthesis trudat, I could go in with my head under my arm and it'll be fine, but boobs out IS RIGHT OUT. Not a nipple!!! Saveeeee usssssssss
@rgarner @Thayer @alisynthesis I’d do it with a 90s dictaphone…basically anything but your phone. Sounds a bit far, but…
@alisynthesis @TheWolfOfSouthEnd @Thayer please tell me we're not going to have to bring back the "I use my dictaphone" gag...
@rgarner @alisynthesis @Thayer lol. I can’t even remember that. But, probably…

@Thayer oh wow! I thought they had to ask if you consent to have the visit recorded? (This is the practice in my part of the US...usually the US has the least consumer-friendly practices on this front.)

That's so fucked up that they just...use it. Being recorded in the doctor's office is very invasive in and of itself!

(tiny edit for clarity)

@rgarner

@alisynthesis @rgarner wasn't a visit, it was a letter. I'm in the UK. I assume they use it in the practice for admin like this.

@Thayer Ah, sorry, I misunderstood. Either way, what a shitshow. :(

@rgarner

@Thayer @alisynthesis @rgarner Yup. For some reason my Dr. releases literally all of my records into the provider portal, and before my last consult there was a note from an "AI prescription evaluator" deciding that I was worthy of not having an aneurysm or losing some toes (lisinopril and metformin).
@Thayer oh ffs <virtual hugs>

@Thayer

Wow, that's really bad. Thanks for sharing.

Now I'm thinking about this happening with people who aren't knowledgeable about their meds, and maybe won't even notice.

@unchartedworlds exactly that, hence always keep notes. I'm lucky (ha!) that my distrust of the medical profession means I keep them anyway. But yeah. Not good at all.
@Thayer Hugs. All things crossed you can get this sorted quickly

@Thayer

I’m not sure if this will help, but the same NHS guidelines that require medical practitioners to use this AI transcription nonsense also require them to accept personal liability for all errors in the transcription. If you remind the doctor of this and give them the opportunity to correct it, they may choose to do so.

@david_chisnall oh that's very good advice thank you. I'm going to work out all this tomorrow when I'm less fuming so I appreciate this.
@Thayer Please write to your MP about this.
@tim @Thayer also write to your GP - either the GP or you can file a Yellow Card report on the software with the MHRA, and this is crucial for measuring performance in the field. Medication error is a classic and not-infrequent issue.
@Thayer @david_chisnall If they decline to fix it urgently and without good reason then the next magic words are "professional body".
@Thayer Ouch. I hope you are able to get this sorted soon
@Thayer hmm, is that something you can fast track via going through data-protection stuff?
@Thayer ahhh that sucks and I am now vicariously feeling that "suffering at the hands of stupid bureaucracy with no simple recourse despite their mistake having real consequences for you" feeling, which I last felt a week ago when the government made schools ban mobile devices (despite my child having dyslexia+ADHD coping mechanisms that rely on her phone) 🤬
@Thayer god, what the actual fuck. I'm so sorry. Hope it won't take too long to fix this. What a mess. I hate this world.
@sophteis thanks appreciate it. The thing itself is an annoyance, but the fact this can happen is the real rage bait for me.
@Thayer jfc - I’ve seen meeting transcripts at work capture the exact opposite point, but that has no impact on anything important. Tantamount to negligence, surely?
@Thayer Oh, fuck everything about this. I’m so sorry.

@Thayer And just wait until the AI companies have to start taking in state money to cover for their losses, which is when these stop being mistakes and become politically deliberate skews of the results...

Total dystopia just starting.

@Thayer that’s terrifying. How stressful for you
@Thayer oh that's horrifying
@Thayer Good grief. So glad that you a) checked it, and b) have the documents to argue the correction. I wonder how many people have had this sort of thing happen and not noticed or were not in a position to get it fixed.
@Thayer but it looks as though it was labelled as ‘AI’? It would be a lot scarier if it wasn’t marked as such.
@wikitect it's scary enough for me as it is tbh. Fwiw I did get an email from a Dr last year that was clearly AI and not labelled as such, so we're probably already there.
@Thayer You might be able to see the practice manager of whoever generated this a lot sooner than six months and get it corrected that way.
@Thayer Wow, that's so dumb and frustrating. Hope an actual human with sense can straighten things out for you.
@sanityinc I have little hope in the medical profession but aye, me too! Fingers crossed for tomorrow. Thank you!
@Thayer this shit is gonna kill people. I'm so sorry this is something you have to deal with now. 😭
@stylus yeah it really is. Thanks appreciate that. Hopefully some of the fight I'm about to bring will help others down the line.
@Thayer
Since it's the NHS, may I suggest PALS? Wrong records need to be corrected.
@ancientsounds thanks I'll look into that
@Thayer this shit shouldn't be happening, but really, is it not human error? It's not necessarily the AI that is bad; sometimes even human notetakers get stuff wrong. It's up to the doctor or nurse to verify the information the notetaker took down. Again, not trying to minimize your situation at all, but what's dystopian is needing to wait for healthcare and the doctor not verifying orders, not necessarily the AI's involvement.
@Thayer This is exactly why I am so resistant to the use of generative AI in any form in my practice. I'm sorry this has happened and I hope that the person responsible is accessible (try their secretary rather than waiting to go through GP) and willing to rectify the matter promptly; if not I would suggest immediately contacting the complaints department of whichever organisation is responsible for the clinic.
@Thayer holy shit, I would be beyond furious. That would result in me being bedridden again until resolved, that is *not* a little issue. -.-

@Thayer

I doubt this is an error, tbh.

The AI bros hate trans people. The UK government hates trans people. I would put good money on this being deliberate programming on the part of someone trying to fuck people over.

@TinyGamerTris I'm peri not trans but they hate us too so maybe!