Something a bit worrying to note about using AI in healthcare.

I've had two specialist appointments recently, both using AI to transcribe. Both sent report letters with inaccuracies about my diagnoses and past medical history. Even my GP was like, "huh, that directly contradicts what I put in the referrals."

I have followed up on both and requested amendments (which were made), but if I hadn't, these inaccuracies could have significantly damaged ongoing care, further treatment, or insurance claims.

Human error has always been a factor, but both doctors were clearly using the AI software and assuming what it spat out was correct. They made no other notes during the appointments to cross-reference and double-check. This is how Very Bad Things can happen.

@bloodflowersburning 🙄 yeah, seeing a lot of this too. I worked in healthcare data (credentialing) and just lost my job to an AI thing that does it, with a team, supposedly. Hate the way all of this is going. It's weird and dangerous
@jake4480 I'm really sorry to hear that.
I used to work for a translation service that supported disabled people. It's being gradually nudged out by AI services that absolutely cannot do what a human does with accuracy. They frequently translate things incorrectly, or in ways that make the information more confusing. But it's cheaper than human labour, eh.