RE: https://mastodon.nz/@leighelse/116149727745113480

I worked on a large-scale project testing medical transcription (maybe one of the largest). Hundreds of doctors reviewed the output and called out the issues.

It was not, and still is not, ready. Public health teams that roll this out without red-teaming, remediation, feedback, and a way to influence the models' weights are being irresponsible.

@skinnylatte @aral My doctor uses it. My primary. Thankfully, she has struck me as someone who isn’t offloading responsibility to it or using it for anything other than review.

I’m well aware, though, that this kind of tool, inaccurate at best, gets relied on by people with less ability than her. :/

@josh
I have seen a take that it can dull a medical practitioner's diagnostic and analytical skills. A potential explanation is that summarising for notes is part of those processes. Maybe by refusing I am helping keep my GP's skills fresh? The emergency vet who used AI note-taking didn't do a great job diagnosing my dog's woes, but was also working with the handicap of not being able to access the dog's regular notes and having to take the owner's word for things.
@skinnylatte @aral
@RedRobyn @josh @aral Overreliance on these tools is also something we test for in this type of eval, and yes, we see it a lot in clinical AI.