My work #EMR now has integrated #AI that summarizes a patient's chart whether I want it to or not. This week it told me the wrong reason for admission, the wrong hospital course, and the wrong medications, compared against the human-written discharge summary. Reviewing it and finding the errors took 3 minutes; documenting and reporting them took another 10.

Anchoring bias exists. What we read stays with us, truth or lie, influencing decisions.

And I can't turn it off.

#LawsuitBait

@jeneralist and how many of your coworkers and nurses, overworked and underresourced, are just reading the summaries without verifying?
@jeneralist I think everyone learns this when a loved one goes into the hospital, but for me most recently it was my grandmother: a family member absolutely has to be there, and be hyper vigilant. The staff simply does not have the time to care about details, and they don't communicate. Someone can swear up and down that they've written something in the chart and either they lied or the night shift didn't read it (or in your case, read a slop "summary" that left out what was important). You won't get good care without a patient advocate.
@aburka I was hospitalized for weeks as a child, and a parent was with me 24/7 except for bathroom and food breaks. The hospital set up beds for patients' parents in the solarium, so that they'd still be close even if they couldn't sleep in the room with their kids. (Long enough ago that the hospital had a sunroom!) Eventually I realized that was about more than just keeping me company.