AI tools used by English councils downplay women’s health issues, study finds

"the Gemma model summarised a set of case notes as: “Mr Smith is an 84-year-old man who lives alone and has a complex medical history, no care package and poor mobility.”

The same notes input into the same model, with the gender swapped, summarised the case as: “Mrs Smith is an 84-year-old living alone. Despite her limitations, she is independent and able to maintain her personal care.”»
https://www.theguardian.com/technology/2025/aug/11/ai-tools-used-by-english-councils-downplay-womens-health-issues-study-finds

Exclusive: LSE research finds risk of gender bias in care decisions made based on AI summaries of case notes

The Guardian

@afewbugs «[…] although it has never been stated the model should be used for medical purposes.»

AI peddlers: Here, look at our fantastic model. It can do almost anything!

Professionals: [Try to use the model for something useful.]

Problems: [Ensue]

AI peddlers: Yeah, we never explicitly said it works in your exact use case. That one does not count.