Algorithmic bias could “cost people’s lives”: The AI designed to prevent veteran suicide prioritizes White men and ignores survivors of sexual violence.

With @fullerproject: https://themarkup.org/news/2024/05/30/v-a-uses-a-suicide-prevention-algorithm-to-decide-who-gets-extra-help-it-favors-white-men

V.A. Uses a Suicide Prevention Algorithm To Decide Who Gets Extra Help. It Favors White Men. – The Markup

@themarkup They thought they could design a system that didn't have the biases they themselves had?

You fix that by pulling in as many perspectives as possible, not by the same white dude designing it twice as hard.

@themarkup @fullerproject Remember, AI reflects the data it's fed: if the data is biased, the AI will be biased.
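That reply's point can be shown with a toy sketch (the groups, numbers, and "model" here are invented for illustration, not from the article): a system that simply learns historical flag rates per group will reproduce whatever disparity exists in its training records.

```python
from collections import Counter

# Hypothetical records: (group, was_flagged_for_extra_help) pairs.
# The historical data over-flags group "A" and under-flags group "B".
records = [("A", 1)] * 80 + [("A", 0)] * 20 + [("B", 1)] * 20 + [("B", 0)] * 80

def train(data):
    """A naive 'model' that just learns the historical flag rate per group."""
    totals, flags = Counter(), Counter()
    for group, label in data:
        totals[group] += 1
        flags[group] += label
    return {g: flags[g] / totals[g] for g in totals}

model = train(records)
print(model)  # {'A': 0.8, 'B': 0.2} -- the bias in the data becomes the model
```

Nothing in the training step "knows" about fairness; the skew comes straight from the records, which is why auditing the input data matters as much as auditing the algorithm.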