NEW: Every year, governments use algorithms to flag people receiving welfare benefits as being at "high risk" of committing fraud. Today, for the first time, a joint investigation by Lighthouse Reports and WIRED can reveal how one of these algorithms works. We obtained the full algorithm code and the training data and recreated the system. What we found was discrimination based on gender and ethnicity. Part 1 is here: https://www.wired.com/story/welfare-state-algorithms/
Inside the Suspicion Machine

Obscure government algorithms are making life-changing decisions about millions of people around the world. Here, for the first time, we reveal how one of these systems works.

WIRED

@couts

Wow, super-cool piece covering the enraging workings of software that makes decisions about people's real lives. The algorithm discriminates, but in opaque ways, so it's hard to find recourse against this "suspicion machine".

It is an example of what Cathy O'Neil calls "Weapons of Math Destruction".

https://www.wired.com/story/welfare-state-algorithms/

@CelloMomOnCars @couts "Weapons of Math Destruction" is perfect! #AI #AIhype

@erchanda @couts

I'm a big fan of this book. O'Neil makes the concepts around the misuse of Big Data accessible; she's funny, too, and makes you laugh even as you're boiling inside. Worth a read, imo.

Algorithms are used everywhere, from teacher evaluations to policing to college admissions, and on and on.