Police used AI facial recognition to wrongly arrest TN woman for crimes in ND

https://www.cnn.com/2026/03/29/us/angela-lipps-ai-facial-recognition

Police used AI facial recognition to arrest a Tennessee woman for crimes committed in a state she says she’s never visited

A Tennessee grandmother spent more than five months in jail after police used an AI facial recognition tool to link her to crimes committed in North Dakota – a state she says she’d never been to before.

CNN
Now cruel people wield a two-tiered shield. It's not an accident that this happened to a woman, but make no mistake they are coming for men next.
You think they deliberately chose to do this to a woman? Why?

Probably just reading the room, with states like Texas making abortion illegal and letting random citizens enforce it.

Famously, abortion is a women's issue.

Anyway, looking through the facts, it's just some random woman. There's better evidence that these facial recognition systems are much worse on minorities than they are across genders.

Own-gender bias:
https://pmc.ncbi.nlm.nih.gov/articles/PMC11841357/

Racial bias:

https://mitsloan.mit.edu/ideas-made-to-matter/unmasking-bias...

Miss rates:

https://par.nsf.gov/servlets/purl/10358566

Although you can probably interpret the facts differently, we've seen how any search function gets enshittified: once people get used to searching for things, they tend to select the tool that returns results over the one that fails to return anything.

Rather than blaming themselves, users blame the search tool. As such, any search system will, over time, bias towards returning results (e.g., Outlook) rather than towards accuracy.

So if these systems more easily miss certain classes of people (women, minorities), those people are more likely to be surfaced as inaccurate matches, whereas men, matched with higher confidence, are more likely to be correctly screened out.
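To make the argument concrete, here's a toy sketch of that "returns results vs. fails to return results" behavior. All names, scores, and the 0.80 threshold are made up for illustration and aren't from any real system:

```python
# Toy model: a matcher that prefers surfacing *something* over returning
# nothing. Scores and the threshold are hypothetical.

def best_match(scores, threshold=0.80, always_return=False):
    """Return the highest-scoring candidate, or None if below threshold.

    With always_return=True the system surfaces its best guess even when
    no candidate clears the threshold -- the enshittified behavior that
    never comes back empty-handed.
    """
    name, score = max(scores.items(), key=lambda kv: kv[1])
    if score >= threshold or always_return:
        return name
    return None

# Hypothetical gallery scores for one probe photo. If the system scores
# a demographic group systematically lower, the true match can land
# below an innocent lookalike.
scores = {"true_match": 0.70, "innocent_lookalike": 0.74}

print(best_match(scores))                      # strict: returns None
print(best_match(scores, always_return=True))  # lenient: wrong person
```

A strict threshold fails closed (no match, investigate further); a system biased toward returning results fails open and hands the user a confident-looking false match.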

That's how I interpret this two-second comment.
