Police used AI facial recognition to wrongly arrest TN woman for crimes in ND

https://www.cnn.com/2026/03/29/us/angela-lipps-ai-facial-recognition

Police used AI facial recognition to arrest a Tennessee woman for crimes committed in a state she says she’s never visited

A Tennessee grandmother spent more than five months in jail after police used an AI facial recognition tool to link her to crimes committed in North Dakota – a state she says she’d never been to before.

CNN
AI is a liability issue waiting to happen, and this is just another example.
It's a tool, and a tool used incorrectly leads to errors, just as a hammer used incorrectly can hit the user's finger.

There's only one small problem: there is no way to tell whether you are using it "correctly".

The only way to be sure is to not use it.

Using it basically boils down to asking, "Do you feel lucky?"

The Fargo police didn't get lucky in this case, and now the liability kicks in.