Police used AI facial recognition to wrongly arrest TN woman for crimes in ND

https://www.cnn.com/2026/03/29/us/angela-lipps-ai-facial-recognition

Police used AI facial recognition to arrest a Tennessee woman for crimes committed in a state she says she’s never visited

A Tennessee grandmother spent more than five months in jail after police used an AI facial recognition tool to link her to crimes committed in North Dakota – a state she says she’d never been to before.

CNN
AI is a liability issue waiting to happen, and this is just another example.
It's a tool. Used incorrectly, it will lead to errors, just like a hammer used incorrectly can hit the user's finger.

There is enormous variability in how hard a tool is to use correctly, how likely it is to go wrong, and how severe the consequences are. AI spans a wide range on all of those variables because its use cases vary far more widely than a hammer's.

The use case here is police facial recognition, not hitting nails. The parent wasn't saying "AI is a liability" with no context.

When somebody uses a tool to hurt somebody, they need to be held accountable. If I smack you with a hammer, that needs to be prosecuted. Using AI is no different.

The problem here is incidental to the tool: the harm was done by cops, and therefore nobody will be held accountable.

Dynamite is a tool. But we don't hand it out to anyone who wants to play with it.
We used to, until quite recently: anybody could buy dynamite at the hardware store. We had to end that because criminals were using it to hurt people.
Look for AI to follow a similar trajectory over time.
Yes, regulation is inevitable.
Impossible at this point. You cannot download dynamite; you can download AI.
AI feels closer to a firearm than a hammer when assessing law enforcement's ability to quickly do massive, unrecoverable harm.

Only one small problem: there is no way to tell whether you are using it "correctly".

The only way to be sure is to not use it.

Using it basically boils down to, "Do you feel lucky?".

The Fargo police didn't get lucky in this case. And now the liability kicks in.

This tool, however, is specifically built for mass surveillance. It serves no other purpose. The tool is broken, and everybody knows it. The tool makers are at least as guilty as those who use it.
The tool, like Google search, is likely biased toward always returning a result, regardless of its confidence in that result.
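
To make that concrete, here is a minimal sketch in Python of the difference between a matcher that always returns its nearest hit and one that refuses to answer below a confidence threshold. Everything in it (the function names, the 0.6 distance threshold, the random "embeddings") is made up for illustration; it is not any real vendor's API.

    import numpy as np

    # Hypothetical sketch of the failure mode described above. All names,
    # thresholds, and data are invented for illustration.

    def best_match_always(probe, gallery):
        """Return the closest gallery face, no matter how weak the match.
        Under this policy, somebody is always 'identified'."""
        distances = np.linalg.norm(gallery - probe, axis=1)
        return int(np.argmin(distances))

    def best_match_thresholded(probe, gallery, max_distance=0.6):
        """Return a match only if it clears a confidence threshold;
        otherwise admit that no reliable match exists."""
        distances = np.linalg.norm(gallery - probe, axis=1)
        idx = int(np.argmin(distances))
        return idx if distances[idx] <= max_distance else None

    rng = np.random.default_rng(0)
    gallery = rng.normal(size=(1000, 128))  # 1,000 hypothetical face embeddings
    probe = rng.normal(size=128)            # a face resembling nobody enrolled

    print(best_match_always(probe, gallery))       # always prints some index
    print(best_match_thresholded(probe, gallery))  # prints None: no real match

A system built like the first function hands investigators a "match" for every probe photo, even when the true person isn't in the gallery at all.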