When the facial recognition algorithm says "you are guilty."
Sorry for the woman in Manchester who was wrongly accused of shoplifting and banned from shops despite being innocent. https://www.bbc.co.uk/news/articles/cdr510p7kymo #biometrics #news #surveillance
Facial recognition error sees woman accused of theft

An apparent mix-up with the technology led to Danielle Horan being wrongly accused of shoplifting.

BBC News
@JamesBaker this shop and the facial recognition firm are just guilty of libel aren't they?
@otfrom I think for libel it has to be a publication that defames someone. Maybe it would be slanderous if the manager said in front of other customers that they were a shoplifter? It seems like someone ought to have some redress or compensation for mistakes like this. Maybe there is something in losses arising from failures to abide by data protection accuracy requirements.
@JamesBaker if it is going into the firm's central database and then being sent to multiple shops, isn't that publishing?
@JamesBaker "facial recognition error" seems slightly off the mark. Facial recognition appears to have worked quite well, but her face was wrongly registered for the theft of products she actually bought. So that's another way this can go wrong.
@hllizi Yes, that’s a good point. I guess it depends on how you define the system. If it requires human involvement then, as with many systems, human error will be the problem, even if the actual tech is working.
@JamesBaker - Yes; Facial Recognition AI is creepy and Bull S**t to do this to her. Even the innocent get harmed.
@JamesBaker
Looks like it was the store staff that made the error, mistakenly reporting her as a shoplifter. The tech then did exactly what it was designed to do and flagged her when it spotted her.
@robert Yes, it just shows how a simple human error can label someone guilty under such a system.
@JamesBaker at this point Facewatch is like Clearview AI but much worse, and more inaccurate