Facial recognition technology (FRT) has led to yet another wrongful arrest in the UK. This time, a man was detained for hours after the technology matched him to a burglary 100 miles away, despite clear differences in appearance.

The police had retained an image of the man after he was attacked five years earlier. Even an innocent person’s face can remain searchable in police databases, potentially triggering future ‘matches’ and further scrutiny.

https://www.theguardian.com/technology/2026/feb/25/facial-recognition-error-prompts-police-to-arrest-asian-man-for-burglary-100-miles-away

Facial recognition error prompts police to arrest Asian man for burglary 100 miles away

Exclusive: Alvi Choudhury claiming damages against Thames Valley police after biased technology confused him with man looking ‘10 years younger’

The Guardian
A 2012 high court ruling found that holding the data of innocent people was unlawful, yet 14 years on, that seems to have been ignored. https://www.theguardian.com/uk-news/2017/feb/24/police-told-to-delete-on-request-images-of-innocent-people
Police told to delete on request millions of images of innocent people

Home secretary says any ‘unconvicted persons’ can request that police delete their images from national database

What’s more, FRT has repeatedly been shown to misidentify people, disproportionately impacting racialised communities.

When police treat a match like this as sufficient grounds for arrest, the presumption of innocence is weakened and due process is undermined.

This is not just a one-off either. It’s a consequence of deploying biased, opaque biometric systems at scale with little legislation governing their use.

The UK Home Office has just undertaken a public consultation on governing biometric technologies in policing, including FRT.

We hope this consultation leads to meaningful change that ensures cases like this do not happen again.