So-called 'crime predicting' tech automates racism in the UK criminal justice system.

Even the police have admitted it's biased, since it's built on flawed police data 🤷

We need to stop throwing good money after bad.

We need an outright ban on its use.

https://www.theguardian.com/technology/2026/feb/24/police-ai-chief-admits-crime-fighting-tech-will-have-bias-but-vows-to-tackle-it

#SafetyNotSurveillance #precrime #AI #policing #automatedinjustice #ukpolitics #ukpol

Police AI chief admits crime-fighting tech will have bias but vows to tackle it

Exclusive: NCA’s Alex Murray says he hopes new £115m police AI centre can limit unfairness found in tools

The Guardian

We have a right to be presumed innocent. Not predicted guilty.

AI that claims to 'predict' crime will only replicate the historic discrimination and abuse already experienced by over-policed communities.

Sign the petition to ban it in the UK ⬇️

https://you.38degrees.org.uk/petitions/ban-crime-predicting-police-tech

#SafetyNotSurveillance #precrime #AI #policing #automatedinjustice #ukpolitics #ukpol

Ban ‘Crime Predicting’ Police Tech

The Lie

AI and police tech don't predict crime - they predict policing. These technologies are built on existing, flawed police data. So communities who have historically been over-policed are more likely to be identified as at 'risk' of future criminal behaviour. This leads to more racist policing and more surveillance, particularly for Black and racialised communities, lower-income communities and migrant communities. Instead of making us safer, this tech leads to:
• Over-policing
•...

38 Degrees