We have a right to be presumed innocent. Not predicted guilty.

AI that claims to 'predict' crime will only replicate the historic discrimination and abuse already experienced by over-policed communities.

Sign the petition to ban it in the UK ⬇️

https://you.38degrees.org.uk/petitions/ban-crime-predicting-police-tech

#SafetyNotSurveillance #precrime #AI #policing #automatedinjustice #ukpolitics #ukpol

Ban ‘Crime Predicting’ Police Tech

The Lie  AI and police tech don’t predict crime - they predict policing. These technologies are built on existing, flawed, police data. So communities who have historically been over policed are more likely to be identified as at ‘risk’ of future criminal behaviour. This leads to more racist policing and more surveillance, particularly for Black and racialised communities, lower income communities and migrant communities. Instead of making us safer, this tech leads to: • Over-policing •...

38 Degrees

So-called 'crime predicting' tech automates racism in the UK criminal justice system.

Even the police have admitted that it's biased, because it's built on flawed police data 🤷

We need to stop throwing good money after bad.

We need an outright ban on its use.

https://www.theguardian.com/technology/2026/feb/24/police-ai-chief-admits-crime-fighting-tech-will-have-bias-but-vows-to-tackle-it

#SafetyNotSurveillance #precrime #AI #policing #automatedinjustice #ukpolitics #ukpol

Police AI chief admits crime-fighting tech will have bias but vows to tackle it

Exclusive: NCA’s Alex Murray says he hopes new £115m police AI centre can limit unfairness found in tools

The Guardian

Facial recognition is biased. The police know it. The UK Home Office knows it. And they don't care.

This dangerous, intrusive tech produces more false positives for women, young people and members of ethnic minority groups.

We need Parliamentary scrutiny now!

https://www.theguardian.com/technology/2025/dec/10/police-facial-recognition-technology-bias

#facialrecognition #surveillance #policing #privacy #ukpolitics #ukpol #safetynotsurveillance

UK police forces lobbied to use biased facial recognition technology

Exclusive: System more likely to suggest incorrect matches for images of women and Black people

The Guardian

Facial recognition not only harms people right now, but feeds into 'crime-predicting' tech that turbocharges existing bias.

Sign our petition to ban Predictive Policing in the UK to push back on this dystopian vision of a fully monitored Pre-Crime future.

Act now ⬇️

https://you.38degrees.org.uk/petitions/ban-crime-predicting-police-tech

#facialrecognition #surveillance #policing #privacy #ukpolitics #ukpol #precrime #bancrimepredictingtech #safetynotsurveillance #predictivepolicing


Police tech doesn't predict crime. It predicts policing.

Fed on flawed police data, so-called 'crime-predicting' tech automates racism and discrimination against overpoliced communities.

The Safety Not Surveillance coalition is spreading the word: this tech must be banned!

Watch now ➡️ https://peertube.openrightsgroup.org/w/5wPH31FDt7Lzm17jZ7Musk

#SafetyNotSurveillance #BanCrimePredictingTech #police #policing #ukpolitics #ukpol #precrime #surveillance #justice

Safety Not Surveillance DigiVan

PeerTube

Last week the Safety Not Surveillance coalition hit the streets of London.

We sent the message that so-called 'crime-predicting' tech must be BANNED!

Because we have the right to be presumed innocent, not predicted guilty.

This tech fuels racism, enables unchecked police power and threatens all our rights. Ban it!

Sign up for campaign updates ➡️ https://action.openrightsgroup.org/join-fight-against-crime-predicting-tech

#SafetyNotSurveillance #BanCrimePredictingTech #PreCrime #police #policing #justice #ukpol #ukpolitics #criminaljustice

This summit promised “safer communities”.

But it delivered something else: speed-dating with surveillance vendors, complete with a visit from the Home Secretary talking up artificial intelligence as the future of policing.

“Crime predicting” technology risks creating mass human rights violations.

We deserve real safety, not more surveillance.

Sign up for campaign updates ⬇️

https://action.openrightsgroup.org/join-fight-against-crime-predicting-tech

#SafetyNotSurveillance #PreCrime #police #ukpolitics #ukpol #BanCrimePredictingTech #policing

Join the fight against crime-predicting tech

Why is this important? The police say AI will make us safer. The truth? It automates injustice and racism. Predictive policing tools don’t predict crime — they predict policing. Built on flawed police data, they target the same communities that have always been over-policed: Black and racialised people, low-income neighbourhoods, and migrants. This leads to: • Over-policing • Unjust stop and searches • Harassment, handcuffing, and use of force against targeted people We all deserve safety, not surveillance.

Open Rights Group

We went outside the national police summit (UK) and this is what happened...

When we tried to ask delegates to talk on camera, no one wanted to.

Why? Because constant monitoring doesn’t feel safe. It feels intrusive 🤷‍♂️

Watch now ⬇️ https://peertube.openrightsgroup.org/w/3e5MGaPnhRWUJZnCcrghQv

#SafetyNotSurveillance #PreCrime #BanCrimePredictingTech #police #policing #ukpolitics #ukpol #spycops #AI

Safety Not Surveillance Vox Pops

PeerTube