ORG called for unjust 'crime-predicting' tech to be banned from UK policing 🚨

Police tech doesn’t predict crime. It predicts policing. Based on biased data, it brings more of the same – racist policing and poverty punishment.

Find out more ➡️ https://www.openrightsgroup.org/campaign/resist-pre-crime/

#ORG2025 #digitalrights #PreCrime #BanCrimePredictingTech #policing #police #ukpolitics #ukpol

ORG and the Safety Not Surveillance coalition hit the streets of London.

We sent the message that so-called 'crime-predicting' tech must be BANNED!

Because we have the right to be presumed innocent, not predicted guilty.

Read more ➡️ https://www.openrightsgroup.org/blog/why-predictive-policing-must-be-banned

Watch our video ➡️ https://peertube.openrightsgroup.org/w/5wPH31FDt7Lzm17jZ7Musk

#ORG2025 #digitalrights #PreCrime #BanCrimePredictingTech #policing #police #ukpolitics #ukpol

Why ‘Predictive’ Policing Must be Banned

The UK Government is trying to use algorithms to predict which people are most likely to become killers, using the sensitive personal data of hundreds of thousands of people.

Open Rights Group

The UK police claim AI and tech can ‘predict’ crimes 🎱

Our liberties mustn't be taken away in a biased game of probability that automates unjust stop and search, harassment and use of force against over-policed communities.

Sign and share our petition to BAN it ⬇️

https://you.38degrees.org.uk/petitions/ban-crime-predicting-police-tech

#ORG2025 #digitalrights #PreCrime #BanCrimePredictingTech #policing #police #ukpolitics #ukpol

Ban ‘Crime Predicting’ Police Tech

The Lie

AI and police tech don’t predict crime – they predict policing. These technologies are built on existing, flawed police data. So communities that have historically been over-policed are more likely to be identified as at ‘risk’ of future criminal behaviour. This leads to more racist policing and more surveillance, particularly for Black and racialised communities, lower-income communities and migrant communities. Instead of making us safer, this tech leads to: • Over-policing •...

38 Degrees
@openrightsgroup 'Who have we abused enough to make them possibly want to get high, steal something, or physically assault someone?'
@openrightsgroup scapegoating machine

@openrightsgroup

The #ukgovernment should apply the prediction to which of them will be corrupt, then lock themselves up.

@openrightsgroup I've seen some (reportedly) very effective "crime-predicting tech".

On a notice board in a private area of a police station there were some photos and bios of prolific offenders with their upcoming release dates. The police liked to keep an eye on them so as to be able to catch them and put them back inside before they'd done more than two or three new burglaries.