ORG called for unjust 'crime-predicting' tech to be banned from UK policing 🚨

Police tech doesn’t predict crime. It predicts policing. Based on biased data, it brings more of the same – racist policing and poverty punishment.

Find out more ➡️ https://www.openrightsgroup.org/campaign/resist-pre-crime/

#ORG2025 #digitalrights #PreCrime #BanCrimePredictingTech #policing #police #ukpolitics #ukpol

ORG and the Safety Not Surveillance coalition hit the streets of London.

We sent the message that so-called 'crime-predicting' tech must be BANNED!

Because we have the right to be presumed innocent, not predicted guilty.

Read more ➡️ https://www.openrightsgroup.org/blog/why-predictive-policing-must-be-banned

Watch our video ➡️ https://peertube.openrightsgroup.org/w/5wPH31FDt7Lzm17jZ7Musk

#ORG2025 #digitalrights #PreCrime #BanCrimePredictingTech #policing #police #ukpolitics #ukpol

Why ‘Predictive’ Policing Must be Banned

The UK Government is trying to use algorithms to predict which people are most likely to become killers, using the sensitive personal data of hundreds of thousands of people.

Open Rights Group

@openrightsgroup

The #ukgovernment should apply the prediction to which of them will be corrupt, then lock themselves up.