We have a right to be presumed innocent. Not predicted guilty.

AI that claims to 'predict' crime will only replicate the historic discrimination and abuse already experienced by over-policed communities.

Sign the petition to ban it in the UK ⬇️

https://you.38degrees.org.uk/petitions/ban-crime-predicting-police-tech

#SafetyNotSurveillance #precrime #AI #policing #automatedinjustice #ukpolitics #ukpol

Ban ‘Crime Predicting’ Police Tech

The Lie: AI and police tech don’t predict crime - they predict policing. These technologies are built on existing, flawed police data, so communities that have historically been over-policed are more likely to be identified as at ‘risk’ of future criminal behaviour. This leads to more racist policing and more surveillance, particularly for Black and racialised communities, lower-income communities and migrant communities. Instead of making us safer, this tech leads to: • Over-policing •...

38 Degrees

So-called 'crime predicting' tech automates racism in the UK criminal justice system.

Even the police have admitted that it's biased, since it's built on flawed police data 🤷

We need to stop pouring good money after bad.

We need an outright ban on its use.

https://www.theguardian.com/technology/2026/feb/24/police-ai-chief-admits-crime-fighting-tech-will-have-bias-but-vows-to-tackle-it

#SafetyNotSurveillance #precrime #AI #policing #automatedinjustice #ukpolitics #ukpol

Police AI chief admits crime-fighting tech will have bias but vows to tackle it

Exclusive: NCA’s Alex Murray says he hopes new £115m police AI centre can limit unfairness found in tools

The Guardian

»#AutomatedInjustice – my report on algorithms in police work @algorithmwatch

#Kriminalitätsschwerpunkte or #Risikoprofile: in Germany, police & the justice system increasingly use algorithms to “predict” and “prevent” crimes«

https://algorithmwatch.org/de/predictive-policing-deutschland/

Automated police work: how algorithms in Germany are supposed to “foresee” crimes - AlgorithmWatch

Police, law enforcement agencies and prisons in Germany are increasingly trying to digitally “predict” and “prevent” crimes. The report “Automating Injustice” gives an overview of such algorithmic systems being developed and deployed in Germany.

AlgorithmWatch

"According to the ChatGPT narrative, humans as a species are responsible for climate change and specific economic activities or actors associated with carbon emissions play no role. Analogously, the social structuration of vulnerability to climate impacts and issues of climate justice are hardly addressed. ChatGPT’s narrative consists of de-politicized stories that are highly optimistic about technological progress."

#GenerativeGreenwashing
#AutomatedInjustice

http://link.springer.com/10.1007/s13280-024-01997-7

“In the end, the story of climate change was one of hope and redemption”: ChatGPT’s narrative on global warming - Ambio

AI chatbots such as ChatGPT help people produce texts. According to media reporting, these texts are also used for educational purposes. Thus, AI influences people’s knowledge and perception of current issues. This paper examines the narrative of ChatGPT's stories on climate change. Our explorative analysis reveals that ChatGPT’s stories on climate change show a relatively uniform structure and similar content. Generally, the narrative is in line with scientific knowledge on climate change; the stories convey no significant misinformation. However, specific topics in current debates on global warming are conspicuously missing. According to the ChatGPT narrative, humans as a species are responsible for climate change and specific economic activities or actors associated with carbon emissions play no role. Analogously, the social structuration of vulnerability to climate impacts and issues of climate justice are hardly addressed. ChatGPT’s narrative consists of de-politicized stories that are highly optimistic about technological progress.

SpringerLink

I have been assessed as HIGH risk of committing crime in the future. Take @fairtrials quiz to see how police or criminal justice authorities could use data & AI to profile you. Ban predictive policing and justice systems in the #AIAct #AutomatedInjustice

https://www.fairtrials.org/predictive-policing/

’Predictive’ policing and criminal ‘prediction’ systems - Fair Trials

Take Fair Trials' quiz below to see if predictive systems would profile you as a ‘risk’ of committing a crime in the future.

Fair Trials

I have been assessed as MEDIUM risk of committing crime in the future. Take @fairtrials quiz to see how police or criminal justice authorities could use data & AI to profile you. Ban predictive policing & justice systems in the #AIAct #AutomatedInjustice https://www.fairtrials.org/predictive-policing/



I have been assessed as HIGH risk of committing crime in the future. Take fairtrials quiz to see how police or criminal justice authorities could use data & AI to profile you. Ban predictive policing and justice systems in the #AIAct #AutomatedInjustice

https://www.fairtrials.org/predictive-policing/


I have been assessed as being HIGH risk of committing crime in the future.

Do you know what predictive or profiling systems could think about you?

Take Fair Trials’ quiz and find out how the police and criminal justice authorities could use data, algorithms and artificial intelligence to profile you and decide whether or not you are at ‘risk’ of criminal behaviour: https://www.fairtrials.org/predictive-policing/

Ban predictive policing and justice systems in the #AIAct. #AutomatedInjustice


I have been assessed as *MEDIUM* risk of committing crime in the future.
Take the Fair Trials quiz to see how police or criminal justice authorities could use data & AI to profile you. Ban predictive policing & justice systems in the #AIAct #AutomatedInjustice

https://www.fairtrials.org/predictive-policing
