Police tech doesn't predict crime. It predicts policing.

Fed on flawed police data, so-called 'crime-predicting' tech automates racism and discrimination against overpoliced communities.

The Safety Not Surveillance coalition spread the word that this tech must be banned!

Watch now ➡️ https://peertube.openrightsgroup.org/w/5wPH31FDt7Lzm17jZ7Musk

#SafetyNotSurveillance #BanCrimePredictingTech #police #policing #ukpolitics #ukpol #precrime #surveillance #justice

Safety Not Surveillance DigiVan

PeerTube

Last week the Safety Not Surveillance coalition hit the streets of London.

We sent the message that so-called 'crime-predicting' tech must be BANNED!

Because we have the right to be presumed innocent, not predicted guilty.

This tech fuels racism, enables unchecked police power and threatens all our rights. Ban it!

Sign up for campaign updates ➡️ https://action.openrightsgroup.org/join-fight-against-crime-predicting-tech

#SafetyNotSurveillance #BanCrimePredictingTech #PreCrime #police #policing #justice #ukpol #ukpolitics #criminaljustice

This summit promised “safer communities”.

But it delivered something else: speed-dating with surveillance vendors, complete with a visit from the Home Secretary talking up artificial intelligence as the future of policing.

“Crime predicting” technology risks creating mass human rights violations.

We deserve real safety, not more surveillance.

Sign up for campaign updates ⬇️

https://action.openrightsgroup.org/join-fight-against-crime-predicting-tech

#SafetyNotSurveillance #PreCrime #police #ukpolitics #ukpol #BanCrimePredictingTech #policing

Join the fight against crime-predicting tech

Why is this important? The police say AI will make us safer. The truth? It automates injustice and racism. Predictive policing tools don't predict crime: they predict policing. Built on flawed police data, they target the same communities that have always been over-policed: Black and racialised people, low-income neighbourhoods, and migrants.

This leads to:
• Over-policing
• Unjust stop and searches
• Harassment, handcuffing, and use of force against targeted people

We all deserve safety, not surveillance.

Open Rights Group

We went outside the national police summit (UK) and this is what happened...

When we tried to ask delegates to talk on camera, no one wanted to.

Why? Because constant monitoring doesn’t feel safe. It feels intrusive 🤷‍♂️

Watch now ⬇️ https://peertube.openrightsgroup.org/w/3e5MGaPnhRWUJZnCcrghQv

#SafetyNotSurveillance #PreCrime #BanCrimePredictingTech #police #policing #ukpolitics #ukpol #spycops #AI

Safety Not Surveillance Vox Pops

PeerTube
I’m loath to post this, as it just seems like someone saying ridiculous things to get coverage. But Musk now talks about his robots following offenders around to stop them re-offending. Like a prison guard for the naughty humans. https://www.newsweek.com/elon-musk-tesla-robots-prevent-future-crime-11028660 #musk #robotuprising #PreCrime
Elon Musk Says Tesla Robots Can Prevent Future Crime

Elon Musk said that Tesla's Optimus robots could follow people around and stop them from committing crimes.

Newsweek

"Bondi’s directive makes clear that when it comes to protecting ICE, federal law enforcement is now in the business of pre-crime, to borrow a term from Minority Report.

“The charging priorities directed by this memorandum are not limited to those criminals who are caught red-handed committing acts of violence against ICE facilities and personnel,” Bondi says.

In other words, you don’t have to have committed a crime to be investigated or even arrested and detained. You just have to have committed wrongspeech, as defined by NSPM-7’s broad indicators.

As Bondi’s directive goes on to quote from NSPM-7 specifically, targeting for arrest and prosecution “every person who aids, abets, or conspires” (whatever that means) with these anti-ICE forces of domestic terrorism."

https://www.kenklippenstein.com/p/no-kings-protest-and-arrests-begin

#USA #Trump #PoliceState #NSPM7 #CivilLiberties #FreeSpeech #Authoritarianism #ICE #PreCrime

"No Kings" Protest (and Arrests) Begin

NSPM-7 is already being used to detain protesters over speech

Ken Klippenstein
Evolving ontology and the case against predictive algorithms in the U.S. justice system #AI #artificialintelligence #precrime #predictivepolicing #ethics

Evolving Ontology and the Case Against Predictive Algorithms in the U.S. Justice System

In the United States today, predictive algorithms, a particular form of artificial intelligence (AI), permeate all segments of society, including the criminal justice system. With the help of AI, judges are able to “plug” defendants into algorithms and generate outputs that produce a deterministic and homogenous view of defendants. Proponents of predictive algorithms cite accuracy, neutrality, fairness, and efficiency, but what is missing from their calculus is the human person, the central figure who shapes and is shaped by algorithms. Using the story of defendant Darnell Gates as a case study, this article seeks to contribute to the development of AI as a topic of theological ethics, offering a moral reflection on AI in the American justice system and its implications for theological ontology. Heeding Alexander Filipović’s call to employ a social-ethical perspective of “justice to people as persons,” I will explicate what I call the theory and praxis of “evolving ontology,” building upon (1) Roberto Dell’Oro’s vision of the human person; (2) Darlene Fozard Weaver’s discussion of human dignity; and (3) Pope Francis’ notion of a “culture of encounter.” This paradigm captures the conflux of potentiality, relationality, and dignity that is human existence, illumining the power of human agency and the enduring promise of human redemption that God creates, gives, and sustains. AI is an urgent matter of justice, and it is only by seeing and encountering one another in our humanity that we will be able to redeem our mechanized world.

Digital Commons at Loyola Marymount University and Loyola Law School