Another Police Use of AI

Axon, maker of the “ #DraftOne ” product, which uses body camera recordings to generate a first draft of a police report for officers after an incident…has another #AI product called “ #PolicyChat ” that I haven’t seen much discussion of.

The product uses large language models ( #LLMs ), combined with an AI technique called Retrieval-Augmented Generation, or #RAG , to give officers in the field answers about their department’s official procedures & policies.
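
Roughly, RAG means retrieving relevant passages from a document store first and pasting them into the model's prompt, so answers are grounded in the retrieved text rather than the model's memory. Here is a minimal sketch in Python of that pattern, using a toy keyword-overlap retriever in place of real embeddings; the sample policies and function names are made up for illustration and have nothing to do with Axon's actual implementation:

```python
# Toy RAG sketch: retrieve the most relevant policy text, then build an
# augmented prompt for an LLM. Real systems use vector embeddings instead
# of keyword overlap; this is illustrative only.

POLICY_DOCS = [
    "Use-of-force reports must be filed within 24 hours of the incident.",
    "Body-worn cameras must be activated before any public encounter.",
    "Vehicle pursuits require supervisor authorization by radio.",
]

def score(query: str, doc: str) -> int:
    """Count shared lowercase words between the query and a document."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents with the highest keyword overlap."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved policy text to the question, so the model
    answers from the official text rather than from its training data."""
    context = "\n".join(retrieve(query, docs))
    return f"Policy context:\n{context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt("When must body-worn cameras be activated?", POLICY_DOCS)
# In a real system, `prompt` would now be sent to an LLM for completion.
```

The retrieval step is what distinguishes RAG from plain chatbot use: the model is steered toward the department's own policy text instead of answering from whatever it absorbed in training.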

https://www.aclu.org/news/privacy-technology/ai-policy-guidance-police

Another Police Use of AI | ACLU

Axon’s “Policy Chat” product seeks to give officers guidance on department policies and procedures

American Civil Liberties Union

"[N]ow, California has an important chance to join with other states like Utah that are passing laws to rein in these technologies, and what minimum safeguards and transparency must go along with using them.

S.B. 524 does several important things: It mandates that police reports written by AI include disclaimers on every page or within the body of the text that make it clear that this report was written in part or in total by a computer. It also says that any reports written by AI must retain their first draft. That way, it should be easier for defense attorneys, judges, police supervisors, or any other auditing entity to see which portions of the final report were written by AI and which parts were written by the officer. Further, the bill requires officers to sign and verify that they read the report and its facts are correct. And it bans AI vendors from selling or sharing the information a police agency provided to the AI.

These common-sense, first-step reforms are important: watchdogs are struggling to figure out where and how AI is being used in a police context. In fact, Axon’s Draft One would be out of compliance with this bill, which would require the company to redesign its tool to make it more transparent—a small win for communities everywhere.

So now we’re asking you: help us make a difference. Use EFF’s Action Center to tell Governor Newsom to sign S.B. 524 into law!"

https://www.eff.org/deeplinks/2025/09/california-tell-governor-newsom-regulate-ai-police-reports-and-sign-sb-524

#USA #California #AI #GenerativeAI #Axon #DraftOne #AIPoliceReports

California, Tell Governor Newsom: Regulate AI Police Reports and Sign S.B. 524

Californians should urge Gov. Gavin Newsom to sign S.B. 524: a common-sense bill that makes important first-step reforms to regulate police reports written by generative AI. This is crucial, as watchdogs struggle to figure out where and how AI is being used in a police context. S.B. 524 does several important things: it mandates that police reports written by AI include disclaimers on every page or within the body of the text that make it clear the report was written in part or in total by a computer, requires that reports written by AI retain their first draft, and requires officers to sign and verify that they read the report and that its facts are correct. It also bans AI vendors from selling or sharing the information a police agency provided to the AI.

Electronic Frontier Foundation

https://www.eff.org/press/releases/eff-investigation-ai-product-police-reports-designed-hinder-audits

(icymi)

EFF Investigation: AI Product for Police Reports is Designed to Hinder Audits

Axon Enterprise's #DraftOne product, which uses #generativeAI to write police report narratives based on body-worn camera audio, stymies attempts at auditing, transparency, and accountability.

SAN FRANCISCO – Axon Enterprise's Draft One product, which uses generative artificial intelligence to write police report narratives based on body-worn camera audio, seems designed to stymie any attempts at auditing, transparency, and accountability, an Electronic Frontier Foundation (EFF)...

Electronic Frontier Foundation

"Everyone should have access to answers, evidence, and data regarding the effectiveness and dangers of this technology. Axon and its customers claim this technology will revolutionize policing, but it remains to be seen how it will change the criminal justice system, and who this technology benefits most.

For months, EFF and other organizations have warned about the threats this technology poses to accountability and transparency in an already flawed criminal justice system. Now we've concluded the situation is even worse than we thought: There is no meaningful way to audit Draft One usage, whether you're a police chief or an independent researcher, because Axon designed it that way.

Draft One uses a ChatGPT variant to process body-worn camera audio of public encounters and create police reports based only on the captured verbal dialogue; it does not process the video. The Draft One-generated text is sprinkled with bracketed placeholders where officers are encouraged to add additional observations or information, or which can be quickly deleted. Officers are supposed to edit Draft One's report and correct anything the Gen AI misunderstood due to a lack of context, troubled translations, or just plain-old mistakes. When they're done, the officer is prompted to sign an acknowledgement that the report was generated using Draft One and that they have reviewed the report and made necessary edits to ensure it is consistent with the officer’s recollection. Then they can copy and paste the text into their report. When they close the window, the draft disappears.

Any new, untested, and problematic technology needs a robust process to evaluate its use by officers. In this case, one would expect police agencies to retain data that ensures officers are actually editing the AI-generated reports as required...

https://www.eff.org/deeplinks/2025/07/axons-draft-one-designed-defy-transparency

#USA #Axon #AI #GenerativeAI #DraftOne #ChatGPT #PoliceState

Axon’s Draft One Is Designed to Defy Transparency

Axon Enterprise’s Draft One — a generative artificial intelligence product that writes police reports based on audio from officers’ body-worn cameras — seems deliberately designed to avoid audits that could provide any accountability to the public, an EFF investigation has found. Our review of...

Electronic Frontier Foundation

EFF's Guide to Getting Records About Axon's Draft One AI-Generated Police Reports

The moment Axon Enterprise announced a new product, Draft One, that would allow law enforcement officers to use artificial intelligence to automatically generate incident report narratives based on body-worn camera audio, everyone in the police accountability community immediately started asking...

Electronic Frontier Foundation

Axon’s #DraftOne is Designed to Defy #Transparency

#Axon Enterprise’s Draft One — a generative artificial intelligence product that writes police reports based on audio from officers’ body-worn cameras — seems deliberately designed to avoid #audits that could provide any #accountability to the public, an @eff investigation has found.
#ai #artificialintelligence #gai #generativeai

https://www.eff.org/deeplinks/2025/07/axons-draft-one-designed-defy-transparency

"By introducing chatbots that are known to hallucinate, confuse jokes for facts, or randomly add incorrect information, police tech like #DraftOne could be used to legitimize wrongful arrests, reinforce police suspicions, mislead courts, or even cover up police abuse". https://arstechnica.com/tech-policy/2024/08/chatbots-offer-cops-the-ultimate-out-to-spin-police-reports-expert-says/
Chatbots offer cops the “ultimate out” to spin police reports, expert says

Experts warn chatbots writing police reports can make serious errors.

Ars Technica
Here Is What #Axon’s #Bodycam Report Writing #AI Looks Like
Axon has been inviting #police to webinars about its new AI-based product, #DraftOne, which uses #OpenAI’s #GPT4Turbo to automatically generate police reports from officers’ bodycam audio, according to emails obtained by 404 Media.
"Critical safeguards require every report to be edited, reviewed and approved by a human officer, ensuring accuracy and accountability of the information.”
Is that enough?
https://www.404media.co/here-is-what-axons-bodycam-report-writing-ai-looks-like-draft-one/
Here Is What Axon’s Bodycam Report Writing AI Looks Like

Axon has been inviting police departments to webinars about the new AI-based product, which will automatically generate police reports based on bodycam audio, according to emails obtained by 404 Media.

404 Media