What AI means for the future of policing
Artificial intelligence is increasingly shaping who police stop, how reports are written, where officers patrol and how evidence is analyzed.

Why it matters: The tech promises speed and efficiency. But its rapid spread is outpacing public rules and could embed errors and bias deep within the criminal justice system.

The big picture: Local law enforcement agencies are facing chronic staffing shortages amid pressure to reduce violent crime.

Around 75% of officers say understaffing has delayed their backup in emergencies, and 56% say it has increased their exposure to high-risk calls, according to a Police1 survey released in May.

Zoom in: Departments from California to Hawaii are piloting generative-AI tools that turn body-camera audio into police narratives, shaving hours off paperwork and getting officers back on patrol.

San Francisco police are piloting Axon's Draft One, allowing AI to generate first-draft reports for citations and lower-level cases.

South Fulton Police, south of Atlanta, partnered with IBM on an AI-driven public safety platform that aggregates data to "save man-hours and money" while helping predict and analyze crime patterns.

Startup Longeye is testing AI analysis systems with 12 agencies, including one in Akron, Ohio, that let detectives comb through hours of jail phone calls, interviews and police footage for evidence.
State of play: Law enforcement agencies are already investing significantly in AI and AI-adjacent technologies such as drones, license-plate readers, gunshot-detection systems and advanced analytics.

The AI in law enforcement market is expected to grow from roughly $3.5 billion in 2024 to more than $6.6 billion by 2033, according to consulting firm Consainsights.

What they're saying: "When a technology wave hits law enforcement, it hits it hard," Longeye CEO Guillaume Delepine tells Axios.

About 70% of investigators say they don't have time to review all the digital evidence they collect, leaving critical phone calls, interviews and device data untouched, Delepine says.

Longeye's AI tools let agencies review hours of material and flag key conversations, such as confessions, for detectives to examine.

The intrigue: Some agencies are working with AI firms to help shape the tools.

Sno911, the emergency dispatch agency for Snohomish County outside Seattle, partnered with AI startup Aurelian to launch a new artificial intelligence-powered system to help handle emergency 911 calls.

Dispatchers fielding emergency calls to Sno911 will now have an on-screen AI assistant called Cora, which will listen to both the caller and the dispatcher.

Cora will suggest questions for dispatchers to ask, provide other phone numbers or offer instructions to give distraught callers, Sno911 director Kurt Mills tells Axios.

The other side: Civil liberties activists warn that AI tools risk reinforcing bias within the criminal justice system, and that it's unclear who controls the data collected by AI systems as local laws fail to establish guardrails.

"The digital space has evolved in a way that wasn't necessarily anticipated when (privacy) laws were implemented," Beryl Lipton, senior investigative researcher at the Electronic Frontier Foundation, tells Axios.

The city of Austin, Texas, recently paused a plan for AI-enhanced camera systems in parks that would "analyze behavior" to spot potential crimes, citing questions about civil liberties and efficacy, per KUT.

Yes, but: The AI is just another tool to gather evidence, and the burden of proof still rests on human investigators and analysts, Delepine said.

"We tell our agencies, 'Don't bring the chatbot to court. Use the chatbot to find the actual moment in the evidence that matters, and bring that to court.'"

Go deeper:

AI drones are America's newest cops

AI can't hear everyone equally