Law enforcement agencies across the United States are adopting a new AI surveillance technology that tracks individuals by physical attributes rather than their faces, potentially circumventing growing legal restrictions on facial recognition systems. This development, occurring amid the Trump administration's push for increased surveillance of protesters, immigrants, and students, raises significant privacy and civil liberties concerns as police departments independently adopt increasingly sophisticated AI tools with minimal oversight or community input.
The big picture: Police departments are using AI to track people through attributes like body size, clothing, and accessories, bypassing facial recognition restrictions.
- After learning of the system through MIT Technology Review, the ACLU identified it as the first known instance of such a tracking tool used at scale in the US.
- Civil liberties advocates warn this technology has high potential for abuse by federal agencies while escaping scrutiny because it doesn’t technically use biometric data.
Why this matters: The fragmented nature of US law enforcement allows widespread AI adoption with limited oversight.
- Over 18,000 separate police departments across the country have considerable discretion over technology purchases.
- No overarching federal law governs how local police departments adopt surveillance technologies like these tracking systems.
The AI policing landscape: Law enforcement technology is increasingly AI-centric as departments seek solutions for officer shortages.
- Companies like Flock and Axon sell comprehensive sensor networks—including cameras, license plate readers, and drones—paired with AI analysis tools.
- Police departments cite time savings, relief from staff shortages, and faster response times as key benefits of these AI systems.
Community tensions: AI-powered police technology is creating friction between departments and the communities they serve.
- Police in Chula Vista, California, faced lawsuits after obtaining special FAA waivers for drone operations and then allegedly reneging on promises to make footage public.
- Residents have complained that constant drone surveillance feels like an invasion of privacy.
What they’re saying: ACLU senior policy analyst Jay Stanley recommends a structured approach to police AI adoption.
- “The community should be very skeptical of this kind of tech and, at a minimum, ask a lot of questions,” Stanley stated.
- He advocates for public hearings, community permission, and explicit promises about system usage limitations before implementation.
The road ahead: Civil liberties groups are calling for more transparency and independent evaluation.
- Stanley's proposed roadmap would make such hearings and community approval preconditions for any police adoption of AI technologies.
- He also emphasized that companies developing surveillance technologies should allow independent testing of their systems.
Source: “Police tech can sidestep facial recognition bans now” (MIT Technology Review)