Private Tech Companies Are Making Law Enforcement's Opacity Problem Even Worse
Law enforcement's increasing reliance on tech means an increasing reliance on private companies. It's inevitable that tech developments will be adopted by government agencies, but much of this adoption has occurred with minimal oversight or public input. That opacity carries forward to criminal trials, where companies have successfully stepped in to prevent defendants from accessing information about evidence, citing concerns about exposing trade secrets or proprietary software. In other cases, prosecutors have dropped cases rather than risk discussing supposedly sensitive tech in open court.
Elizabeth Joh's new article for Science says corporations are making existing transparency and accountability problems in law enforcement even worse.
Private companies are often wary of divulging too much about their products, lest they lose competitive advantage to rivals. As a consequence, companies may decide to protect their intellectual property and market advantages by invoking trade secret privileges, demanding nondisclosure agreements with customers, and imposing other forms of property protections. These forms of commercial secrecy, common enough outside of the criminal justice system, pose challenges to basic police accountability.
This is only one of the problems the adoption of private sector tech creates. There are others. As Joh points out, law enforcement officers often attest to their "training and expertise" when testifying in court or seeking warrants. But actual tech expertise is the exception, not the rule. Private companies market products to law enforcement agencies, but the rollout of purchased tech is rarely accompanied by immersive training. In some cases, any analytic work is offloaded to private contractors, making it even less likely the public will ever be fully apprised of how the tech works or why -- in the case of predictive policing software and facial recognition AI -- it arrives at the conclusions it does.
Saddled with non-disclosure agreements, routine police secrecy, claims of valuable trade secrets, and end users who lack technical expertise, private company tech can become a black hole where data on citizens goes in but never comes back out for public scrutiny.
Take, for example, ShotSpotter. Its sensors and microphones pick up percussive noises, which are relayed to ShotSpotter's analysts, who then make a judgment call about what was heard. Sometimes the analysts are wrong. Sometimes, more disturbingly, they alter their judgment calls after being contacted by police officers. How accurate is the system? More importantly, how can the public answer that question without relying on either ShotSpotter's cheery claims about high accuracy or secondhand impressions from agencies that have dumped the tech after too many false positives?
The real answer may never be known.
[A] community or researcher who wants to know more about ShotSpotter—its accuracy and its flaws—may find a dead end. ShotSpotter’s contractual arrangements with its police customers provide them with results, and results only. The company claims ownership not just of its proprietary software but also of the data its technology generates. This means that conventional tools of disclosure, like state public records requests laws, have no purchase on any acoustic gunfire detection system because such systems remain within private hands.
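None of the underlying analysis would be difficult if the data were public. Purely as a hypothetical sketch -- assuming a per-alert log existed recording the analyst's final call and whether responding officers found any evidence of gunfire, with field names invented here for illustration -- a researcher could estimate how often alerts are corroborated with a few lines of Python:

# Hypothetical sketch: estimating how often acoustic gunfire alerts are
# corroborated on scene. The Alert fields are invented for illustration;
# no such public dataset currently exists.
from dataclasses import dataclass

@dataclass
class Alert:
    classified_as_gunfire: bool  # analyst's final judgment call
    evidence_found: bool         # did responding officers find evidence of gunfire?

def summarize(alerts):
    flagged = [a for a in alerts if a.classified_as_gunfire]
    confirmed = sum(a.evidence_found for a in flagged)
    return {
        "alerts_flagged_as_gunfire": len(flagged),
        "confirmed_on_scene": confirmed,
        "unconfirmed_rate": 1 - confirmed / len(flagged) if flagged else None,
    }

# Made-up numbers: 100 flagged alerts, 12 corroborated by responding officers.
sample = [Alert(classified_as_gunfire=True, evidence_found=(i < 12)) for i in range(100)]
print(summarize(sample))  # unconfirmed_rate: 0.88

The point of the hypothetical isn't the arithmetic, which is trivial. It's that the company's ownership claims over the data mean no one outside the company can run even this simple a check.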
For better or worse, the solution likely runs through local governments. These entities can forbid law enforcement agencies from purchasing tech from vendors unwilling to allow public scrutiny of their software and hardware. They can demand more transparency on usage and effectiveness from the agencies they oversee. The general public may not be able to take its law enforcement business elsewhere, but its elected reps can help ensure the agencies it's stuck with are more accountable.
The other check against misuse is the nation's courts. Judges can (and should) push back harder on broad assertions of law enforcement expertise and be less willing to let companies whose tech has generated evidence shirk their obligations to criminal defendants, who have a constitutional right to examine the evidence against them and confront their accusers in court -- even if the accuser is a shot-spotting sensor or a proprietary DNA-matching algorithm.
No one's saying cops shouldn't have access to tech advances. But governments need to do more to ensure these business relationships don't supersede law enforcement agencies' obligations to the public. Action needs to be taken now -- at both the local and national levels -- to prevent ongoing problems from getting worse and to head off future abuses and injustices before they can occur.