Last month, members of the California legislature were subjected to a surveillance experiment, courtesy of the ACLU. Their portraits were fed into Amazon's Rekognition facial recognition software and compared with a database of 25,000 arrest mugshots. Twenty-six lawmakers were incorrectly identified as matches. The would-be suspects included Assemblyman Phil Ting, a Democrat from San Francisco, who hoped the experiment would drum up support for his bill, AB 1215, to ban facial recognition from police body cameras.

On Wednesday, the state Senate passed a slightly different bill—not a ban, but a moratorium that expires in three years. The change came just ahead of the deadline to make amendments before the session ends this week. Some privacy advocates worry that the bill's expiration date will give companies, many of which acknowledge the limitations of their technology, time to improve their algorithms and win over skeptics. In three years, if the ACLU's test is rerun, will the facial recognition companies pass it?

The bill, which needs approval by the state Assembly and the governor's signature to become law, has been celebrated by the ACLU as a positive step. Matt Cagle, an attorney at the ACLU of Northern California, says that body cameras, which have been touted as tools for accountability after shootings of unarmed people of color, are poised to turn into tools of surveillance instead. "It's a bait-and-switch," he says. The bill would ban the use of facial recognition algorithms in real time, when the body cameras are rolling, and in forensic analysis of footage later on. It carves out an exemption for algorithms that detect and redact faces from body camera footage, so that the rules don't slow public records requests.

The moratorium comes amid growing concerns about facial recognition in public spaces. Cities including San Francisco and Oakland have passed broader bans on government use of facial recognition, and Massachusetts is considering a statewide moratorium. The bills have been driven by concerns about privacy and bias that some argue are inherent, but also technical shortcomings that have led even companies developing the technology to say it isn’t ready for prime time.

Last spring, Microsoft said it had refused to sell its facial recognition software to an unnamed California police agency. In June, Axon, the largest supplier of body cameras to law enforcement, said it wouldn’t include facial recognition in its product, on the recommendation of its external ethics board. In part, it was a recognition that the technology simply doesn’t work well enough—at least not yet. While facial recognition has been historically used to match faces on clear, forward-facing images—say, comparing a mugshot to a database of prior arrests—that’s much more difficult to do in real-time. Officers often find themselves in situations involving bad lighting, tricky angles, or quick motion. Axon has left open the possibility that it could pursue facial recognition technology in the future.


Companies like Amazon have argued facial recognition should be regulated, not banned. The company pushed back on the ACLU's experiment, saying the bad matches would not have happened if the ACLU had required a 99 percent confidence threshold for a match. (The ACLU said it had used "factory standards" for the test.) The Information Technology and Innovation Foundation, an industry group that receives support from companies including Microsoft and Amazon, opposes AB 1215, arguing the technology could counter biases by humans reviewing footage.

The most vocal opposition, however, has been from police groups who argue it strips them of a key piece of technology for public safety. The bill “erroneously presumes that persons in public possess or are afforded a reasonable expectation of privacy,” the Riverside Sheriffs’ Association wrote in an analysis of the bill.

The switch from a ban to a moratorium, according to Ting, came out of concerns from lawmakers who "wanted to revisit the issue as the technology improves." He says a moratorium strikes the proper balance, giving officials and technologists more time and flexibility. "If you were going to deploy cameras all over a particular city, you would have a significant public process. Right now law enforcement can do that without a public process and have those cameras roving around."