Face Value

Facial recognition technology can be a powerful tool for law enforcement, but opponents say it comes at the cost of civil liberties

Facial recognition technology has proliferated over the past decade, and so have its critics. In the past year, eight cities have voted to take the tool out of the hands of law enforcement and government agencies, and a coalition of organizers is hoping Denver will soon join them.

Supporters of the bans say facial recognition systems come with built-in biases and privacy threats. Others hope proper regulation will allow governments to use the technology for good while easing fears of a dystopian surveillance state.

In May 2019, San Francisco became the first major U.S. city to ban use of facial recognition by government agencies. Three other Bay Area cities — Alameda, Berkeley and Oakland — and the Massachusetts cities of Somerville, Northampton, Cambridge and Brookline have since followed. 

More jurisdictions are jumping on the bandwagon. The city council of Portland, Oregon, is pondering what would be the country’s strictest ban yet on facial recognition technology, barring its use not only by government agencies but also by private businesses. Portland, Maine, and Springfield, Massachusetts, are also considering bans.

States are getting on board, too. In October, California’s governor signed legislation placing a three-year moratorium on the use of facial recognition in police body cameras, and state legislators in Michigan and New Hampshire have pushed forward moratoria on government or law enforcement use of facial recognition.

‘5280 NOT 1984’

While Colorado’s state and local lawmakers haven’t yet taken up the cause, organizers in Denver are trying to get a proposal for a ban on the 2020 ballot. The “5280 not 1984” initiative would prohibit government agencies or officials, including police, from accessing, developing, retaining or using facial recognition surveillance systems or information obtained from them. Under the proposed ordinance, the city and its agencies could be sued for injunctive or declaratory relief and ordered to comply with the ban if found in violation.

The coalition behind the measure includes people from all over the political spectrum, many of them privacy advocates, said Connor Swatling, one of the group’s organizers. “I think, generally, you find this is a bipartisan or nonpartisan issue,” Swatling said. “Americans generally are pretty touchy about increased governmental surveillance or broader surveillance powers.”

Supporters of the ban say facial recognition technology has racial and gender biases and can lead to a high rate of false-positive identifications. On its website, the group also claims the biometric data these programs collect and store creates “an appetizing target for identity thieves and state-sponsored hackers.”

The measure’s spot on the 2020 ballot isn’t a sure thing. Proponents have until May 4 to submit 8,265 signatures, and as of last week they had a little over half that number. Swatling said the group had been on track to meet its goal, but the COVID-19 pandemic has hindered signature collection in recent months. The group is exploring its options for extending the deadline or using digital notarization to get back on track, according to Swatling. Failing that, it will look into having the measure referred to the ballot by City Council.

Many of the municipal bans in effect were adopted by cities that never used facial recognition tools to begin with. That would also be true of Denver, should the measure pass. The Denver Police Department doesn’t use facial recognition technology and doesn’t have a position on the proposed ballot measure, a DPD spokesperson said in an email. No other Denver agencies use the technology, according to media reports.

Denise Maes, public policy director of the ACLU of Colorado, thinks these preemptive bans are a good thing. “The technology to violate our privacy is moving at a much faster rate than the laws to preclude these sorts of technology,” Maes said. “So, I actually applaud these prohibitions that are getting ahead of the technology.”

A WIDESPREAD TOOL

Nobody seems to know for sure how many U.S. law enforcement and government agencies are using facial recognition technology today.

“We know that, nationwide, a significant number of law enforcement agencies are using it,” said Jameson Spivack, a policy associate at the Center on Privacy and Technology at Georgetown Law. “The unfortunate thing is that we don’t know the full extent because a lot of… this use is pretty opaque, and the agencies that use it aren’t very forthright about the use of it.”

While Denver has so far opted out, many nearby cities have adopted the technology. The Colorado Information Sharing Consortium, an organization of more than 90 law enforcement agencies in the state that share data, offers its members access to a facial recognition system from LexisNexis called Lumen, which compares photos against a database of mug shots to find potential matches. 
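
Lumen’s internals aren’t public, but one-to-many face searches like the one it performs generally follow the same pattern: a probe photo is reduced to a numeric “embedding” by a neural network, then scored for similarity against the embedding of every mug shot in the gallery, with the top scores returned as candidate leads. A minimal sketch, with random vectors standing in for real embeddings:

```python
# Rough sketch of a one-to-many face search: a "probe" photo is converted
# to a numeric embedding, then compared against a gallery of mug-shot
# embeddings by similarity score. All names and numbers are illustrative;
# Lumen's actual pipeline is not public.
import numpy as np

rng = np.random.default_rng(seed=42)

# Stand-ins for embeddings a face-recognition model would produce
# (real systems derive these from images with a deep neural network).
gallery = rng.normal(size=(10_000, 512))  # 10,000 mug shots, 512 dims each
probe = gallery[1234] + rng.normal(scale=0.1, size=512)  # noisy new photo

def cosine_similarity(vec, matrix):
    """Cosine similarity between one vector and each row of a matrix."""
    vec = vec / np.linalg.norm(vec)
    matrix = matrix / np.linalg.norm(matrix, axis=1, keepdims=True)
    return matrix @ vec

scores = cosine_similarity(probe, gallery)

# Return the top candidates above a threshold -- investigative leads,
# not identifications.
THRESHOLD = 0.8
for idx in np.argsort(scores)[::-1][:5]:
    if scores[idx] >= THRESHOLD:
        print(f"mug shot #{idx}: similarity {scores[idx]:.2f}")
```

The threshold is the key policy lever in a system like this: set it low and the tool returns more leads but more false matches; set it high and it misses more genuine ones.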

CISC Executive Director David Shipley said the technology has been used widely in investigations by member agencies, which have had access to the tool since 2015. The software’s successes include helping Edgewater police track down a homicide witness, Shipley said, as well as aiding in the identification of an online stalker, serial shoplifters, a person using stolen checks and people who were evading arrest through disguises or false identities.

Detective Del Matticks of the Aurora Police Department, a CISC member agency, said he has used Lumen about a dozen times since the department adopted it a few years ago. In one 2018 case, he was able to generate a lead by uploading selfies from a man who had been communicating with a runaway teen. The photos matched mug shots of a known sex offender and, after further investigation, the man was eventually arrested.

Matticks said the facial recognition tool was “certainly not the end-all, by any stretch,” and much more had to be done by investigators to secure probable cause. “But, time being of the essence in a lot of cases like this, especially when we’re dealing with children as victims, the sooner you can grab ahold of a lead like this and have a direction to go, that can save you valuable time,” he said.

IMPERFECT TECHNOLOGY

Critics of facial recognition say it’s often inaccurate and can lead to misidentification. Studies show the technology’s shortcomings are amplified when it comes to race, gender and age. 

A 2018 paper from researchers at MIT and Stanford found that facial-analysis programs from major technology companies could consistently identify the gender of light-skinned men, with error rates under 1%. But for dark-skinned women, the error rate exceeded 34% in some cases.
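
The study’s key move was to break the error rate out by demographic group rather than report a single aggregate number, since a classifier can look accurate overall while failing badly on one group. A toy illustration, with invented counts that mirror the pattern the researchers reported:

```python
# Toy per-group error-rate audit. The counts below are invented for
# illustration; the actual study audited commercial gender classifiers
# against a benchmark labeled by skin type and gender.
results = {
    # group: (misclassified, total)
    "lighter-skinned men":   (4, 500),    # 0.8% error
    "lighter-skinned women": (35, 500),
    "darker-skinned men":    (30, 500),
    "darker-skinned women":  (170, 500),  # 34% error
}

for group, (errors, total) in results.items():
    print(f"{group:<22} error rate: {errors / total:.1%}")

# Aggregate accuracy can look fine while masking large per-group gaps:
total_errors = sum(e for e, _ in results.values())
total_count = sum(t for _, t in results.values())
print(f"overall error rate: {total_errors / total_count:.1%}")
```

In this made-up example the overall error rate is 12%, a figure that hides a forty-fold gap between the best- and worst-served groups.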

Activists have run experiments of their own to draw attention to the technology’s flaws. In 2018, the ACLU tested Amazon’s facial recognition tool on members of Congress, and the software incorrectly matched 28 legislators to different people who had been arrested for crimes. Nearly 40% of the lawmakers who had false matches were people of color, who make up only about 20% of Congress.

Swatling, the ballot measure proponent, ran a similar test on Denver City Council earlier this year. He used the same Amazon tool to compare portraits of the council members against a pool of 2,000 photos taken from the Colorado Bureau of Investigation’s sex offender registry. The software returned matches for nine of the 13 council members, and the program’s confidence levels exceeded 90% for some.
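
Neither test’s exact code is public, but Amazon exposes the underlying comparison through its Rekognition service. A minimal sketch using the boto3 client, with placeholder file names (running it requires AWS credentials):

```python
# Sketch of the kind of one-off comparison the ACLU and Swatling tests
# describe, using Amazon Rekognition's CompareFaces API via boto3.
# File names are placeholders; the tests' exact settings aren't detailed.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

def load_bytes(path):
    with open(path, "rb") as f:
        return f.read()

portrait = load_bytes("council_member.jpg")       # placeholder source photo
registry_photo = load_bytes("registry_0001.jpg")  # placeholder pool photo

response = client.compare_faces(
    SourceImage={"Bytes": portrait},
    TargetImage={"Bytes": registry_photo},
    SimilarityThreshold=80.0,  # the service default; a higher threshold
                               # would return fewer, stronger matches
)

for match in response["FaceMatches"]:
    print(f"match with similarity {match['Similarity']:.1f}%")
```

A full replication would repeat the comparison across every photo in the pool, or index the pool into a Rekognition face collection and query it with the SearchFacesByImage API.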

Cases of mistaken identity aren’t confined to experiments. They can cause real-world harm, noted Spivack of Georgetown Law, pointing to the 2019 Sri Lanka Easter bombings as an example. Police in the South Asian country misidentified an American college student as a suspect in the attacks, a mistake they blamed on investigators using a facial recognition program. The young woman reportedly received death threats as a result of the error.

Asked whether he’s concerned about potential biases in the technology, CISC’s Shipley said, “We’re always concerned, because we want to make sure people are treated fairly.” But the systems are used to generate investigative leads, not make a final determination, he said, adding that “the investigator is always the one accountable for making sure that the right person is arrested for the proper reasons.”

PRIVACY ISSUES

Even a 100% accurate and unbiased facial recognition tool would still have its detractors. Privacy advocates are wary of the technology’s potential as a surveillance tool, especially if paired with the cameras that are already ubiquitous in public spaces.

“With surveillance, the government can track the whereabouts of perfectly innocent people,” said the ACLU’s Maes. “And it can later be used in very consequential ways.”

In addition to concerns about invasion of privacy, there are free speech and freedom of assembly reasons to allow people some degree of anonymity, even in public, said Margot Kaminski, associate professor at CU Law and director of the Privacy Initiative at Silicon Flatirons.

“If you have a surveillance state that’s capable of tracking your face wherever you are, it’s capable of tracking your location wherever you are, which means it’s capable of tracking every association you have, from the political protest organization to deeply intimate associations,” Kaminski said.

These intimate associations could include things like participation in religious or LGBTQ organizations and visits to a rape crisis center, abortion clinic or Alcoholics Anonymous meeting. Spivack pointed to the 2015 protests over the killing of Freddie Gray in Baltimore, where police used facial recognition on protesters to identify and arrest people who had outstanding warrants, as an example of how the technology could chip away at First Amendment rights.

“This is a First Amendment-protected activity, a political protest, in which police are identifying people and then arresting them for completely unrelated charges,” he said. “So, this would definitely have a chilling effect on free speech and the right to protest.”

Several people interviewed cited China as a dystopian cautionary tale. Surveillance cameras with facial recognition capabilities have become commonplace in the country, and Chinese authorities have used them for everything from catching jaywalkers to, more worryingly, tracking Uyghurs and other Muslim minority groups.

“The technology itself isn’t evil. I think it’s mostly how it’s used,” said Morgan Klaus Scheuerman, a PhD student in information science at CU Boulder who researches race, gender and facial recognition software. While he says there are positive uses for the technology, the examples from China have made him cautious about its use here, especially by police.

“It could be really concerning, considering there’s already a government that’s using this to purposely target and discriminate against ethnic minorities,” Scheuerman said.

IS A BAN THE ANSWER?

“Based on what I’m seeing in the public conversation around these bans, I see people who are panicked, and they have concerns,” said Jed Brubaker, assistant professor of information science at CU Boulder. 

Brubaker used the analogy of a knife when talking about facial recognition: Both are tools, and both can cause harm. “You could cut off your finger,” he said. “Does that mean we throw away all our knives?” 

That’s one option, but we could also create guidelines for how knives are used or ban knives over a certain length. Some cities are taking a similar approach to facial recognition, focusing not on outright bans but on more nuanced regulation and oversight.

There are advantages and drawbacks to both bans and regulation. According to Kaminski, bans aren’t necessarily forever. They can serve as a pause button and eventually be lifted once the community has figured out how to use the technology responsibly. 

“It’s a very strong statement to say you think that there’s something inherently broken about facial recognition,” she said. “I think it’s not so strong a statement to say you think there’s something inherently broken about facial recognition right now. And so that’s what’s leading a lot of these cities — against the backdrop of real conversations about racial justice — to call for bans.”

Moratoria, while difficult to pass, offer a middle ground for similar reasons, Spivack said, since they let a community press pause while setting aside a defined period for public debate and decision-making.

Spivack said bans can also be the result of “robust public dialogue,” which is a good thing, but they are often overly broad and carry unintended consequences. For example, he said, one jurisdiction’s ban on police use of the technology, which extended to mobile devices, left officers unable to use Facebook on their phones.

Regulation can be more nuanced, Kaminski said, allowing for positive uses and development of the technology while trying to reduce harm. But there’s a lot of variation in regulation, Spivack noted, ranging from robust to very weak. “It’s really easy to pass these weak bills that don’t really regulate the technology,” he said.

Shipley, of the CISC, said that given its usefulness in solving crime, a ban would be short-sighted. “I certainly oppose the banning of valuable technology,” he said. “I don’t object to providing some boundaries for its use.”

“I think that, eventually, it will be seen as appropriate, if guided by proper policy and procedures,” he added. CISC has created a model facial recognition use policy to help officers avoid mistakes or privacy violations. Agencies aren’t required to use the policy, Shipley said, but many members have adopted or modified it.

While the public and policymakers will continue to debate the merits of bans, regulation and facial recognition in general, the technology itself will never be free of all its flaws, Brubaker warned.

“This is not an issue of: Are the things perfect or not? They’re not perfect,” he said. “It’s an issue of: How do we trust our civil servants and our civic organizations in the face of tools that will never be perfect?” 

— Jessica Folker
