Calif. lawmakers aim to rein in police use of facial recognition

State lawmakers are considering regulation barring all California police officers from running facial recognition programs on body cameras

By Sam Dean
Los Angeles Times

LOS ANGELES — Facial recognition’s first blanket ban arrived in May, when San Francisco became the first city in the nation to bar police and other agencies from using the technology.

Now the powerful software, which uses machine learning algorithms to automatically track human faces in digital footage and match them to names, is facing a broader moratorium.
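In broad terms, such systems reduce each detected face to a numeric “embedding” vector and declare a match when two embeddings are sufficiently similar. The following is a minimal sketch of that matching step, not any vendor’s actual code; the names, vectors and threshold are all hypothetical.

```python
# Minimal sketch of the core face-matching step. A separate neural
# network (not shown) is assumed to have already converted each face
# into an "embedding" vector; all data here are hypothetical.
import numpy as np
from typing import Dict, Optional

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """1.0 means identical direction; values near 0 mean unrelated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe: np.ndarray, gallery: Dict[str, np.ndarray],
             threshold: float = 0.8) -> Optional[str]:
    """Return the enrolled name whose embedding best matches the probe,
    or None if no score clears the match threshold."""
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical 4-dimensional embeddings; real systems use hundreds.
gallery = {"person_a": np.array([0.9, 0.1, 0.3, 0.2]),
           "person_b": np.array([0.1, 0.8, 0.4, 0.3])}
probe = np.array([0.85, 0.15, 0.35, 0.25])  # face cropped from a frame
print(identify(probe, gallery))  # -> person_a
```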

State lawmakers are considering regulation barring all California police officers from running facial recognition programs on body cameras. Other Bay Area cities such as Berkeley and Oakland are considering following San Francisco’s lead in banning all applications for local police. And federal legislators — from both sides of the aisle — are holding hearings on Capitol Hill to examine how federal agencies are using the technology, and whether it deserves more scrutiny and stricter controls.

Taken together, these efforts, pushed by activists and politicians from the tech industry’s home base in the Bay Area, constitute something not often seen in Silicon Valley: an attempt to impose preemptive regulations on a rapidly developing technology.

From social media to smart speakers, technological innovations have upended entire industries and changed the fabric of everyday life, with minimal public debate beforehand and sometimes significant unintended consequences. What makes facial recognition different is an emerging consensus that it poses a unique and alarming threat to basic civil liberties — and once it becomes widespread, it may be too late to stop it.

“People don’t expect to have their identity, their location, and who they associate with logged every time they step outside and walk down the street,” said Matt Cagle, an attorney for the Northern California chapter of the American Civil Liberties Union, which has been a key part of the coalition pushing for stronger regulation. “That’s the kind of world that automated face surveillance would usher in.”

The state measure, Assembly Bill 1215, would ban law enforcement agencies across California from using any “biometric surveillance system” — which includes software that would identify people by tattoo, gait and other individually distinguishable physical characteristics — in real time on police body cameras or on footage collected by those cameras. After passing the Assembly in early May, the bill was set for a key hearing in the Senate Public Safety Committee on June 4.

Assemblyman Phil Ting, the lead author of the bill, sees it as a necessary follow-up to his 2018 legislation requiring law enforcement agencies to release body camera footage within 45 days of any incident in which police kill or seriously injure someone, or fire their guns.

“Body cameras were deployed to build trust with communities, to build more transparency and more openness,” said Ting, a San Francisco Democrat. “It really was not the intention of body cameras to have roving surveillance cameras on police.”

The bill states that biometric surveillance is the “functional equivalent of requiring every person to show a personal photo identification card at all times in violation of recognized constitutional rights,” regardless of consent. The technology, the bill argues, risks creating massive, unregulated databases about Californians never suspected of committing a crime, and “may chill the exercise of free speech in public places” because the identity of anyone in a crowd could be immediately discerned.

Formal opposition to the bill has come from the California Police Chiefs Assn., which said during an Assembly hearing that “prohibiting the use of biometric surveillance systems severely hinders law enforcement’s ability to identify and detain suspects of criminal activity.” Comparing images of suspects against facial recognition databases has helped solve cold cases years later, and police commonly cite mass shootings and terrorist attacks as scenarios in which citywide deployment of the technology could prove useful.

Ting said he is unaware of any police departments currently using the technology in concert with body camera footage.

In a statement, the Los Angeles Police Department said it does not use facial recognition technology in the department, though it has been used in limited instances in joint investigations with other agencies.

The Los Angeles Sheriff’s Department has conducted small pilots with body cameras but has not deployed them widely. But the department does rely on facial recognition technology as a way to generate leads in investigations, said Lt. Derek Sabatini, who manages the county biometric identification system.

Comparing suspect images against a database of Los Angeles County mug shots to surface possible persons of interest has proved valuable in solving crimes, said Sabatini, who drew a distinction between how it’s used today and its potential risks as a surveillance tool in real-time deployment.
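The distinction Sabatini draws maps onto a difference in how the software is run. Investigative use is a one-to-many search: the system ranks an entire mug shot database by similarity to a suspect image and hands a short list of candidates to a human examiner. A rough sketch of that workflow, with a purely hypothetical database, might look like this:

```python
# Sketch of one-to-many "lead generation": rank every enrolled mug shot
# by similarity to the probe image's embedding and return the top k as
# investigative leads for human review. The database is hypothetical.
import numpy as np
from typing import Dict, List, Tuple

def top_candidates(probe: np.ndarray, mugshots: Dict[str, np.ndarray],
                   k: int = 5) -> List[Tuple[str, float]]:
    scores = []
    for booking_id, emb in mugshots.items():
        sim = float(np.dot(probe, emb) /
                    (np.linalg.norm(probe) * np.linalg.norm(emb)))
        scores.append((booking_id, sim))
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:k]  # leads to be verified, not identifications
```

The ranked, human-reviewed output is what separates lead generation from the real-time scenario critics fear, in which a match on a live body camera feed would be acted on immediately.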

“Surveillance needs discussion,” Sabatini said. “We should talk about it and understand how it’s used — there’s a lot of trust issues with that, and it’s totally understandable.”

Skeptics say the risks inherent in facial recognition software far outweigh potential benefits.

There’s the problem of false positives. Researchers have shown that the software often turns up incorrect matches, especially when searches are run on images of darker-skinned people and women. An ACLU study found that Amazon’s facial recognition system, Rekognition, incorrectly matched the official photos of 28 sitting members of Congress with mug shots of people who had been arrested for crimes.
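The ACLU ran its test at Rekognition’s default setting, which flags any match scoring above 80 percent similarity; Amazon responded that law enforcement should use a 99 percent threshold. A toy simulation, using an entirely synthetic score distribution, illustrates why that single parameter matters so much:

```python
# Toy simulation of how the match threshold drives false positives.
# Each score is the best similarity a probe photo achieves against a
# database that does NOT contain that person, so anything above the
# threshold is a false match. The distribution is synthetic, chosen
# only for illustration.
import numpy as np

rng = np.random.default_rng(seed=0)
impostor_scores = rng.normal(loc=0.70, scale=0.06, size=535).clip(0.0, 1.0)

for threshold in (0.80, 0.99):
    false_matches = int((impostor_scores > threshold).sum())
    print(f"threshold {threshold:.2f}: {false_matches} false matches")
```

The demographic disparities researchers have documented compound the problem: if scores for one group skew higher against a given database, the same threshold produces more false matches for that group.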

The real-world consequences of relying on unproven algorithms to identify suspects became clear in San Francisco in 2009, when a false positive from an automated license plate reader led police to believe a woman named Denise Green was driving a stolen Lexus. Six officers stopped Green, forced her out of her car and made her kneel at gunpoint, and the city ultimately paid hundreds of thousands of dollars to settle lawsuits linked to her detention.

But even if the software were perfectly accurate, civil libertarians say that allowing police to check the identity of any passersby without consent constitutes an invasion of privacy and undercuts current California laws on the right to anonymity in public.

Their worst fears are already playing out in China, where the government uses facial recognition-equipped surveillance systems to track and target Uighurs, a largely Muslim minority, and maintain a social credit system that ranks — and blacklists — residents based on behaviors such as smoking and jaywalking.

Without going to those extremes, use of facial recognition by American law enforcement nevertheless runs the risk of drifting into uncharted waters. Clare Garvie, a senior associate at the Georgetown Law Center on Privacy and Technology who leads its research on law enforcement facial recognition technology, says the sheer speed and scale of the software’s capabilities imperil the presumption of privacy, especially when used in real time.

A police body camera connected to a facial recognition system could theoretically allow officers working crowd control at a political protest to check protesters for criminal records or simply log their presence. In London, the Metropolitan Police has already begun parking vans equipped with cameras running facial recognition algorithms along busy sidewalks to test out the system, and in one instance reportedly ticketed a man who tried to hide his face from view while walking past.

“We have this idea that law enforcement can’t search you and can’t demand identification unless you are suspected of wrongdoing,” Garvie said. “But if everyone who walks by an officer is being searched and compared against their criminal history or a watch list of crimes, that means there is a search happening before any suspicion is generated.”

Unlike some states, California has no law requiring that people provide identification to law enforcement officers on request, though drivers are required to show licenses during traffic stops.

Deployed in real time on police body cameras, critics say, facial recognition could heighten the potential for deadly escalation. A police officer whose camera misidentifies a stopped motorist as having an outstanding warrant and a history of violent crime might be more likely to approach with a gun drawn.

“Inaccurate technology in the hands of armed law enforcement is not going to make us safer,” Cagle said. “It will result in additional dangerous encounters between law enforcement and the public, and false identifications could lead to the use of force and the loss of life.”

Axon, the Taser manufacturer and leading police body camera provider in the U.S., said in a statement that it is not actively working on facial recognition technology. An April investigation by the Financial Times found that the company had taken out patents and acquired companies related to facial recognition, but Axon said that those systems were only used for automatic face redaction for body camera footage, and noted that it had established a policing technologies ethics board to build in safeguards for any future use of the systems.

Motorola Solutions, another major body camera provider, declined to comment but has stated its intention to develop facial recognition technology for body cameras.

The big software companies building facial recognition software are split on its use. Microsoft President Brad Smith said in April that the company refused to sell its technology to a California law enforcement agency over human rights concerns, and the company has publicly called for regulation of the technology to prevent “a commercial race to the bottom.” Amazon, despite criticism from shareholders and activists over its facial recognition programs, is continuing to sell its Rekognition service to law enforcement.

Facial recognition is quickly making its way into daily life via commercial technologies such as Apple’s Face ID unlocking feature and Facebook’s automated photo tagging. JetBlue recently became the first U.S. airline to allow passengers to submit to a face scan in lieu of showing a ticket and ID at boarding, and some retailers and restaurants already use facial recognition to help with loss prevention and customer tracking.

But companies may not have the final word in how the technology is deployed. Oakland’s and Berkeley’s city governments are considering adding a ban to their local ordinances, and facial recognition has been the subject of two House Oversight Committee hearings, in which both Democratic and Republican representatives have expressed support for a moratorium on the technology’s use.

Those hearings revealed that the FBI has amassed a database of more than 640 million photographs for its facial recognition program, including driver’s license photos from 21 states (not including California).

Brian Hofer, one of the architects of San Francisco’s ban and the chairman of Oakland’s Privacy Advisory Commission, believes the movement for more regulation is still gaining strength.

“People believe that it’s inevitable that there’s going to be more and more surveillance, more and more police state power, and technology is going to keep creeping into our lives,” Hofer said. “But we still have the freedom and ability to say no.”

———

©2019 Los Angeles Times
