Report finds racial bias in facial recognition technology

More than 40 rights groups asked the Department of Justice to launch a probe examining whether systems used by police to investigate crimes disproportionately identify blacks as criminal suspects.

Officials use a facial recognition program to scan visitors at the Statue of Liberty in New York City.

Chip East/Reuters

October 18, 2016

US law enforcement agencies store images of 117 million adults as part of facial recognition programs that have become critical tools in modern police work. 

But a report released on Tuesday raises serious questions about racial bias built into these systems designed to identify suspects, saying the technology disproportionately singles out blacks in criminal investigations. 

The year-long study from Georgetown Law's Center on Privacy and Technology charted the rapid increase in facial recognition programs at 52 police agencies nationwide. The programs contain mug shots, images from driver's licenses, and other pictures cataloged in systems created without legislative approval and operated without legal oversight, according to the study.

"These algorithms don't see race in the same way you and I do, but that doesn't mean they're not racist," said Jonathan Frankle, a PhD student at the Massachusetts Institute of Technology who worked the study as a technologist with the Center on Privacy and Technology.

"When you have darker skin, there's less information in the photo because your skin reflects light differently. So having darker skin means it's harder to differentiate faces, which is bad because if you're going to use this as a policing tool you want this to be as accurate as possible."

According to the study, facial recognition systems are 5 to 10 percent less accurate when identifying the faces of black adults than the faces of white adults in the system.
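That figure lends itself to a concrete illustration. The sketch below, written against purely hypothetical data, shows how an auditor might compute per-group identification accuracy from a log of match outcomes; the accuracy_by_group helper, the group labels, and the numbers are all invented for illustration and are not drawn from the Georgetown study.

    # Illustrative sketch only: per-group accuracy from hypothetical
    # match outcomes. Nothing here comes from the Georgetown study.
    from collections import defaultdict

    def accuracy_by_group(results):
        """results: iterable of (group, correct) pairs, where `correct`
        is True when the system identified the right person."""
        totals, hits = defaultdict(int), defaultdict(int)
        for group, correct in results:
            totals[group] += 1
            hits[group] += correct  # True counts as 1, False as 0
        return {g: hits[g] / totals[g] for g in totals}

    # Hypothetical audit log: 92 of 100 correct matches for one group,
    # 84 of 100 for the other.
    log = ([("white", True)] * 92 + [("white", False)] * 8
           + [("black", True)] * 84 + [("black", False)] * 16)

    print(accuracy_by_group(log))
    # {'white': 0.92, 'black': 0.84} -- an 8-point gap of the kind
    # the report describes.

A disparity surfaced this way is exactly the kind of finding civil liberties groups want regulators and police departments to look for before the technology is used in investigations.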

In conjunction with the Georgetown report, more than 40 civil liberties organizations, including the American Civil Liberties Union and the Leadership Conference on Civil and Human Rights, asked the Justice Department to examine whether facial recognition systems contribute to racial bias in policing.

"A growing body of evidence suggested that law enforcement use of face recognition technology is having a disparate impact on communities of color, potentially exacerbating and entrenching existing policing disparities," the groups said in a letter to the Justice Department.

As public surveillance becomes commonplace in cities across the country, police departments can use databases of facial images to identify potential suspects caught on video cameras in stores, on city streets, or on public transportation, for instance.

"This is a fundamental change in policing where everything you do in public is trackable not through your technology, but through your body," said Alvaro Bedoya, executive director of the Center for Privacy and Technology, during a press conference Tuesday. "And this technology is not limited to serious criminals. It's not limited at all, really."

The report's authors recommend that police departments take several steps to limit the potential for racial bias when facial recognition programs are used in investigations. Police should obtain a court order to conduct mass searches of facial images, they said. And the researchers recommended internal audits of how facial recognition programs are used.

Society has adopted "this technology first, and decided to ask questions later," said Clare Garvie, an associate at Georgetown's Center on Privacy and Technology. "That approach is fundamentally flawed."

The Georgetown report comes one week after the ACLU revealed that police in Baltimore and Ferguson, Mo., monitored protests in real time through Facebook, Twitter, and Instagram posts. Baltimore police also used the state of Maryland's so-called Image Repository System, which accesses police mug shots and driver's license photos, to surveil protests following the death of Freddie Gray in police custody last year.

"We have to ask ourselves, why was facial recognition used at this protest, and will facial recognition chill free speech," Ms. Garvey said. "If so, the damage from this technology extends to the community writ large if the people don't have a safe space to express themselves."