African Americans are more likely to be targeted by face recognition software

On Tuesday, a 150-page report released by Georgetown University’s Center on Privacy & Technology found that an astounding 117 million Americans, nearly half of all adults in the country, have their images stored in face-recognition databases searchable by federal, state, and local authorities. The databases are compiled primarily from images like mugshots, driver’s license photos, and passport and visa pictures. Georgetown found that more than 4,000 police departments, roughly 1 in 4, use face recognition databases. The FBI’s database, many times larger than those of local police departments, is also sourced largely from non-criminal images, meaning that inclusion in a face recognition database (unlike fingerprint and DNA databases) isn’t reserved for criminal suspects.

Titled “The Perpetual Line-Up,” the report finds that African Americans, who are arrested at higher rates and thus more likely to appear repeatedly in these databases, are disproportionately impacted because of the increased level of policing in black communities. The report notes that, in certain states, black Americans are arrested at rates up to three times their share of the population, over-enrolling them in face databases. (For context, in 2013, Ferguson issued 1,500 arrest warrants for every 1,000 people in the mostly black city.) And the Maricopa County, Arizona police department “uploaded the entire driver’s-license and mug-shot database from the government of Honduras, a major source of immigration to Arizona.”

Database images aren’t “scrubbed” when people are found innocent, meaning a simple arrest—not a conviction—can tie anyone to the database indefinitely. Even more distressing, face recognition software is not routinely audited for accuracy, and it tends to be less accurate at distinguishing between darker-skinned faces. And even though appearing in the database is not tantamount to being arrested or charged, the ACLU said a pattern of racially biased surveillance and suspicion emerges:

“A growing body of evidence…suggests that law enforcement use of face recognition technology is having a disparate impact on communities of color, potentially exacerbating and entrenching existing policing disparities. Face recognition systems are powerful—but they can also be biased.”

In a statement to the Washington Post, the FBI, which has access to more than 400 million total face images, denies accusations of racial bias:

“Facial recognition algorithms are developed in the computer vision field, based solely on pattern matching techniques. ‘Facial recognition’ algorithms do not actually compare ‘faces’ and they do not consider skin color, sex, age, or any other biographic.”

Face recognition software uses a picture of your face and measures certain discrete traits: the depth of your eye sockets, the width of your nose, the distance between your eyes, etc. From there, it creates a “faceprint”—the unique measurements that make up your face—then uses algorithms to compare your faceprint to thousands, perhaps millions, of others. A concrete example of this is Facebook’s “suggest tag” system.

Facebook photos aren’t part of the police database or Georgetown’s report (though Facebook data may still end up in police hands). But the “suggest tag” system, which helps you identify friends after uploading photos, shows how common face recognition software already is.
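To make the “faceprint” comparison concrete, here is a minimal sketch in Python. The measurements, names, and matching threshold below are illustrative assumptions, not any vendor’s actual algorithm; modern systems typically derive faceprints from learned neural-network embeddings rather than a handful of hand-picked distances, but the matching step works the same way: the closest enrolled faceprint under some threshold wins.

```python
import math

# A "faceprint" here is just a vector of normalized facial measurements.
# The feature choices and numbers are illustrative assumptions only.
def faceprint(eye_distance, nose_width, socket_depth, jaw_width):
    return (eye_distance, nose_width, socket_depth, jaw_width)

def distance(a, b):
    # Euclidean distance between two faceprints: smaller means more similar.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, database, threshold=0.25):
    # Compare the probe faceprint against every enrolled faceprint and
    # return the closest identity if it falls under the match threshold.
    candidate, best = None, float("inf")
    for name, enrolled in database.items():
        d = distance(probe, enrolled)
        if d < best:
            candidate, best = name, d
    return candidate if best <= threshold else None

# Hypothetical enrolled database keyed by identity.
database = {
    "person_a": faceprint(0.42, 0.31, 0.18, 0.55),
    "person_b": faceprint(0.39, 0.36, 0.22, 0.49),
}

print(best_match(faceprint(0.41, 0.32, 0.19, 0.54), database))  # -> person_a
print(best_match(faceprint(0.10, 0.60, 0.40, 0.20), database))  # -> None
```

The threshold is where error rates live: loosen it and the system returns more false matches, the very failure mode the report says agencies rarely audit for.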

And it’s only likely to get more popular, particularly in the policing of diverse communities. On a conference call with members of the press, Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law and one of the report’s authors, said that police departments in Chicago, Los Angeles, and Dallas, among others, have expressed interest in real-time face recognition technology, which would scan people’s faces in specific areas as captured by CCTV surveillance cameras. Bedoya says that, by placing this technology in public spaces, authorities are sidestepping their responsibility to establish “reasonable suspicion” before surveilling suspects.
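To illustrate what “real-time” use would mean in practice, here is another hedged sketch in Python; scan_stream, detect_faces, extract, and nearest are hypothetical names, not a real surveillance API. The point it makes concrete is Bedoya’s concern: every face in every frame is checked against a watchlist, whether or not any individualized suspicion exists.

```python
import math

def scan_stream(frames, detect_faces, extract, watchlist, match):
    """Check every face in every video frame against a watchlist.

    detect_faces, extract, and match are hypothetical stand-ins for a face
    detector, a faceprint extractor, and a matcher (such as the best_match
    sketch above). Yields (frame_index, identity) for each watchlist hit.
    """
    for i, frame in enumerate(frames):
        for face in detect_faces(frame):
            hit = match(extract(face), watchlist)
            if hit is not None:
                yield i, hit

# Minimal stand-ins so the sketch runs on its own: each "frame" already holds
# faceprint vectors, and matching is a simple nearest-neighbour threshold check.
def nearest(probe, watchlist, threshold=0.25):
    name, dist = min(
        ((n, math.dist(probe, p)) for n, p in watchlist.items()),
        key=lambda item: item[1],
    )
    return name if dist <= threshold else None

watchlist = {"person_a": (0.42, 0.31, 0.18, 0.55)}
frames = [[(0.41, 0.32, 0.19, 0.54)], [(0.10, 0.60, 0.40, 0.20)]]

hits = scan_stream(frames, detect_faces=lambda f: f, extract=lambda x: x,
                   watchlist=watchlist, match=nearest)
print(list(hits))  # -> [(0, 'person_a')]
```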

“Face recognition technology is rapidly being interconnected with everyday police activities, impacting virtually every jurisdiction in America,” the ACLU letter states. “Yet, the safeguards to ensure this technology is being used fairly and responsibly appear to be virtually nonexistent.”
