Google Photos identified black people as 'gorillas,' but racist software isn't new

Google has come under fire recently for an objectively racist “glitch” in its new Photos application for iOS and Android that identified black people as “gorillas.”

In theory, Photos is supposed to act like an intelligent digital assistant. Its underlying algorithms can categorize your entire camera roll based on a number of different factors like date, location, and subject matter. Apparently, however, at least one black user has reported that the app categorized him and a black friend as “gorillas,” as opposed to people.

On Sunday, Google Photos user Jacky Alcine tweeted a screenshot of the application showing a number of his pictures organized into different albums. While the app’s algorithm correctly identified pictures of a “graduation,” “skyscrapers,” and “airplanes,” it labeled photos of Alcine and a female friend as gorillas.

https://twitter.com/jackyalcine/status/615329515909156865/

https://twitter.com/jackyalcine/status/615331869266157568/

Yonatan Zunger, a senior software engineer for Google, quickly tweeted back at Alcine, assuring him that the mistake was a bug that would be fixed immediately. Alcine, to his credit, explained that he understood how algorithms can misidentify things in ways that humans don’t, but he questioned why this type of issue in particular was still such a problem for a software giant like Google.

“We’re appalled and genuinely sorry that this happened,” an official Google statement on the matter read. “There is still clearly a lot of work to do with automatic image labeling, and we’re looking at how we can prevent these types of mistakes from happening in the future.”

As nice as it is of Google to assure us that something like this is a freak instance of coding-gone-wrong, it’s hardly the first time that we’ve seen software show an implicit bias against people of color.

One of the best-known instances of technology snubbing its owners came from digital cameras that assumed their subjects’ eyes were closed while smiling. The cameras’ face-detection software mistook the shape of Asian eyes for blinking, prompting the camera to flag the resulting photos as flawed.

Sadly, there’s more.

The software built to support the sensors in many digital cameras and webcams has, at times, simply failed to perceive people with darker skin tones at all.

Back in 2010, a series of HP computers was widely affected by these so-called “racist” webcams. Five years later, similar software gaffes still plague services like Flickr: last month the photo-sharing network rolled out an auto-tagging algorithm of its own, promising to help users label their photos more effectively, and on two separate occasions it identified a black man and a white woman as apes. Suffice it to say that this problem isn’t exactly going away.

The mistakes happen because algorithms, smart as they seem, are terrible at making actual sense of the pictures they analyze. Instead of “seeing” a face, an algorithm identifies shapes, colors, and patterns and makes an educated guess about what the picture might actually be. That works wonderfully for inanimate objects and iconic things like landmarks, but it has proven to be a sticking point for people of color time and time again.
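For a rough sense of how this kind of automatic labeling works, here is a minimal sketch, not Google’s or Flickr’s actual pipeline: it assumes an off-the-shelf pretrained ImageNet classifier from torchvision as a stand-in model, and bolts on a confidence threshold plus a blocklist of sensitive labels as one blunt, purely illustrative mitigation. The file path in the usage example is hypothetical.

```python
# A minimal sketch of pattern-based image labeling (NOT Google's pipeline):
# a pretrained ImageNet classifier (stand-in assumption) maps pixel patterns
# to label guesses, and a crude confidence threshold plus blocklist keeps
# the most harmful guesses from ever being applied.
import torch
from PIL import Image
from torchvision import models, transforms

# Labels we refuse to auto-apply, no matter how confident the model is.
BLOCKED_LABELS = {"gorilla", "chimpanzee", "orangutan", "monkey", "ape"}

# Standard ImageNet preprocessing: the model never "sees" a face,
# only a normalized grid of pixel values.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
categories = weights.meta["categories"]  # human-readable ImageNet labels


def suggest_label(path, min_confidence=0.6):
    """Return (label, confidence) for a photo, or None if the model is
    unsure or its top guess is on the blocklist."""
    image = Image.open(path).convert("RGB")
    batch = preprocess(image).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(batch)[0], dim=0)
    confidence, index = torch.max(probs, dim=0)
    label = categories[index]
    blocked = any(term in label.lower() for term in BLOCKED_LABELS)
    if confidence.item() < min_confidence or blocked:
        return None  # better to leave a photo untagged than to mislabel it
    return label, confidence.item()


# Example usage (hypothetical file path):
# print(suggest_label("camera_roll/IMG_0042.jpg"))
```

The threshold and blocklist only make one point: leaving a photo untagged is a far cheaper failure than applying a dehumanizing label. Guardrails like these don’t fix the underlying problem, which lives in the training data and how these systems are tested.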

Perhaps if the titans of Silicon Valley hired more engineers of color, things like this wouldn’t happen so often. Or, you know, ever.
