Getty Images

The short answer to why Google's algorithm returns racist results is that society is racist. But let's start with a lil' story.

On June 6 (that's Monday, for those of you keeping track at home) Kabir Alli, an 18-year-old in Virginia, posted a brief video of himself running a couple of quick Google image searches. First he searched for "three black teenagers" and was met with several rows of decontextualized mugshots. Then he searched for "three white teenagers" and was served up stock photos of relaxed teens hanging out in front of various plain white backgrounds.


The tweet has been retweeted 67,687 times as of this writing. On Thursday Alli told The Guardian that he'd been told about the differing results by friends, but that "when I saw the results for myself I was shocked."

He also told the paper that he doesn't think Google is racist. He noticed that some people were accusing the company of racism in responses to his tweet, and offered up a rejoinder.

“The results were formed through the algorithm they set up. They aren’t racist but I feel like they should have more control over something like that.”


Google agrees, at least when it comes to whether or not it's racist. A spokesperson for Google offered the following statement to FUSION via email:

Our image search results are a reflection of content from across the web, including the frequency with which types of images appear and the way they’re described online. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what image search results appear for a given query. These results don’t reflect Google’s own opinions or beliefs — as a company, we strongly value a diversity of perspectives, ideas and cultures.


As many have already pointed out, this isn't the first time this has happened. Earlier this year, grad student Bonnie Komona posted the results for "unprofessional hairstyles for work," which showed photos of black women and those for "professional hairstyles for work," which mostly displayed photos of white women.


There are cases where Google will intervene when its algorithm does bad things, such as last summer, after Google Photos' auto-tagging program suggested that two young black people were gorillas. In that case a high-ranking engineer at the company apologized. There was also the 2009 instance in which the company seemed to remove a racist image comparing Michelle Obama to a gorilla that appeared high in her search results.

The search results for the two terms Alli used remain largely the same for now, although, as is often the case, they're now also full of side-by-side comparisons from news articles. Here's what they look like as of this writing:

Search results for "three black teenagers."
Search results for "three white teenagers."

And just for contrast, here are the results for "three latino teenagers:"

Search results for "three latino teenagers."

And here's "three asian teenagers," where you will see a different kind of bias. It almost always shows female teens, often scantily clad. (These aren't personalized to me, by the way; my colleagues got the same results.)

Search results for "three asian teenagers."

As BuzzFeed News explained back in April, a number of factors play into which images appear first in Google's image results, including "[t]he popularity of the image, how frequently it is shared, context such as text around the image, and meta-tagging." Meta-tagging (or metadata) is information about an image supplied by the page it appears on or embedded in the file itself. So in the case of "three black teenagers," that description is likely coming from the sites that posted the mugshots, and people are probably clicking on those images. In general, Google's algorithms are acting on a lot of inputs.
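To make the mechanism concrete, here is a deliberately simplified sketch of how signals like these might be combined into a ranking. Everything about it, the signal names, the weights, the normalization, is hypothetical and illustrative; Google's actual ranking is far more complex and not public. The point is only that a text match against surrounding copy and meta-tags, plus click and share counts, can surface an image without anyone at the search engine choosing it.

```python
# Toy ranking sketch: a weighted sum of the kinds of signals BuzzFeed News
# describes (popularity, shares, surrounding text, meta-tagging).
# All field names and weights are invented for illustration.

def rank_images(images, query):
    """Return images sorted by a hypothetical relevance score, best first."""
    q = query.lower()

    def score(img):
        # Context/metadata match: does the query appear in the text around
        # the image or in its meta-tags? (Mugshot pages captioned
        # "three black teenagers" would score 1.0 here.)
        text = (img["surrounding_text"] + " " + img["meta_tags"]).lower()
        text_match = 1.0 if q in text else 0.0

        # Popularity signals, scaled to keep this toy roughly balanced.
        return (3.0 * text_match
                + 1.0 * img["clicks"] / 1000
                + 0.5 * img["shares"] / 1000)

    return sorted(images, key=score, reverse=True)

images = [
    {"url": "stock.jpg",
     "surrounding_text": "stock photo of smiling friends",
     "meta_tags": "teens, friendship", "clicks": 200, "shares": 50},
    {"url": "mugshot.jpg",
     "surrounding_text": "three black teenagers arrested, police said",
     "meta_tags": "mugshot", "clicks": 900, "shares": 400},
]

ranked = rank_images(images, "three black teenagers")
print([img["url"] for img in ranked])
```

In this sketch, the mugshot page outranks the stock photo purely because its caption matches the query and it gets more clicks. No rule in the code mentions race at all, which is exactly how an "unbiased" aggregation of biased inputs produces biased results.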

Alli's thought that Google "should have more control over [image results]" is a fair one. Google makes plenty of big, discretionary decisions about ad results and what appears in its other apps, but when it comes to search results it broadly sticks to the line that it's simply very good at efficiently turning up what people are looking for on the internet. In other words: it's not racist, society is racist.


And fair enough, society is racist. But Alli raises the very fair question of what role society's institutions ought to play in allaying that racism rather than abetting it. Google may turn up what people on the web want, but because of its role as the world's most popular search engine, it also reinforces those wants. Being at the top of those rankings can make all the difference in the world, and there's a big industry based solely on getting there. But Google is ultimately a U.S. company, and will pursue its fiduciary interests above anything else, interests that rely in part on not futzing too much with search results.

So for now, our best option may be something a little more difficult: trying to make society as a whole less racist.


Guess we'd better get on that.

Ethan Chiel is a reporter for Fusion, writing mostly about the internet and technology. You can (and should) email him at
