Long before "Yahoo" was adopted by the well-known tech giant, the word was coined by Jonathan Swift in the classic "Gulliver's Travels" to describe a race of brutish humans. It was a word meant to evoke disgust. It seems we have come full circle: online hate groups have once again turned the word into a slur. But now, instead of referring to an imagined populace, the vile epithet is aimed at Mexicans.
"Yahoo," along with "Google," "Skype," "Bing," "Skittle," "butterfly," "fishbucket" and "durden" are the alt-right's new lexicon of hate. They are code words used to describe minority groups such as black people, Muslims and those who identify as LGBTQ. They were created by commenters on one of 4Chan's most hate-filled message boards as an attempt to avoid filters that have been set up by companies like Google to battle internet abuse—and they are almost as ridiculous as they are deplorable.
Last month, Google unveiled new software that uses machine learning to automatically flag language associated with harassment and abuse. On 4chan, the news inspired protestations of Googlian overreach.
"Even if this shit were to happen (it won't), people would simply meme new definitions of shit into existence." one anonymous commenter wrote in response. "All it would take would be for internet arseholes to use a common word in place of something racist, like if they started using 'google' instead of 'n*****' what the f*** are they going to do then?"
Voilà. Before the proverbial ink was dry, the latest trend in viral hate was born.
Racist Trump twitter has come up with a new coded way to share racial slurs w/ each other and avoid account suspension. pic.twitter.com/J4AmaHFVCd
— Alex Goldman (@AGoldmund) October 1, 2016
Confronted repeatedly with rampant sexism, bigotry and hate online, internet companies have invested resources in trying to quell it, or at least hide it from view. Google's new tool, slated for beta testing by Wikipedia and The New York Times, seeks to suss out trolling by monitoring not just which words a commenter uses, but how. “What’s up, bitch?” for instance, gets a score of 63 out of 100 on the attack scale, while “What’s up bitches?” gets only 45.
By using words in untraditional contexts, the campaign its supporters have dubbed "Operation Google" hopes to escape detection by anti-abuse efforts. And if the filters get smarter, the Operation can simply change the terms of the game again.
The fight against internet abuse can start to seem like a Sisyphean game of Whack-A-Mole. After Instagram introduced a feature that lets its users block comments that use particular words or phrases, harassers simply started spelling the words differently. Taylor Swift, for instance, used it to block commenters who posted emoji snakes, but some users simply switched over to creative spellings of the word "snake" instead. After Twitter cracked down on racist abusers attacking "Ghostbusters" actress Leslie Jones, trolls simply moved on to hacking her website. The infinite possibility of the internet means there are infinite venues from which to launch an attack.
Words matter. But it works both ways. Calling someone a "Google" does not have quite the same heft as calling someone that other unwritable term. One is a word for "black" distorted into an unsettling racial slur, loaded with a long and painful historical legacy of hate. The other is a company name, itself a misspelling of "googol," the word for the digit 1 followed by 100 zeros. It sounds like something a baby might say before it learns to speak, or the language of something furry and cute, like an Ewok. Were someone to call me a "Bing," a "Skittle" or a "fishbucket," I cannot quite tell whether I would be moved to laugh or cry.
We will never wipe harassment from every corner of the internet, but in forcing alt-right hatemongers to rely on absurd linguistic work-arounds to spread messages of fear and antagonism, we are dulling the power of their punch. Driving the alt-right to create its own loathsome lexicon means its messages may become so inscrutable that they are altogether lost in translation. The troll is driven, at least in part, by getting a rise out of its target. And it's hard to get a rise out of someone when nobody understands what you are saying.
“It’s hard for me to imagine a world where there’s not a continued cat-and-mouse game," Jared Cohen, the head of Google's Jigsaw, told Wired about its new abuse-fighting tool. "But over time, the mouse might just become bigger than the cat.”
We may have to keep whacking the mole, but eventually it will get slower and have fewer places to hide. This new language is a response to such constraints. It is a sign that in the war against online harassment, we may finally be winning.