What GitHub did to kill its trolls

A few years ago, the software start-up GitHub faced an uncomfortable truth: It could be a pretty unpleasant place.

It was 2014 and the company was growing rapidly as a hub for programmers to collaborate on coding projects. But as its user base grew, so too did its problems. A GitHub developer, Julie Ann Horvath, left the company amid searing accusations of sexual and gender-based harassment, putting GitHub at the center of bad press for weeks and leading to the resignation of the company’s CEO.

To make matters worse, GitHub soon realized such problems weren’t limited to the office. Bullying and discrimination ran rampant on the site. There was systemic discrimination against women, with female coders often taken less seriously than their male peers. Petty disagreements devolved into flame wars in project comments. A bitter ex followed his former girlfriend from project to project, saying nasty things about her. And racist, sexist trolls sometimes co-opted features meant to enable collaboration to carry out vicious attacks, using, for example, a people-tagging feature to tag their targets on projects with racist names, transforming their victims’ portfolios into walls of racist epithets.

Nicole Sanchez, the company’s VP of Social Impact, told me that these are the “dangers and pitfalls of online life,” and not unique to GitHub. But GitHub wanted to try to prevent them.

It might surprise you that a website built for programmers to share code could become a hotbed of online harassment. But GitHub, valued at $2 billion, is a social network at heart, a combination of Facebook and LinkedIn for computer programmers, and it involves a lot of user-to-user interaction. And with that, on the internet, usually comes abuse.

Hoping to recover and heal its bruised image, GitHub hired Sanchez, who at the time had just started Vaya, a diversity consulting firm.

“We want to connect every developer in the world, and to do that we need to build an inclusive community where everyone feels safe and welcome,” said CEO Chris Wanstrath.

Sanchez got to work, revamping how the company approached everything from hiring and performance reviews to office decor. Since its beginning, the company had been non-hierarchical, with no managers or titles, but Sanchez helped dismantle that structure after finding that without bosses, people weren’t held accountable when they behaved badly. She tweaked internal processes to make the environment friendlier to a diverse workforce, for instance by creating a formal feedback process for complaints. And she hired February Keeney, a half-Puerto Rican transgender woman, to lead a new Community and Safety team to attack the problem of harassment on the site.

It was a difficult stance to take given the existing culture in Silicon Valley. GitHub, like so many tech companies, had long feared clamping down on what its users could say and do. Many techies feel that the internet is supposed to be open and free and that cracking down on even the most unseemly user behavior infringes on rights to free speech. Twitter, for example, had long refused to address its own problem with abuse, referring to itself as the “free speech wing of the free speech party.”

“People were so dogmatic about open source,” said Sanchez. “It meant that it has to be open all the time and accessible to everyone without question.”

Change-averse GitHub employees complained anonymously in the press that Sanchez was trying “to control culture,” but eventually she won most of them over.

“It’s not just that harassment is unpleasant,” Sanchez told me. “It’s that we were losing people.”

A 2014 survey of women who had recently left the tech industry found that culture—including harassment—was a major factor in their decisions. GitHub viewed a diverse user base as essential to the company’s success and decided it needed to snuff out harassment to achieve it.

GitHub didn’t just need a new code of conduct—it needed to consider how every tiny detail of its design might be exploited to harass. Trolls be damned.

GitHub is not the only Silicon Valley company to have realized that ugly online behavior will not go away on its own. Two years ago, technology companies typically met calls to crack down on bullying with either a defense of free speech or a shrug. But scandal, criticism and harassment horror stories have forced a reconsideration of that approach.

When former Twitter CEO Dick Costolo famously admitted last February that “we suck at dealing with abuse,” it was a call to arms. Since then, Twitter and other companies have vigorously rolled out attempts at solutions. In September, Instagram announced a new feature allowing individual users to block offensive words. This fall, Google revealed it was building A.I. to combat internet trolls. Even the internet’s seedy underbelly, Reddit, has placed bans on its most toxic quadrants.

“There’s a general thing in the zeitgeist right now,” Julio Avalos, GitHub’s chief business officer, told me. “There has been a shift in what employees expect of their employers and what customers expect of companies. People are going to start voting with their feet.”

Twitter is Silicon Valley’s cautionary tale of what happens when you ignore the zeitgeist. Over its decade of existence, Twitter has mostly ignored abuse, making it a prime destination for trolls and hate. High-profile users have fled the network, citing harassment. As the embattled company has struggled to find a buyer in recent months, some have speculated that Twitter’s harassment problem has played a role.

Trolls have become the scourge of the internet era. The sad fact of the matter is that the internet is chock-full of a**holes, and something really ought to be done about it.

But how do you rid the online world of violent verbiage and hatred when violence and hatred so thoroughly permeate the world itself? To use Twitter again as the unfortunate example, over the past two years it has banned revenge porn, issued new anti-harassment rules, established a trust and safety council and suspended high-profile users it considers abusive. And still, it seems, abuse has flourished.

“On Twitter,” BuzzFeed’s Charlie Warzel wrote earlier this year, “abuse is not just a bug, but—to use the Silicon Valley term of art—a fundamental feature.”

There is no miraculous healing salve for an internet of hate.

“There is no end of ideas about solutions for online harassment,” said Nathan Matias, a researcher at MIT studying ways to reduce harassment and discrimination online. “There is a universe of possible outcomes, and right now we have very little evidence that any one solution will lead to the outcome desired.”

When GitHub decided to crack down on harassment, it also decided to hire Coraline Ada Ehmke, a transgender woman and, up until then, one of GitHub’s most vocal critics.

Ehmke had been a victim of a glaring flaw in GitHub’s design. She was the author of the “Contributor Covenant,” a voluntary code of conduct adopted by many GitHub projects. Not everyone in the free-wheeling open-source community appreciated Ehmke’s contribution, though, and some users went after her. GitHub had no feature allowing users to opt out of being tagged by others, and so bullies began tagging Ehmke as a contributor to made-up projects with racist names, marring her GitHub profile, a portfolio of all of her open-source work. It was as if someone had tagged a swastika across her resume, then doled it out to future employers.

“Harassers are very clever. They take advantage of tools that are very innocuous and use them as vectors for abuse,” Ehmke told me. “If you’re creating products and not thinking about how it could be used for abuse, you are not doing your job.”

When GitHub hired Ehmke last February as a senior engineer, some were outraged by the direction her hire suggested GitHub was headed.

“At the start of my career, I had a lot of male privilege,” Ehmke told me. “Intellectually I knew that things like this happened, but until I transitioned I really didn’t fully understand. Open source is not very welcoming to people who are not male or white.”

Everyone I spoke with at GitHub underscored that the most important step in addressing harassment and diversity on GitHub was first solving those problems within the company itself.

“If there is not diversity at the front end of the funnel there is no ability to be able to deliver diversity to your customers on the other side,” Avalos, who is Guatemalan and joined GitHub in its boss-less era, told me. “We don’t want to have blinders to things in the product that are alienating people.”

The Community and Safety team, made up of six people, includes two people who identify as transgender, four as women, three as people of color, and two “token white men.” It is, in other words, a lot more diverse than your average Silicon Valley engineering team.

Their job isn’t just to build new anti-harassment tools, but to vet new GitHub features and anticipate how they might be used for abuse.

“We’re not just an engineering group at GitHub,” Ehmke told me. “We’re considered critical infrastructure. At GitHub, these things are as important as keeping the lights on.”

The biggest change the team has made is asking GitHub engineers to build “consent and intent” into the platform. Users should have the right, for example, to consent to being tagged by another user; that would have prevented Ehmke’s racist tagging experience. So GitHub tweaked the project-tagging feature to require the tagged user’s approval.
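
A minimal sketch of what a consent gate like that might look like, in Python. Everything here (the names, the fields, the flow) is a hypothetical illustration rather than GitHub’s actual code; the point is simply that a tag becomes a pending request until the tagged user approves it.

```python
# Hypothetical sketch of a consent-gated tagging flow; not GitHub's code.
from dataclasses import dataclass, field

@dataclass
class User:
    username: str
    requires_tag_approval: bool = True   # the consent setting

@dataclass
class Project:
    name: str
    contributors: set = field(default_factory=set)
    pending_tags: set = field(default_factory=set)

def tag_contributor(project: Project, target: User) -> str:
    """Attempt to tag a user on a project, honoring their consent setting."""
    if target.requires_tag_approval:
        project.pending_tags.add(target.username)
        return "pending"   # nothing appears on the target's public profile
    project.contributors.add(target.username)
    return "tagged"

def approve_tag(project: Project, target: User) -> None:
    """Runs only when the tagged user explicitly accepts the request."""
    if target.username in project.pending_tags:
        project.pending_tags.discard(target.username)
        project.contributors.add(target.username)
```

Under a scheme like this, a troll who tags a victim on a project with a racist name creates nothing but an invisible pending request, one the victim can simply decline.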

Intent, though, is trickier. Not every person who says something that sounds offensive on GitHub intends to actually say something offensive. “You suck” can be a mean-spirited jab or a playful joke between friends.

“We realized that harassing behaviors really fall into two buckets,” said Keeney. “There is the intentional bigot. And then there is the person who, like, tells Asian driver jokes without realizing they are racist.”

GitHub needed a way to handle offensive behavior with more dexterity and nuance.

Last month, the company released a proposal for community guidelines. It included rules against banned behaviors—doxxing, discrimination and bullying—and spelled out clearly what constitutes them. There are consequences for breaking the rules, from content removal to account termination.

Being GitHub, it has asked its community for feedback. One user looked at the proposed guidelines and suggested that its ban on pornography could shut out projects having to do with sex education or reproductive health. Ultimately, the best way to meet community needs, GitHub decided, is to ask for the community’s help.

Moderating comments, for example, might be a job best shared by GitHub and open-source project managers, who are better equipped to tell whether a potentially offensive joke actually offended.

“Right now, the only options moderators have are to delete comments, report an issue to us or block a person from their project,” Keeney said. “But we want to make sure that we offer a range of tools for moderators to respond to problems. A range of problems requires a range of responses.”

Eventually, Keeney told me, GitHub plans to roll out a variety of tools that will let project managers do things like ban a troublesome member for just a few days.
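
Here is one plausible shape for such a tool, sketched in Python. The time-limited ban below is an invented illustration of the idea Keeney describes, not GitHub’s implementation: a ban carries an optional expiry, and once the expiry passes it simply stops applying.

```python
# Hypothetical sketch of a time-limited project ban; not GitHub's code.
from datetime import datetime, timedelta, timezone
from typing import Optional

class ProjectModeration:
    def __init__(self) -> None:
        # username -> expiry time, or None for a permanent ban
        self._bans: dict[str, Optional[datetime]] = {}

    def ban(self, username: str, days: Optional[int] = None) -> None:
        """Ban a user permanently, or just for a cooling-off period."""
        expiry = None
        if days is not None:
            expiry = datetime.now(timezone.utc) + timedelta(days=days)
        self._bans[username] = expiry

    def is_banned(self, username: str) -> bool:
        """Check a ban, silently clearing it once it has lapsed."""
        if username not in self._bans:
            return False
        expiry = self._bans[username]
        if expiry is None:               # permanent ban
            return True
        if datetime.now(timezone.utc) >= expiry:
            del self._bans[username]     # ban lapsed; lift it
            return False
        return True

# e.g. a three-day timeout instead of a permanent exile:
mods = ProjectModeration()
mods.ban("troublemaker", days=3)
assert mods.is_banned("troublemaker")
```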

GitHub’s approach has three central tenets: design features to make hassling other users more difficult, empower users with tools to help safeguard themselves, and enlist the community to help keep everything under control.

So far, the company said, it’s been successful. Blocks and reports of incidents have gone up, indicating that GitHub users are actually using the site’s new tools. And when incidents do occur, the time it takes to respond to user reports has gone down.

Other online communities are embracing similar tactics. The subreddit r/science transformed its embattled comment threads into civil discourse by establishing clear community rules and rallying an army of more than 1,300 moderators to enforce them.

“Any reasonable approach to governing online behavior will ask users to do at least some work to govern communities,” Matias told me. “When communities take on this work, we often get more accountability and responsiveness.”

Instagram and Twitter, too, have recently shifted toward giving users more ability to deal with abuse themselves. In November, Twitter unveiled a new feature allowing users to mute specific words and phrases from appearing in notifications.

“The forms that abuse can take can vary tremendously,” Del Harvey, Twitter’s VP of Trust and Safety told me. “It is unrealistic that we will be able to predict everything people will consider harassing. The more we can put tools in the hands of users to manage their own experience, the better.”
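
The mechanic behind such a filter is easy to picture. Below is a toy Python version with hypothetical names (neither company’s implementation is public): compile the user’s mute list into a single case-insensitive check and drop any notification that matches.

```python
# Toy per-user muted-words filter for notifications; hypothetical names.
import re

def build_mute_filter(muted_phrases):
    """Return a predicate: True if a notification should be shown."""
    if not muted_phrases:
        return lambda text: True             # nothing muted; show everything
    pattern = re.compile(
        "|".join(re.escape(p) for p in muted_phrases),
        re.IGNORECASE,
    )
    return lambda text: pattern.search(text) is None

show = build_mute_filter(["nasty word", "pile-on hashtag"])
notifications = ["nice patch!", "you like that NASTY WORD, huh"]
visible = [n for n in notifications if show(n)]   # keeps only "nice patch!"
```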

Not every space on the internet has to be squeaky clean.

“There need to be spaces on the internet with different social norms,” said Amy Bruckman, a researcher at Georgia Tech who studies online communities. “There should be online spaces that are a little more rough and tumble. That’s okay as long as they are open about what they are and don’t cross a line that’s actually dangerous.”

In other words, it all comes back to rules. Just as in the real world, there are different spaces online for everyone. Maybe Reddit is your neighborhood dive bar, and Facebook your corner coffee shop. We can all agree that what happens in a dive bar at 4 a.m. is not always appropriate in a coffee shop.

Voicing different points of view is necessary to foster communication across ideologies. People randomly yelling “bitch!” is not. But finding that balance isn’t easy. At GitHub, the new hires and rules have produced significant backlash. Not everyone is happy. But GitHub couldn’t care less. In the end, the most important shift in how companies approach harassment is that they’re willing to lose troublesome users.

“If they don’t like the culture we are trying to create, disgruntled users have other options,” Nicole Sanchez told me. “It is okay for us to draw a line.”
