It’s a Wednesday morning on the Facebook campus. In Building 18, a pretty, dark-haired woman wearing thick-framed glasses is led into a “user experience observation room” by Paul André, a Carnegie Mellon-trained researcher who wants to find out how a normal person will react to a prompt in the Facebook mobile app to review her privacy settings. In the room are comfy chairs, a pink orchid, and two misty photos of beaches. André and the woman, who started working at Facebook a week earlier, sit down at a wooden table facing a wall-to-wall mirror.
“This isn’t a test. We want to see how you use the product so we can improve it,” says André, revealing a British accent. “There might be some people behind the mirror. We don’t want them in here disturbing us.”
She glances at the mirror. “That’s always the awkward moment,” says one of the seven people sitting at two rows of desks in a darkened room watching her. A screen hung from the ceiling shows close-up views of the woman’s face and of the iPhone she’ll use to open the Facebook app. The group observing her includes a lawyer, a privacy engineer, a privacy product manager, and two privacy designers. “Privacy” is now a product at Facebook, like Newsfeed or Photos, with a dedicated team tasked with creating tools that make it simpler for people to protect their party photos and political musings from unwanted observers. The team holds these sessions weekly with a non-technical Facebook employee or a paid member of the public, and is currently assessing an unreleased mobile “Privacy Check-up.” André warms up the current subject by asking her, “What does privacy mean to you?”
She says it means that non-Facebook friends won’t be able to see the things she posts, and adds that she doesn’t want anything she shares on Facebook to be public, not even her face.
Once she starts the “Privacy Check-up” — which will prompt mobile users to take a look at who they’re sharing information with — she’s surprised to find that some of her photos are public, including profile selfies and ones of her in a smiling state of coupledom. She’s dismayed to discover that her profile photo and cover photo have to be public per Facebook’s rules. André points her to an explanation why — so that people searching for her will know it’s her. She doesn’t like it but says she wouldn’t delete her Facebook account over it. After going through the “privacy health” of her apps, posts and bio information, she rates it helpful and says she would encourage her mom to use the tool. Her major complaint is that it should be easier to change the privacy settings on her photos en masse. The team in the darkened room is pleased.
Facebook privacy is an oxymoron to many. Facebook’s privacy record, after all, has many blemishes: The ire-inspiring introduction of Newsfeed in 2006. The ill-fated purchase-broadcasting program Beacon in 2007. The infamous 2009 privacy settings change that exposed even Mark Zuckerberg’s private photos. The 2010 revelation that Facebook apps were leaking users’ private information to advertisers and Internet trackers. In 2012, the Federal Trade Commission brought the hammer down on Facebook for repeatedly deceiving users about who would be able to see the information they put on the site. As part of its settlement, Facebook promised to create a “comprehensive privacy program” and agreed to pay fines if it screwed up. (It hasn’t screwed up yet, at least not according to the terms of the settlement.) This observation session is one of the reasons why Facebook is no longer a reliable source of privacy scandals. Shamed by past mistakes and wary of angry regulators, the social networking giant has transformed the way it makes new products and privacy controls, making all of the information we pour into the site on a daily basis far less likely to be used against us.
The face of the new, privacy-conscious Facebook is Yul Kwon, a Yale Law grad who heads the team responsible for ensuring that every new product, feature, proposed study and code change gets scrutinized for privacy problems. His job is to try to make sure that Facebook’s 9,199 employees and the people they partner with don’t set off any privacy dynamite. Facebook employees refer to his group as the XFN team, which stands for “cross-functional,” because its job is to ensure that anyone at Facebook who might spot a problem with a new app — from the PR team to the lawyers to the security guys — has a chance to raise their concerns before that app gets on your phone. “We refer to ourselves as the privacy sherpas,” says Kwon. Instead of helping Facebook employees scale Everest safely, Kwon’s team tries to guide them safely past the potential peril of pissing off users.
Kwon, 39, has a million-dollar testament to his ability to orchestrate group dynamics. He was the winner of the 13th season of Survivor, the season in which the CBS reality show controversially divided contestants by race (Kwon is of South Korean descent). His competitors said his gift of diplomacy helped him win — though some called him a “puppetmaster.” “I learned how to navigate difficult environments,” Kwon now says.
Kwon has been bouncing back and forth between Silicon Valley and D.C. for most of his career, with a few stops in front of TV cameras. His roller coaster of a resume includes business consulting at Google, crafting Joe Lieberman’s security legislation (in the emotionally charged but bipartisan period after September 11th), working on net neutrality at the FCC (the first time it came around), acting as a TV host for CNN and PBS (for shows on Asian-American issues and American infrastructure), and making People Magazine’s Sexiest Men Alive and Hottest Bachelors lists (he’s married with kids now).
Kwon was drawn to Facebook after visiting the company for the PBS show “America Revealed.” The social network, with its 1.39 billion members, seemed to have more power to change the world than the American government. “Government work was like rolling a big boulder up a large hill. In this political environment it’s very difficult to make meaningful legislation,” says Kwon. “What other organization exists besides Facebook that has a global footprint, affects millions or billions of lives, and moves this quickly?”
Every product manager at Facebook now goes through a boot camp session with Kwon’s XFN team and is assigned one of the team’s eight privacy managers. In a cramped conference room during a boot camp session for five newly hired product managers working on ads, Instagram, and location products, a lawyer on the XFN team named Mark Pike explained that his job is to make sure that the things they’re working on don’t “show up on the front page of the New York Times” because of a privacy blow-up.
During the boot camp session, Kwon lounged in a chair at the side of the room in a dark blue polo shirt, dark jeans and black leather loafers (instead of sneakers, which is a giveaway he’s spent a lot of time in D.C.). He doesn’t want people groomed on the long-time Facebook tenet to “move fast and break things” to see the privacy team as onerous. “This process is not designed to slow you down,” he chimes in. “It’s the opposite. If we’re engaged early on, we can help you avoid thorny issues later.”
This is where new product managers learn about the “Privatron,” Kwon’s tool for keeping track of everything that’s happening at Facebook in order to avoid surprises. There are currently over a thousand Facebook projects in the tool, which is essentially a spreadsheet with columns for keeping track of who’s seen what, which problems were raised, and how they were resolved. As was demonstrated during Google’s infamous ‘Wi-Spy’ debacle — when its Street View cars sucked up passwords, emails and credit card numbers from the unprotected Wi-Fi networks they drove by — privacy blow-ups can happen if just one engineer decides to tinker. In that case, a Google engineer thought it would be cool if the cars mapped Wi-Fi networks while mapping streets, and so threw the feature into the code late in the process. “Everything needs to go into the Privatron, even tiny changes,” says Pike. Kwon adds, “Maybe not when you’re changing the color of a button from dark red to light red, but we are trying to be involved with every change to code.”
The privacy sherpas seem to be effective. James Grimmelmann, a University of Maryland professor with a long history of analyzing Facebook’s privacy mistakes, says the company has turned a new page. “Facebook is not my go-to suspect when I open up the news and look for privacy problems. In 2008 and 2009, they did something wrong like clockwork every few months. It was a nasty cycle,” he says. “Facebook moves carefully now. It doesn’t want to move fast and break things anymore.”
Kwon says the Federal Trade Commission’s crackdown was a turning point for the company: “The FTC consent order made it a priority for us.” The agency has made a concerted push toward regulating privacy, even though privacy isn’t explicitly part of its mission to stop “unfair and deceptive trade practices.” The FTC has entered into consent orders requiring privacy programs and biennial privacy audits at tech companies that hold some of our most sensitive data, including Facebook, Google, Twitter and Snapchat. (PricewaterhouseCoopers, which conducted Facebook’s first audit in 2013, said the company’s “privacy controls were operating with sufficient effectiveness.”) When it comes to cracking the whip on companies for poor privacy practices, the FTC is the only American agency with whip in hand. “FTC consent decrees are wonderful things, aren’t they?” says Grimmelmann. “They force companies to slow down and actually plan their privacy protections using a rational process.”
The biggest blow-up for Facebook last year predated the creation of the XFN team: the “emotion contagion study,” which manipulated the emotional tenor of users’ Newsfeeds to see if it would elate or depress them. “Back then, a small group of people could move something forward without review from other departments. That doesn’t happen anymore,” says Kwon. “We would have prevented that.”
One narrowly averted disaster was Nearby Friends, an app you can use to see where your Facebook friends are. It was the first big product Kwon worked on when he arrived at Facebook in February 2013, three days before his second daughter was born. He was working with a tall Italian named Andrea Vaccari, whose passive location sharing app Glancee was acquired by Facebook in 2012. “I saw Andrea more than my wife that year,” says Kwon. “At one point, I felt like I needed to reassure her that Andrea was a man.”
It took two years to hammer Glancee — which had 50,000 users when it was acquired — into a product that could be released by Facebook as Nearby Friends. The first big debate Vaccari had with Kwon was whether Facebook’s hundreds of millions of American users should automatically be opted in to passively sharing their whereabouts with their Facebook friends. After all, Vaccari thought, Facebook wouldn’t show your exact location on a map, just how far away from that person you were — “1/2 mile,” for example. Vaccari wanted people to try the feature and see how valuable it was before opting out, and he knew it was one of those apps that relies on a network effect: an app that makes it easier for you to unexpectedly meet up with friends only works if your friends are using the app. The old Facebook probably would have agreed with that logic, but not the new Facebook.
Working with the privacy sherpas changed Vaccari’s thinking. “It was frustrating at first,” says Vaccari. “I came into Facebook to launch this thing I believed in and I wanted to launch it to as many people as possible. At first the recommendations felt too conservative, but I learned quickly to appreciate the value of these recommendations. A bad launch could have hurt our ability to grow in the future.”
Facebook launched Nearby Friends in April 2014 very carefully, putting “optional” prominently in its headline. Facebook pushed the app only to mobile users who had previously used Facebook check-ins, indicating they liked sharing their location, and it does not yet allow minors to use it. The opt-in process involved four different screens explaining the product that a user had to click through — the most Facebook had ever used — to make absolutely sure a Facebook user wanted in. And at Kwon’s prompting, the app included the ability to delete your location history — either entirely or a day of activity you didn’t want stored. “We didn’t even think about deletion originally,” says Vaccari. “We were just focused on making the product work well.”
There may be a cost to Facebook’s new emphasis on privacy. By becoming slower and more deliberate about releasing products, it is also slower on the draw in introducing innovative new products that people want to use. In the past year, Facebook has released chat app Rooms, ephemeral messaging app Slingshot (after Poke failed to catch on), and news app Paper, none of which have caught on in a huge way. “It’s great that Facebook has buttoned up, but it also leads to changes in its engineering that make it less edgy and cool,” Grimmelmann says.
Kwon doesn’t want Facebook to screw up on privacy, but he also doesn’t want the company to fall behind. “You’re trying to find the right balance between coming out with innovative delightful products that still respect privacy,” he says.
But he thinks the bigger challenge with Facebook’s well-established reputation for privacy carelessness is to make people feel comfortable using Facebook’s products and trusting it with even more of their information. In a 2013 joint ABC-Washington Post survey, 30% of those surveyed viewed Facebook unfavorably; Google and Apple were shown far more goodwill. “Many people I know have taken to treating Facebook like Twitter—just assume it's all public. The intricacies and the complications of their privacy maneuvers and back-and-forth has left many people exhausted,” says Zeynep Tufekci of the University of North Carolina. Alessandro Acquisti, a Carnegie Mellon professor who has studied the way Facebook has pushed people to make more information on the network public, says Facebook has gotten better at privacy but that there's a lot to forgive. "This progress does not erase Facebook's original sins," says Acquisti. "Facebook is the silent listener to all the things you do and share with your friends through your profile. By nudging users to reveal more and more often through its services, Facebook has created an unprecedented infrastructure of surveillance. The jury is out on how this concentration of power within a single company will pan out for its users in the long run."
Kwon, though, is optimistic. “It’s a slow process and we have a lot of work to do,” he says. “It’s like changing an oil tanker. Building user trust requires slow, incremental steps. But I think people are starting to believe us that we really are trying to do the right thing.”