Last month, when Gizmodo reported that Facebook "routinely suppressed" conservative news, it set the world a-tizzy. After all, Facebook controls the eyeballs of over a billion people and has an incredible potential to influence them. Facebook's PR machine went into mach-10 mode. A senator launched an investigation. Prominent conservatives were invited to meet with Facebook CEO Mark Zuckerberg to air their complaints. All while Facebook denied that the alleged suppression had occurred.
A problem with the report, though, was that it didn't look at Facebook's "Trending Topics" over time. Gizmodo gave us anecdotes about particular stories that Facebook's human news curators allegedly failed to promote, but it didn't give us the data to prove that the big blue social networking giant was routinely ignoring news of interest to those who lean to the political right. Because those sources were anonymous and their testimony was impossible to verify, we decided it would be useful instead to look at the raw data of the Trending content to see whether a bias is clearly apparent.
Ideally, we would have examined Trending Topics content prior to Gizmodo's report, but only Facebook has those archives. (Facebook reviewed them and said that the topics they'd been accused of suppressing, from Lois Lerner to Chris Kyle, had been included multiple times). Instead, we analyzed Facebook's Trending Topics over six days in the week after Gizmodo's report. As our measuring stick, we compared the news Facebook featured to the news surfaced by conservative news curator The Drudge Report.
Each morning, we catalogued the stories that appeared on Drudge Report and both the main and politics news sections of Facebook's Trending Topics; the Trending sections appeared to be the same among different Facebook users, though ordered differently. Over the course of the week, we catalogued 102 stories curated by the Drudge Report and 93 stories featured by Facebook Trending. Admittedly, this is not highly scientific. Update: And as pointed out by Aram Zucker-Scharff, this is only comparing a version of Trending Topics shown to one person as opposed to the full list of Trending Topics that might be shown to any Facebook user. They vary from person to person, but usually with some overlap.
The two outlets shared the same stories 14 times over the six days. So, judging by Drudge standards, Facebook included conservative-interest news in 14 of its 93 stories, or about 15% of the time, in this Trending Topics feed. Here are the 14 stories they had in common over six days. Unless otherwise noted, the stories appeared on the same day:
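For the curious, the overlap arithmetic is simple enough to sketch in a few lines of Python. The counts come from our tally above; the variable names are our own shorthand.

```python
# Counts from our six-day tally of the two outlets.
facebook_total = 93   # stories catalogued from Facebook Trending
drudge_total = 102    # stories catalogued from the Drudge Report
shared = 14           # stories the two outlets had in common

# Share of Facebook's Trending stories that also ran on Drudge:
overlap_rate = shared / facebook_total
print(f"{overlap_rate:.0%}")  # prints "15%"
```

Note that we divide by Facebook's story count, not Drudge's; dividing by Drudge's 102 stories instead would give a slightly lower figure, about 14%.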
Facebook definitely prefers stories starring celebrities, while Drudge loves stories on government corruption, privacy invasions, the weather, and anything embarrassing to the TSA or Hillary Clinton.
Each site also passed over surprising stories that the other featured. Drudge, with its love of sex scandals, spotlighted the story, "High schoolers live stream threesome." But while that viral news event reportedly happened on Facebook Live, it never made its way into Facebook Trending. On the day that Facebook let its users know that "Trump tells interviewer his tax rate is 'none of your business,'" Drudge let people know that "Trump brings more excitement than some folks can handle." Where Facebook Trending told users about "Authorities planning new wave of deportations," the Drudge Report's headlines reflected an America where immigrants are only entering the country, not being forced out of it: "Cuban refugees pouring into Texas," "Border battle: 40 immigrants crossing a day," and "Surge of immigrants seeking citizenship, voting rights before election."
So clearly, yes, the news one gets through the two sites differs quite a bit. (For more on just how different conservative and liberal news feeds can be, see this brilliant "Blue feed, Red feed" WSJ interactive.) Whether our analysis absolves Facebook of bias or further damns it depends on whether you think a 15% ratio of conservative-interest stories is high enough.
Gizmodo's "Trending Topics" report was not the first Gizmodo piece to question Facebook's political biases, though, so we didn't stop there. In April, Gizmodo reported that Facebook employees had asked Mark Zuckerberg whether the company has an ethical responsibility to prevent Trump's election; after the story, Neetzan Zimmerman of DC-based political outlet The Hill suggested conspiratorially that this might explain why Facebook engagement on The Hill's stories about Donald Trump had "cratered."
With this in mind, we took a look at Facebook engagement—likes, shares, and comments—on stories about the remaining presidential candidates across nine different news sites (New York Times, Buzzfeed, CNN, Fox News, Washington Post, Huffington Post, Huffington Post Politics, The Hill, and Fusion). Here is what we found:
These numbers compare engagement with stories about a candidate to engagement with all stories posted by a publication. For instance, the graph shows that in July 2015, the median story about Bernie Sanders got about 4.9 times as many likes, shares, and comments as the median among all posts that month.
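To make the metric concrete, here is a minimal sketch of how such a ratio is computed. The engagement figures below are made up for illustration; they are not drawn from our actual dataset.

```python
# Illustrative sketch of the median-engagement ratio described above.
# engagement = likes + shares + comments for a single Facebook post.
import statistics

all_posts_engagement = [120, 300, 85, 5000, 240, 90, 410, 150]  # every post that month
sanders_engagement = [5000, 1400, 980]                          # posts about Sanders only

# Ratio of the median candidate story to the median of all posts.
ratio = statistics.median(sanders_engagement) / statistics.median(all_posts_engagement)
print(round(ratio, 1))  # 7.2 here, meaning ~7x the typical post's engagement
```

Using the median rather than the mean keeps one viral outlier post from dominating the comparison, which matters on Facebook, where engagement is wildly skewed.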
The numbers suggest that Facebook members' interest in stories about the presidential race has waned across the board compared to July 2015, when the candidates' campaigns first started—not just interest in stories about Trump. Bernie Sanders appears to excite the most interest, while Trump and Clinton stories are receiving a comparable number of median likes, shares, and comments.
Trump, though, does seem to have seen a slight dip more recently. Here's his engagement broken out by publication. Use the legend to eliminate the sources you don't want to see. Interestingly, The Hill and Buzzfeed in particular have seen reduced engagement with Trump stories from their Facebook followers compared to other outlets.
The thing about data is that it can almost always be twisted to tell a different story. These dips could mean people are sick of the political news roller coaster. They could mean that there's other news in the world that's more engaging these days. They could mean that people are sick of being barraged with stories about Trump. Or, they could mean that Facebook is indeed pulling the levers on its invisible interest algorithm, silently manipulating what appears to its users, and thus what they engage with.
What is clear is that Facebook's biases, if they exist at all, are not apparent in the data alone.
Daniel McLaughlin is a creative technologist exploring the 2016 presidential election. Before joining Fusion, Daniel worked at the Boston Globe and graduated from MIT with a BS in urban studies and planning.
Kashmir Hill is the editor of Fusion's Real Future. She has hacked a stranger's smart home, lived on Bitcoin, and paid a surprise visit to the NSA's Utah datacenter, all while trying to prove privacy isn't dead yet. Contact her at firstname.lastname@example.org. PGP: D934E5E9.