The real 'news curators' at Facebook are the engineers who write its algorithms

Because Facebook is now the largest online platform for the distribution of news, New York Magazine argues that Facebook is simply the world’s biggest newsroom. According to this line of thinking, the ‘news curators’ Gizmodo reported on this week, who allegedly exercised bias by keeping conservative news out of “Trending Topics,” are an expected development. The point these critics make is that we can’t expect Facebook not to have an editorial viewpoint, any more than we can expect The Washington Post or The New York Times not to have one.

But Facebook is not at all the same as a traditional newsroom. Facebook, which sees itself as a social search engine for its billion users, presents itself as a service whose content is surfaced almost purely by algorithm. That’s how Facebook framed its rebuttal to Gizmodo’s report: the company argues that its trending content is in fact selected by software programmed to look for popular content, and that its human news curators simply do a bit of hashtag consolidation and topic formatting on top.

Facebook’s vice president of search, Tom Stocky, whose team is responsible for Trending Topics, wrote, “We have in place strict guidelines for our trending topic reviewers as they audit topics surfaced algorithmically: reviewers are required to accept topics that reflect real world events, and are instructed to disregard junk or duplicate topics, hoaxes, or subjects with insufficient sources.”

This framing assumes that human editing is ideological whereas algorithmic editing and surfacing of content is “neutral.” On this logic, any judgment made by a human, unless it is minor, can be tainted by bias, while algorithms deliver content without bias. The problem with this line of thinking, whether one argues for Facebook as a newsroom, as some journalists do, or for Facebook as a neutral algorithmic production, as Facebook does, is that the choices algorithms make about what to surface are themselves made by humans, based on those humans’ values and, in Facebook’s case, the company’s business needs.

This is to say that the algorithms that produce the Facebook you see, from what appears in your News Feed to what appears on the Trending Topics sidebar, reflect the choices that Facebook’s engineers made about what you should see. And naturally, because Facebook is a business, what they believe you should see is based on what is good for Facebook.

Thus, when the News Feed surfaces to the top of your timeline photos from a party in which many of your friends are tagged, this represents a choice to weight the News Feed algorithm to feature party photos full of friends, based on Facebook’s judgment that these photos have great social importance to users and will therefore drive them back to Facebook to consume News Feed.

But it is easy to imagine a culture that doesn’t place as much value on party photos and finds, for example, religious texts more compelling. For Facebook users in this culture, the choice to privilege photos of people partying with friends may not be the natural or “neutral” choice, but it is the choice that Facebook has made for them based on Facebook’s understanding of social values. The point here is that no choice a human or business makes when constructing an algorithm is in fact “neutral”; it is simply what that human or business finds most valuable.
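To make this concrete, here is a minimal, hypothetical sketch of what such weighting might look like in code. It is not Facebook’s actual ranking system, whose details are not public; the feature names and weights are invented for illustration. The point is simply that every number in it is a human judgment about what matters.

# A hypothetical, simplified News Feed scoring function. The feature
# names and weights below are invented for illustration and are not
# Facebook's actual system; each weight is a human value judgment
# written into code.

def rank_story(story):
    weights = {
        "friends_tagged": 3.0,   # privileging photos full of tagged friends is a choice
        "is_photo": 1.5,         # favoring photos over text is a choice
        "age_hours": -0.1,       # how quickly a story "ages out" is a choice
    }
    score = weights["friends_tagged"] * len(story.get("tagged_friends", []))
    score += weights["is_photo"] * (1.0 if story.get("type") == "photo" else 0.0)
    score += weights["age_hours"] * story.get("age_hours", 0.0)
    return score

# Sorting by this score "neutrally" applies the engineers' weights;
# change the numbers and a very different Facebook appears.
stories = [
    {"type": "photo", "tagged_friends": ["ana", "ben", "cleo"], "age_hours": 2},
    {"type": "text", "tagged_friends": [], "age_hours": 1},
]
feed = sorted(stories, key=rank_story, reverse=True)

A different culture, or a different business model, would set different weights, and the resulting feed would feel just as “algorithmic” to the people who built it.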

Neutrality is itself an invisible editorial choice: it involves making specific decisions that seem “neutral” to whoever is constructing the algorithm, or whoever is editing a set of topics by hand to appear “neutral.”

This is why “neutral” is not the same as “transparent.” True transparency, as a defense against claims that Facebook edits or influences the news, would mean Facebook publishing the choices it makes in weighting content on its platform. Stocky does not, however, tell us how the algorithms that surface content for Trending Topics are designed. What choices determine what content those algorithms spit out? Do they calculate the popularity of topics by geographic region, population density, one’s personal social network, or on a country-wide basis? Do they switch between these, and when, and why? What do the algorithms consider real “news” versus “junk”? Are celebrities or certain types of events given more weight because they seem more important to Facebook?
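We don’t know the answers, but a hypothetical sketch shows how much editorial judgment even a simple version of such a filter would encode. The thresholds, the trusted-source list, and the geographic scope below are all invented for illustration; each one answers one of the questions above, and each answer is a choice someone at Facebook has made.

# A hypothetical trending-topic filter, not Facebook's actual code.
# Every constant here is an editorial decision dressed up as a parameter.

MIN_SOURCES = 3                        # what counts as "insufficient sources"?
MIN_MENTIONS = 10_000                  # how popular is "popular"?
TRUSTED_OUTLETS = {"example-wire", "example-paper"}  # who counts as real "news" rather than "junk"?

def is_trending(topic, scope="country"):
    # scope might be "region", "friends", or "country"; switching between
    # them changes what surfaces, and someone decides when to switch.
    mentions = topic["mentions_by_scope"].get(scope, 0)
    enough_sources = len(topic["sources"]) >= MIN_SOURCES
    has_trusted_source = bool(TRUSTED_OUTLETS & set(topic["sources"]))
    return mentions >= MIN_MENTIONS and enough_sources and has_trusted_source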

None of these choices is “wrong” per se, but each would influence what one sees, and what one sees would then not be “neutral” but the product of an editorial hand, one belonging to the engineering department rather than the news curation department.

This is why it isn’t particularly helpful to think of Facebook as just another, very large newsroom. Facebook is something different: an engineering department that curates news. And Facebook does not sit on par with traditional newsrooms; it occupies a layer above them, standing between those newsrooms and their readers. This gives Facebook a great deal of power, and without knowing how Facebook’s algorithms make their choices, it is a power whose workings we still don’t really understand.

Kate Losse writes about technology, design, and culture and is the author of The Boy Kings, about the early years of Facebook, where she worked from 2005 to 2010. She is based in California.
