Facebook study says it’s your fault, not theirs, if your feed is all like-minded friends

Is Facebook an echo chamber? Does the social network help us create filter bubbles, through which we’re only exposed to content and opinions that are like our own? According to the company, not really.

In a new study published today in the journal Science, Facebook claims that it's mostly users themselves, not its News Feed ranking algorithm, who are at fault for making their feeds ideologically consistent.

“While News Feed surfaces content that is slightly more aligned with an individual’s own ideology (based on that person’s actions on Facebook), who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter,” according to Facebook’s Data Science page.

The researchers found that individual choice, meaning which of the stories surfaced by News Feed a person actually clicked on, reduced engagement with "cross-cutting" content (in other words, content that didn't match the user's ideology) by 17 percent for conservatives and 6 percent for liberals. The News Feed ranking algorithm's effect was smaller by comparison, lessening the chance that users would see cross-cutting content in their feeds at all by 5 percent for conservatives and 8 percent for liberals. Or, as Christian Sandvig puts it, Facebook's algorithm "filters out 1 in 20 cross-cutting hard news stories that a self-identified conservative sees (or 5%) and 1 in 13 cross-cutting hard news stories that a self-identified liberal sees (8%)."
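
To see how those two effects combine, here's a back-of-the-envelope sketch in Python. Reading the two percentages as successive relative reductions is a simplification, and the starting figure of 100 shared stories is purely hypothetical; only the percentages themselves come from the study as reported above.

```python
# Hypothetical illustration of how the reported reductions stack up.
# The figure of 100 shared stories and the sequential ordering are
# assumptions for illustration; the percentages are the ones the study reports.

def cross_cutting_remaining(shared, algorithm_cut, choice_cut):
    """Return how much cross-cutting content survives ranking, then clicking."""
    seen = shared * (1 - algorithm_cut)   # after News Feed ranking
    clicked = seen * (1 - choice_cut)     # after the user's own click choices
    return seen, clicked

# Reported effects: conservatives (algorithm 5%, choice 17%); liberals (8%, 6%)
for label, algo_cut, choice_cut in [("conservative", 0.05, 0.17),
                                    ("liberal", 0.08, 0.06)]:
    seen, clicked = cross_cutting_remaining(100, algo_cut, choice_cut)
    print(f"{label}: ~{seen:.0f} of 100 cross-cutting stories surface in the feed, "
          f"~{clicked:.0f} end up clicked")
```

Run this way, click choices remove more cross-cutting content than the ranking does for conservatives, while for liberals the ranking removes slightly more.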

Facebook’s researchers took this to mean that user choice, rather than the News Feed algorithm, was the biggest factor in determining which stories showed up in people's feeds. Story placement “depends on many factors,” the researchers wrote, “including how often the viewer visits Facebook, how much they interact with certain friends, and how often users have clicked on links to certain websites in News Feed in the past.”

But before you blame yourself for the state of your Facebook feed, note that the results rest on some flawed assumptions.

The study looked at 10 million U.S. Facebook users who self-identified as conservative or liberal, and roughly 7 million links shared between July 7, 2014 and January 7, 2015. (Amazingly, the study started right after the now infamously unethical Facebook mood study, which was published in June 2014.) The researchers used a 5-point scale to indicate where a user landed on the conservative-liberal spectrum: -2 meant very liberal; +2 meant very conservative. Then, “by averaging the affiliations of those who shared each article, we could measure the ideological ‘alignment’ of each story,” the researchers write in a blog post. That is to say, the test measured the political slant of the users who chose to share a given story, not of the story itself. Using all this, they were able to figure out which websites trended conservative and which liberal. So far, so good.
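
To make that averaging step concrete, here's a minimal sketch with made-up data. The story titles and sharer scores are hypothetical; only the -2 (very liberal) to +2 (very conservative) scale and the averaging rule come from the study's description.

```python
# Minimal sketch of the averaging step described above.
# The stories and sharer scores are made-up examples; the -2 to +2
# scale follows the study's description.

def alignment_score(sharer_scores):
    """Average the self-reported ideology scores of everyone who shared a story."""
    return sum(sharer_scores) / len(sharer_scores)

# Hypothetical stories, each listing the self-reported scores of its sharers
stories = {
    "budget_op_ed": [2, 1, 2, 2, 1],           # shared mostly by conservatives
    "climate_explainer": [-2, -1, -2, 0, -1],  # shared mostly by liberals
}

for title, scores in stories.items():
    print(title, round(alignment_score(scores), 2))
# budget_op_ed 1.6        -> leans conservative
# climate_explainer -1.2  -> leans liberal
```

A story that averages near +2 was shared mostly by self-identified conservatives, and one near -2 mostly by self-identified liberals, which is exactly why the score reflects the sharers rather than the story's actual content.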

But notice that Facebook’s sample was of users who self-reported their political slants. Only 9 percent of Facebook users actually report their political affiliations on their profiles, according to the company. That’s important because it means these users may not be a representative sample of Facebook’s broader population. As Zeynep Tufekci writes, “people who self-identify their politics are almost certainly going to behave quite differently, on average, than people who do not.” It may be that people who self-identify as liberals on Facebook click on more (or less) cross-cutting content than people who keep their politics to themselves.

Our own networks also affect what we see on Facebook. The amount of “cross-cutting content” we encounter depends on how diverse our friend groups are. Facebook’s study found that roughly a quarter of the average user’s friends hold opposing political viewpoints. That’s not a lot, but it should, at the very least, expose us to different types of content.

It turns out, according to Facebook, that the “filter bubble” does exist, but it’s not as powerful as previously assumed. Most of what we see on Facebook is there by choice, and it’s not all that surprising that we tend to stay within our own ideological spheres. (After all, we tend to hang out IRL with people who share our interests and opinions about the world.) But the study also shows that Facebook is involved, in some small way, in pushing more homogeneous content to its users.

Another fair question to ask, given that the study was conducted by Facebook itself: if the filtering effect had been more pronounced, would we be hearing about it?* Maybe. But maybe not. After all, no company wants to paint itself as a diversity flattener.

One of the study’s other flaws is that it’s not easily replicable. Facebook controls its own data and makes it available to independent researchers only once in a while. But perhaps the company will open up its data again in the future, and we’ll know more about how the social network keeps us informed.

*Update: This post has been updated to better reflect the results of Facebook’s study, which found a limited but statistically significant algorithmic effect in steering users toward like-minded content.

Daniela Hernandez is a senior writer at Fusion. She likes science, robots, pugs, and coffee.
