Alex Jones and Other Extremists Are Just Migrating to Private Facebook Groups After Being Banned

After years of dragging its feet, Facebook has spent the last few months cracking down on content it says violates its community standards. The site recently suspended conspiracy theorist Alex Jones and removed four of his pages. But the platform has done little to nothing to rein in what is said in private groups, according to The New York Times, which reported that one private Infowars group with 110,000 members was still going strong.

Jones himself was posting in the group, using it as an alternative to a public page, and several Infowars-related accounts were listed as moderators. “In the Infowars group, posts about Muslims and immigrants have drawn threatening comments, including calls to deport, castrate and kill people,” the Times writes.

The paper also gave a few more examples of existing private groups that might violate Facebook’s community standards:

Several private Facebook groups devoted to QAnon, a sprawling pro-Trump conspiracy theory, have thousands of members. Regional chapters of the Proud Boys, a right-wing nationalist group that Twitter suspended last month for its “violent extremist” nature, maintain private Facebook groups, which they use to vet new members. And anti-vaccination groups have thrived on Facebook, in part because they are sometimes recommended to users by the site’s search results and “suggested groups” feature.

The Times isn’t the first to pick up on extremists migrating to private forums. Mark Zuckerberg himself said he wanted a billion people to join “meaningful” Facebook groups. But what if what’s meaningful to someone is the extermination of an entire group of people?

“They’ve essentially empowered very large groups that can operate secretly without much governance and oversight,” Jennifer Grygiel, an assistant professor at Syracuse University, told the Times. “There may be harms and abuses that are taking place, and they can’t see.”

Facebook uses the same automated tools to take down prohibited material, such as nude photos, in private groups as it does in public areas of the site. A spokesperson for the company told the Times that it is working on developing more comprehensive measures.

It’s impossible, and probably a very bad idea, for one company to be the arbiter of truth across large swaths of the internet. That’s most likely why Facebook has maintained that misinformation doesn’t violate its policies unless it leads to violence.

On the other hand, too strong a moderating presence could also be bad for the platform. The company was recently criticized by the UN for its broad definition of terrorism, which some experts say could lead to the quashing of legitimate dissent against oppressive regimes. But in Myanmar, Facebook has been blamed for spreading information that has fueled the genocide of the Rohingya.

“The vast majority of groups on Facebook are probably the run-of-the-mill groups,” Renée DiResta, a researcher who studies online extremism, told the Times. “The challenge is, how does the groups feature interact with the other features on Facebook that we know are causing radicalization, hate speech and genocide in certain places? Who is taking responsibility for looking at the negative externalities of this push to create communities?”
