Why Reddit hosting its own images won't do much to fight abuse and harassment

Reddit is starting to let users bypass other image hosts and upload their memes and GIFs directly to its own servers.


The move itself isn't too exciting—it's fairly common for social media platforms to bring third-party services in-house as they mature. Remember the days when Twitter users had to send their photos to Twitpic or Yfrog instead of uploading them directly? Probably not, even though it was just five years ago.

But hosting images does bring a new responsibility to the service. The content aggregator previously offered only two ways to submit items to its communities: text or links. This is the first time it has actually hosted any non-text content. Now it has to make sure Redditors aren't uploading photos containing inappropriate or harmful content. And have you met Redditors? Spoiler alert: This has been, and will continue to be, an issue.

Reddit engineer Chris Slowe told Slate that it would be easier for Reddit's Anti-Evil team, an internal team tasked with addressing harassment, copyright and spam on the site, to respond to user requests to take down offending content if it was hosted on Reddit itself. So far, the Anti-Evil team has tweaked the site's source code to make it harder for people to post spam and has improved the site's blocking function. It will be interesting to see what else the team comes up with.

"[Users are] in this position where they've got something out on the Internet that they don't like, and they have to track down the 12 stake-holders who can actually fix it. We're streamlining that process by hosting," Slowe told Slate.

That's a laudable sentiment, if an easily bypassed one. We don't yet know how responsive Reddit will be to requests to remove images encouraging harassment or abuse, because it has only just started. But even if it turns out to be hyper-vigilant, Redditors can simply choose not to use the company's image hosting and do what they do right now: post pictures elsewhere (like on popular image host Imgur) and link to them on the site. And since Reddit pretty much depends on outside links to function, that's unlikely to change.

In a post announcing the feature, Reddit admin Philippe Beaudette said the company's content policies for images would be the same as for the rest of the site. And because this is Reddit we're talking about, he said so in response to a question from the moderator of the Watch People Die subreddit, which I am not going to link to because it is exactly as bad as you think it is.

So far, Redditors seem generally happy with the new feature. The original thread seeking subreddits to test the service received ample volunteers, and the top comments on the wider announcement are mostly technical questions and people happy to have an alternative to Imgur.

We'll have to wait and see how Reddit's more toxic communities use this tool, which is part of the overall problem with Reddit and online communities in general. From Twitter to YouTube, social media networks rely on users flagging content and on moderators to curb abuse and harassment, usually after the damage is already done.


Facebook says it uses machine learning and artificial intelligence to flag photos that violate its content policies, and that its AI now reports more offending photos than human users do, which seems like a step in the right direction. But as with almost everything Facebook does, there's little transparency into what's actually happening, and we have to take the company's word for it.

Here's hoping Reddit's users use this new tool for GIFs of animals being rescued more than they use it for evil.