LEGO Universe had a huge 'dong detection' problem, says former developer

In 2010, LEGO unleashed LEGO Universe, a massively multiplayer game where builders of all ages could create whatever they wanted on their own digital plots. Well, almost anything: LEGO didn’t want any of its players to endow its online world with penises. After all, it was meant to be a kid-friendly place in which phalluses had no role—even its mini-fig citizens were dickless.

To keep the game penis-free, the company hired a sizable moderation team to scan screenshots of every structure that went up, according to a former developer for the game. Management “wanted a creative building MMO [massively multiplayer online game] with a promise of zero penises seen,” tweeted Megan Fox, a developer who worked on the project, on Friday. “YOU could build whatever you wanted, but strangers could never see your builds until we’d had the team do a penis sweep on it.”

But as any internet company will tell you, the dicks will not be kept down. They kept sprouting, faster than the team could take them down, according to Fox. Even an employee working on the game built one.

“The moderation costs of LEGO Universe were a big issue in general,” Fox went on to say on Twitter. “We were asked to make dong detection software for LEGO Universe too. We found it to be utterly impossible at any scale.”

We’ve reached out to LEGO for comment but the company has not yet responded. According to Fox, the lack of an automated ‘dong detector’ was costly for LEGO. Fox said the human moderators hired to fight the battle of the bulge were the largest expense associated with the game, which LEGO shuttered in 2012.

Fox recalled the story on Twitter after pondering how Nintendo was preventing dick doodles in its new game with a “draw what you want” component. In the three years since LEGO Universe shut down, algorithmic filtering of phallic shapes hasn’t improved much: dick-pic curation is still an unsolved problem despite wide interest from tech companies. Twitter, Facebook, Snapchat, Reddit and others spend millions attempting to shield us from inappropriate content, and they would love machine-learning systems that could do it automatically, but so far no one’s been able to come up with a truly reliable solution.

“Trying to detect nudity in general is difficult because there’s a lot of content that can just be suggestive, and not necessarily inappropriate,” said David Luan, the CEO of Dextro, a New York-based startup looking to build an automatic dick-pic detector. “There’s a lot of nuance that can be really hard to pick out.”

For instance, a picture of a breast-feeding mom isn’t (or at least shouldn’t be) in the same category as a woman baring her boobs for beads at Mardi Gras. Likewise, in the case of LEGO Universe, crafty players started chopping up their digital shafts into different parts, a visual penis puzzle, if you will.

Another issue is that an algorithm’s accuracy depends on something called training data. That’s a set of labeled images or videos that an engineer feeds into the system to teach it what an object looks like. For most artificial-intelligence algorithms, it takes a large number of images to get a system to consistently identify an object correctly. The training is most successful when these images are high-quality, well-lit, and taken from direct angles. Think about the type of images you get when you Google things like cats, houses, or cars. Most are pristine stock photos. But think about the dick/clit pics or naked selfies you’ve taken. They’re probably a bit grainy, blurry, and taken from not-so-flattering angles, despite what you may think.
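For a rough sense of what “feeding labeled data into the system” looks like in practice, here is a minimal sketch of training an image classifier in PyTorch. Nothing here reflects LEGO’s or Dextro’s actual systems; the folder layout, the “acceptable”/“flagged” labels, and the choice of a pretrained ResNet-18 are all assumptions made purely for illustration.

```python
# Minimal sketch of training an image classifier from labeled examples.
# Purely illustrative: folder names, labels, and model choice are assumptions,
# not anything LEGO or Dextro actually used.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Training data: one folder per label, e.g. data/train/acceptable and
# data/train/flagged, each full of example images.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("data/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a pretrained network and swap in a two-class output layer.
model = models.resnet18(weights="DEFAULT")
model.fc = nn.Linear(model.fc.in_features, 2)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One pass over the labeled images; real systems run many epochs over
# far larger and messier datasets.
model.train()
for images, labels in train_loader:
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```

Even with a pipeline like this, the classifier is only as good as those labeled folders, which is exactly where the gap between pristine stock photos and blurry real-world uploads bites.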

“That makes it really challenging. As a result, even the best machine-learning algorithms aren’t that great,” he added. “The risk of having one false negative — some random video that isn’t properly flagged — is relatively high.”
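Scale is what makes that risk concrete. As a toy back-of-the-envelope sketch (every number below is made up for illustration, not a figure from LEGO, Dextro, or any platform), even a classifier that misses only a small fraction of bad uploads lets a steady stream through each day:

```python
# Toy illustration with made-up numbers: a small miss rate still means
# a steady stream of inappropriate uploads getting through at scale.
uploads_per_day = 5_000_000     # hypothetical daily upload volume
share_inappropriate = 0.001     # hypothetical: 0.1% of uploads are bad
false_negative_rate = 0.05      # hypothetical: the model misses 5% of the bad ones

bad_uploads = uploads_per_day * share_inappropriate   # 5,000 bad uploads per day
missed = bad_uploads * false_negative_rate            # 250 of them slip through
print(f"{missed:.0f} inappropriate uploads slip past the filter each day")
```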

For now, having humans do the curation is the best way to make sure no dicks slip by. Twitter, for instance, still relies on people reporting instances of revenge porn. Other gaming communities, like Minecraft and Twitch, largely rely on volunteer moderators to keep things PG.

So, it seems, until AI figures out how to infallibly recognize a penis, at least one job will be safe from our robot overlords.

Daniela Hernandez is a senior writer at Fusion. She likes science, robots, pugs, and coffee.
