On Saturday night, the people in Hollywood whose job it is to stamp out Internet piracy got a little shiver down their spines courtesy of Periscope, the Twitter-owned live-streaming app.
All night, moderators at Periscope had been pulling down live-streams that appeared to show the fight between Floyd Mayweather Jr. and Manny Pacquiao, which cost $100 to watch legally on pay-per-view. But there were too many streams — the moderators couldn't play piracy whack-a-mole fast enough, which meant that, at any given time, thousands of people were watching the fight illegally, depriving HBO and Showtime of a significant amount of potential revenue.
A solution for piracy opponents, then, would be to replace Periscope's human watchdogs with an AI-based algorithm that could recognize when illicit content was being streamed and automatically pull down the feed. And now, one New York-based artificial intelligence startup, Dextro, thinks it's cracked the code to doing exactly that.
Today, Dextro is launching Stream, a computer-vision system that it says can analyze and categorize what’s going on in public Periscope videos in real-time. It uses image-recognition algorithms to lump together broad types of content and make them more easily navigable. If you’re in the mood for streams from the Baltimore protests, Stream can find a dozen. Want puppy Periscopes instead? The software can do that, too.
"In the context of Periscope, where things are happening live, there's way too many streams for humans to do that kind of curation," he told me. Luan said that over the weekend, "We saw a huge spike over the weekend in how many TVs were being picked up — everyone was pirating the Pacquiao fight." And while Stream isn't yet being used by Periscope itself to shut down pirated or explicit streams, it could be.
Stream's image-recognition algorithm isn't perfect. (When I tested it out by pointing it at a cold-brew coffee contraption, the app mistakenly labeled it as a lamp.) And it only works on public Periscopes for now, meaning that you can evade it by setting your stream to private. But if Dextro's algorithms work as well as Luan says they will, Stream has the potential to be a tool for policing live video on the internet.
Algorithm-based image recognition is one of the thornier AI problems out there. Google already uses pretty sophisticated artificial intelligence to power its image search and catalog the world's images. Other companies, like the video chat service Chatroulette, have tried to apply similar technologies to live video to detect dick flicks. Currently, algorithms that crawl the web looking for illegally uploaded movies work by doing a pixel-by-pixel comparison of the original and bootleg content. If an algorithm "sees" a stream that matches the original, it alerts the owner.
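To make the pixel-by-pixel idea concrete, here's a toy sketch. Frames are modeled as 2-D lists of grayscale values, and a small per-pixel tolerance absorbs re-encoding noise. This is purely illustrative; it isn't Dextro's code or that of any real anti-piracy system.

```python
# Toy "pixel-by-pixel" frame comparison, as described above.
# Frames are 2-D lists of grayscale values (0-255).

def frames_match(original, candidate, tolerance=0):
    """Return True if two equally sized frames match pixel-for-pixel,
    allowing each pixel to differ by at most `tolerance`."""
    if len(original) != len(candidate):
        return False
    for row_a, row_b in zip(original, candidate):
        if len(row_a) != len(row_b):
            return False
        for a, b in zip(row_a, row_b):
            if abs(a - b) > tolerance:
                return False
    return True

reference = [[10, 20], [30, 40]]   # a frame from the rights holder's copy
bootleg   = [[11, 19], [30, 41]]   # the same frame after re-encoding

print(frames_match(reference, bootleg))               # strict match fails
print(frames_match(reference, bootleg, tolerance=2))  # with tolerance, passes
```

Even this tiny example shows why exact matching is brittle: one re-encoding pass is enough to break a zero-tolerance comparison.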
That's harder to do with a Periscope video because, to a computer, the official ringside video of the Mayweather/Pacquiao fight looks completely different from a live-stream of a guy watching the fight on the TV in his living room. To work effectively, Stream would have to identify the TV or monitor where the video is playing, then take that portion of the stream and compare it to the original, taking into account any distortions introduced into the stream.
"This is a whole different ballgame," Luan said. "You're not looking for an exact match…It's doable, but more challenging."
In the future, Stream could also be used to keep dick pics and other nude images off Periscope and other live-streaming apps. (Periscope's community guidelines, for one, discourage "pornographic or overtly sexual content.") Luan says he has gotten requests from companies to use Stream, which will be rolled out as a video analysis API, to identify when content goes X-rated.
Using AI to scan live video feeds, of course, raises some privacy concerns. Would Periscope users feel more inhibited on the service if they knew a bot was monitoring their feeds? And what if it weren't just illegal live-streams and dick pics being analyzed, but more harmless types of content? At what point do algorithms become a tool for censorship?
"This is part of a larger picture," University of Washington cyberlaw expert Ryan Calo told me in an email. "Google scans all Gmail for child porn, but claims it will not scan for other criminal activity, implying it could. Facebook, in partnership with the University of Washington’s School of Social Work, uses a mixture of technology and people to look for suicide risks." Calo added: "What I am worried about is the possibility that the government, or companies working on its behalf, will begin to scan all traffic for unlawful content and then hide behind a kind of exception to the warrant clause that says you have no reasonable expectation of privacy in illegal contraband."
The line between terms-of-service enforcement and broad-brush surveillance is thin, and in the future, companies like Dextro will need to prove that their algorithms are capable of restraint. Otherwise, Periscope and other live-streaming apps could lose their spontaneous feel, and users could desert them for platforms that aren't patrolled by robots.
Daniela Hernandez is a senior writer at Fusion. She likes science, robots, pugs, and coffee.