Face recognition remains a relatively imprecise science. It's often biased against black people, can have damaging consequences when it misidentifies criminal suspects, and even at its most accurate still leaves significant room for error. Nonetheless, it's growing commercially, due in large part to government investment and the allure of billions in profit.
In April, the Russian app FindFace gave us a glimpse of a world where public anonymity could cease to exist. The hyper-accurate face recognition app was all over the news after it helped a man track down two strangers he'd photographed years before and, more disturbingly, aided online harassers in outing the real-world identities of amateur porn performers.
But FindFace had its limits. It relied on photos sourced from VK, Russia's answer to Facebook, meaning that only the social network's members were in its database. Now NTechLab, the year-old Russian startup behind FindFace, is hoping to bring its technology to the world with a new service called FindFacePro.
Until now, the FaceN algorithm, which was designed by NTechLab co-founder Artem Kukharenko, was only publicly available through the FindFace app. There were some exceptions, like a reported deal with Moscow police (which NTechLab says is still being finalized) and a Russian music festival that let attendees opt in to being detected. With FindFacePro, corporate and government clients will potentially be able to identify individuals from a photograph, surveillance stills, or (eventually) video.
Last year FaceN beat out Google's face recognition program and a number of others in a University of Washington competition called MegaFace, correctly identifying faces 73% of the time from a set of 500,000 photos of 20,000 different people. This year the company stalled a bit: FaceN maintained its level of accuracy but slipped to fourth place behind teams that scored between 74% and 75%.
The new FindFacePro service is a cloud-based API that lets customers provide databases of their own with which NTechLab's algorithm can be trained. Kukharenko's co-founder, Alexander Kabakov, who handles NTechLab's business side, told me in a recent interview that the company has 400 potential customers "somewhere in the pipeline," a number a spokesperson has since revised up to 450. An estimated 70% of those prospects are security companies, which would have to furnish their own databases but could check them against stills, security footage, and video.
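NTechLab hasn't published how FindFacePro works internally, but the core operation a service like this exposes — matching a probe face against a customer-supplied gallery — can be sketched with toy data. Everything below is hypothetical: the `identify` function, the vectors, and the threshold are illustrative, and the sketch assumes an upstream model (not shown) has already converted each face into a fixed-length embedding vector, which is how modern face recognition systems are generally described as working.

```python
# Minimal sketch of the matching step behind a face-identification API.
# Assumes faces have already been reduced to embedding vectors by a model.
import numpy as np

def identify(probe, gallery, threshold=0.6):
    """Return the gallery identity closest to `probe`, or None if no
    candidate falls within `threshold` (Euclidean distance)."""
    best_name, best_dist = None, float("inf")
    for name, emb in gallery.items():
        dist = np.linalg.norm(probe - emb)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Toy 3-dimensional "embeddings"; real systems use 128 or more dimensions.
gallery = {
    "alice": np.array([1.0, 0.0, 0.0]),
    "bob":   np.array([0.0, 1.0, 0.0]),
}
probe = np.array([0.9, 0.1, 0.0])
print(identify(probe, gallery))  # → alice
```

A real deployment would add an enrollment step (building the gallery from customer photos) and batch search over video frames, but the threshold decision above is also where misidentification risk lives: set it too loose and strangers match, too tight and real matches are missed.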
NTechLab declined to name all but one potential customer, Papilon Savunma, a Turkish biometrics firm that boasts of working with the Turkish government, military, and police forces. The Turkish government has put what Amnesty International has described as "unprecedented pressure" on the press and on protests. The organization also documented increased "[c]ases of excessive use of force by police and ill-treatment in detention."
When I asked Kabakov about the possibility that NTechLab's software might be licensed and employed by an abusive security force or oppressive regime, or that it might misidentify someone, or both, he said that isn't really the company's problem.
"As Snowden said, we know that this information is not only used by Apple, Google, Facebook, but also by government agencies," he told me. "And if some authority or some bad guys want to intervene in your private life, they can do it without any face recognition."
That in itself isn't unusual, especially if NTechLab provides only minimal tech support and is hands-off with whatever data its licensees train the program on. However, that may be cold comfort if the new, hyper-accurate system is used by law enforcement in repressive countries. Even in the U.S., face recognition is a touchy subject, owing to the growth of vast, almost entirely unregulated databases of faces.
But NTechLab, or at least Kabakov, views that issue as almost passé. As he explained in our interview, he sees America's thinking on privacy as simply insufficiently developed, and alluded to a vague future where a different moral calculus will govern how we think about it.
"Our life will be transparent for anybody. The moral standards will be changed, so it will be a very interesting situation."
Ethan Chiel is a reporter for Fusion, writing mostly about the internet and technology. You can (and should) email him at email@example.com