Each week brings us more stories of new ways in which the government, private companies and fellow citizens are unearthing information about us that we don't want them to know. New face recognition technology that will "bring an end to public anonymity." Face analysis technology that will tell the government whether you're "a pedophile or a terrorist." Analysis of your face and speech in the videos you post online that will let advertisers make predictions about your personality and behavior.
Human beings remain deeply mysterious to one another, but technology seems to offer a magical ability to see below the surface instantly, promising insights that would usually come only from spending days, months or years with another person. It seems like the whole world wants to know you better—and is willing to employ any of the creepy, occasionally spurious tech on offer to do it.
We start asking ourselves, "Is there any privacy left? Will I ever move through the world (or internet) again without being catalogued and tagged like a research animal? Do I need to start wearing a ski mask and DNA-preserving full-body condom in order to have a little peace of mind?"
But then we get stories of the modern surveillance state failing. As when LinkedIn's all-seeing eye mistakes one of its members for a white supremacist because he shares a name with one. Or this terrible failure of security and face recognition: "TSA gave my Macbook Pro to another passenger at LAX, and now it's gone."
Eric Cheng, like anyone going through a TSA security (theater) line, took his $2,800 Apple MacBook Pro out of his bag and put it in a bin to go through the X-ray machine. He was held up in the body-scanning line, and when he finally got through and grabbed his belongings, he discovered his laptop was missing. Cheng flagged down a TSA agent. He writes:
We moved over to the camera footage station, and a nice agent began to review archived camera footage…. we watched as a TSA agent pulled my computer off of the belt as soon as it came out of the machine—there is an area where agents can remove things from the belt before passengers have access to belongings. He moved my computer to a holding area immediately behind the x-ray machine. And then, we watched as the computer was inspected, after which it was handed back… to a random woman. The woman took my computer and left the security area.
Though the TSA and police had footage with the woman's face captured, there was nothing they were able to do. So much for facial recognition. They knew only that she was likely on a 4:15 flight, due to comments her companion had made in line. Cheng flew out of LAX without his laptop—or much hope of getting it back.
Which is insane when you think about it. This is a citizen's brush with one of the most extreme parts of our new security state, where we are treated as if we're entering a supermax prison. Our credentials are checked meticulously. We remove belts, shoes, and lint from our pockets, assume an uncomfortable position for a body scan, and sometimes submit to a follow-up pat-down. We put our intimate belongings through an X-ray search, and sometimes a hand-search. This is one of the moments when we are most vulnerable, most viscerally aware of government surveillance. And yet, somehow it's possible for someone to be caught on camera at this checkpoint taking someone else's computer and not be trackable.
In our times of need, it turns out the surveillance state is not as all-knowing as we might want it to be. Many a victim of an unsolved crime can surely attest to this.
Luckily for Cheng, he had set his computer to display his name and contact information on the log-in screen. The woman reached out to Cheng; she said she had taken it by mistake, not intentionally to steal it. Cheng told me by email that she "told me she would send me the computer, but she hasn't shipped it yet."
The Orwellian technologies we've been promised may well be coming, but as it stands now, we live in a world of not-quite-perfect surveillance.