My news feed is filled with people talking about Apple, the FBI, the DOJ and iPhone unlocking. What the hell is going on?
The FBI has an iPhone left behind by San Bernardino shooter Syed Farook, who, along with his wife, killed 14 people and injured others at his workplace in December. The FBI wants to unlock the phone to see if there's anything helpful for its investigation. But there's a problem: the phone's data is encrypted; it can only be unlocked with Farook's passcode, and Farook is dead.
So why don't they do what I do to unlock my bae's iPhone? When he's sleeping, I just press it against his thumb. They have Farook's body. Why don't they do that?
Whoa, you're a creep. And they can't do that, because Farook's phone is locked with a passcode, not a fingerprint. Plus, the phone, which his work gave him, is an older model, an iPhone 5c, which doesn't have a fingerprint scanner.
Well. Why don't they just look at the smudgy marks on the phone screen and try to guess his passcode?
Apple has built extra security into the iPhone: 1. It only lets you try 10 incorrect passcodes before it erases the data on the phone, and 2. It makes you wait an increasingly long time between incorrect attempts. So the Department of Justice got a judge to force Apple to write special code that will turn those features off, so that the FBI can brute-force the passcode and open the phone.
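To see why those two features matter, here's a toy Python sketch of a brute-force attempt against a 4-digit passcode. None of this is Apple's actual code; the lockout schedule and the `check` function are made up for illustration.

```python
# Toy model of the two iOS protections at issue: the 10-try limit
# (after which the phone erases itself) and the escalating delays.
# Entirely illustrative -- not Apple's code or its exact schedule.

MAX_TRIES = 10
DELAY_MINUTES = {5: 1, 6: 5, 7: 15, 8: 60, 9: 60}  # waits after failures

def try_passcodes(check):
    """Brute-force 4-digit codes the way an attacker would have to,
    respecting the failure counter. `check` returns True on a match."""
    for attempt, guess in enumerate((f"{i:04d}" for i in range(10_000)), 1):
        if check(guess):
            return guess
        if attempt >= MAX_TRIES:
            return None  # the real phone would wipe its data here
        # ...and wait DELAY_MINUTES.get(attempt, 0) minutes before retrying

print(try_passcodes(lambda g: g == "0007"))  # lucky: found on try 8
print(try_passcodes(lambda g: g == "9999"))  # None: wipe limit hit first
```

With those checks stripped out and guesses entered electronically, all 10,000 four-digit codes could be tried in well under a day, which is exactly the shortcut the court order demands.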
Okay, well, what's the problem?
Apple doesn't want to write the special software. It specifically designed iPhones this way to make them secure, so that only a phone's user would be able to get to the sensitive information stored within.
"Once created, the technique could be used over and over again, on any number of devices," wrote Apple CEO Tim Cook in a public letter opposing the judicial order. "The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals."
Yeah, but, you said this is Farook's work phone. And his work supports opening it. So, why is Apple not helping out?
Because it would set a precedent: Apple would be required to build special tools that undermine its security protections to get at its customers' data whenever the government asks. The district attorney in New York already says he has hundreds of iPhones he would want Apple to use this tool on. And it would set that precedent for other companies too. That's why other tech companies, including Google, Twitter, Facebook and Mozilla, have come out in support of Apple's position.
Both shooters are dead. What's even on the phone that the FBI needs?
Before the shooting, Tashfeen Malik, Farook's wife, posted a message to Facebook swearing allegiance to ISIS, transforming this into a terrorist act rather than just a horrible, one-off workplace shooting. The FBI wonders if they were part of a terrorist network. Moreover, the Department of Justice said in a court filing that "evidence found in the iCloud" associated with this phone suggests Farook was in contact with his wife and with his victims shortly before the attack. The FBI wants to know what was said.
Wait a second. The FBI was able to get into the iCloud?
Yes, the Apple iCloud isn't encrypted the way the data on a phone is. Apple can get into that data and hand it over when served with a warrant—which it did here. But the last time the phone was backed up was in October. The FBI wants the phone's most recent data.
So they have the phone. Why don't they just make it back up again?
You're smart! That was Apple's suggestion to the FBI. If the phone were connected to a WiFi network it recognized, it might back itself up again, revealing the most recent data. But, unfortunately, Farook's employer changed the Apple ID password associated with the phone at the FBI's request, and now that won't work.
Wait. The FBI asked for the password change that locked it out?

Yeah, it was a screw-up. The phone would now need to be unlocked so it can sign back into the iCloud with its new password.
Doesn't the U.S. government have a hacker on payroll, perhaps at the NSA, who can just figure this out?
I know, dude! In the movies, the NSA can hack anything. In a laughable public letter, "cybersecurity legend" John McAfee said he could easily crack it and would do so for free "or eat a shoe." As iPhone forensics expert Jonathan Zdziarski put it to me, "Either the government's capabilities are severely limited beyond what we thought, or this is a government test case to see how the courts and how Apple will respond."
The government chose a good test case!
They did. According to Bloomberg, it's part of a bigger picture: in the fall, the National Security Council issued a "secret memo" that tasked government agencies with finding workarounds for encryption, like that protecting this iPhone. In this case, Apple is put in the uncomfortable position of refusing to cooperate with a terrorism investigation, even if it's unlikely there's anything interesting on that phone. Farook had two other non-work devices that he destroyed, which makes it seem more likely those were the ones with the valuable evidence on them.
I thought employers could see everything you do on their devices. Why doesn't Farook's work just pull the records on the phone?
Apparently, this employer was less creepy than others.
Wait, you said the FBI wants to know what Farook texted to his colleagues before attacking them…
So why doesn't the FBI just check their phones instead?
That's a good but unanswered question. If the victims' families were willing to share their phones, and if they knew their passcodes, they could hand that information over to the feds. Given that the victims support breaking into Farook's phone, it seems like they'd do everything they could to support the investigation. The DOJ, though, says it wants data that "may reside solely on the phone" and "cannot be accessed by any other means."
Why is this such a big deal?
Because we live in a world in which we leave behind digital trails constantly. Ever since the Snowden revelations, we've been having an ongoing debate about just how much information the government should have access to. Apple is drawing a line in the sand here, saying it's not willing to undo protections it built into iPhones after Snowden, and that it thinks the FBI's use of the All Writs Act of 1789 to force it to do so would have a "chilling" effect on privacy.
The All Writs what?
It's the law that the DOJ is invoking to get Apple to help it out. It gives judges a lot of power to make special requests to aid government investigations. Former federal prosecutor Orin Kerr has a great post about it. Tl;dr: it's the same law the government used in the 1970s to get a phone company to use a tool that recorded all the numbers someone dialed from their phone—which has since become a standard surveillance tool.
So, legally, this isn't a typical "right to privacy" case, because Farook is dead and so doesn't have privacy rights, and the owner of his phone, the San Bernardino County Department of Public Health, has given permission to the FBI to go into the phone. It's more a case of how broad the All Writs Act is, and whether Apple can be forced to build special software.
I read that this hack the FBI wants Apple to build would only work on old iPhones, and that I'm safe if I have an iPhone 6.
Not according to Apple. During a background call with reporters, a senior Apple exec said the tool the FBI wants it to build would work on all iPhones, even the newest models with the Secure Enclave you may have heard people talking about. Though if Farook's passcode is longer than 10 digits, it's going to take a very long time to crack.
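Rough arithmetic backs that up. Apple's iOS Security Guide puts each passcode check at roughly 80 milliseconds, because checking a guess requires a hardware-tied key-derivation step; the sketch below just multiplies that out (the function name and everything beyond the per-guess cost are my own illustration):

```python
SECS_PER_GUESS = 0.08  # ~80 ms per attempt, per Apple's iOS Security Guide

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to try every passcode of a given alphabet and length,
    assuming the retry delays and the erase-after-10 limit are disabled."""
    return (alphabet_size ** length) * SECS_PER_GUESS

print(worst_case_seconds(10, 6) / 3600)                # 6 digits: ~22 hours
print(worst_case_seconds(10, 11) / (3600 * 24 * 365))  # 11 digits: ~250 years
```

So even with Apple's special software, a long passcode, or one that mixes in letters, keeps the math firmly on the owner's side.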
So now what?
Apple plans to fight the order, but on Friday, the DOJ filed a new motion to compel Apple to write this software post-haste. The DOJ says Apple is opposing them as part of a publicity and marketing stunt.
Opposing the government is a marketing ploy now?
Crazy, right? Except, according to a Pew survey, most of the public sides with the FBI and says Apple should unlock the phone—though it's unclear if the public really understands all the bigger issues at play.
So what if Apple is ordered again to unlock the phone?
It can keep appealing the decisions. Along the way, it might be held in contempt of court if it refuses to comply, which would mean paying fines. When encrypted email provider Lavabit refused to hand over a cryptography key the FBI needed to get at information about Edward Snowden's email, it had to pay $5,000 a day, a puny amount for a company with Apple's coffers. Yahoo, on the other hand, was threatened with a $250,000 fine that would double daily when it refused to give information to the NSA.
It seems highly likely that this battle is heading to the Supreme Court. Then the nation's highest court will have to decide just how much power the government should have to get access to information. Its decision won't just matter here in the US. Once this tool is built, and this precedent set, any government, even a highly repressive one, will be able to use legal tools to undermine technological protections.
Why does helping out the US government mean that Apple has to help out some other government?
Because companies generally have to comply with what governments ask them to do, and if they do it for one government, they can't usually refuse to do it for another government. To summarize this piece from the New York Times, if the US po-po can get Apple to do this to help it unlock a phone, that means the Chinese po-po will be able to do the same thing.
Did I miss anything? Leave questions in the comments, email me (firstname.lastname@example.org), or tweet at me.