There might not be an easy iFix for this feud.
A long-brewing conflict between one of the world’s largest and most recognizable companies and the FBI leapt into public view on Tuesday after a federal judge ordered Apple to help government investigators find a way into an iPhone used by one of the shooters in the San Bernardino, California, massacre last December.
So … what’s the deal? Is Apple really locking itself out of its own phones? Does everyone in the government agree on this issue, against Apple and other major tech companies? And what do cybersecurity experts and cryptographers think?
Who uses encryption, anyway? Is it all bad guys?
You have almost definitely used encryption, maybe every day. Along with almost everyone else you know. It’s an ancient technology that’s been used since the days of the Romans. Now, with a computer in everyone’s pocket, there’s just a lot more of it around, and it’s a lot more sophisticated than anything Julius Caesar ever used.
If you’ve ever made a mobile banking transaction, it probably involved encryption. Ordinary folks, multinational corporations, and even government agencies love BlackBerry smartphones because of the security of BlackBerry Messenger. Many websites and services like Dropbox use various layers of encryption. Encryption on its own isn’t something that’s weird or creepy. It’s basically a math problem so hard you can’t figure it out without a special key.
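To see the "special key" idea in its simplest possible form, here is a toy version of the cipher Julius Caesar himself reportedly used: shift every letter by a secret number. This is purely illustrative — modern encryption like the AES used on an iPhone rests on vastly harder math — but the core idea of a shared secret key is the same.

```python
# Toy Caesar cipher: shift each letter of the alphabet by a secret key.
# Illustrative only -- modern encryption (e.g., AES) relies on far harder
# math, but the concept of a secret key is the same.

def caesar(text: str, key: int) -> str:
    """Shift alphabetic characters by `key` positions, wrapping at 'z'."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

secret = caesar("attack at dawn", 3)  # encrypt with key 3
print(secret)                         # "dwwdfn dw gdzq"
print(caesar(secret, -3))             # decrypt: "attack at dawn"
```

Anyone who knows the key (here, 3) can reverse the scramble instantly; anyone who doesn’t has to guess. With a Caesar cipher there are only 25 possible keys, which is why nobody uses it anymore — modern keys have more possibilities than there are atoms in the observable universe.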
What does the FBI want?
Both Apple and the FBI want to do their job, as they see it. The FBI wants to fight bad guys and keep people safe. Apple wants to build cool new technology and keep its shareholders happy.
As more and more people use digital devices to communicate every day, everywhere around the world, some law enforcement agencies have looked around and said, oh crud, we can’t just open their mail or tap their phones anymore, even with a court order. They refer to this as the “going dark” problem.
In the case of the San Bernardino shooters, the judge has ruled that Apple has to provide “reasonable technical assistance” to help the government get info off the iPhone 5C used by Syed Farook. Apple has five days to respond to the court order, which essentially asks the company to build a program that would allow the FBI to try passcodes over and over again until it gets into that specific phone, decrypting the data.
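The attack the order would make possible is a simple brute-force search. A hypothetical sketch, assuming a stand-in `check_passcode` function (not a real iOS API): trying every four-digit combination takes at most 10,000 guesses. On a real iPhone this wouldn’t work, because iOS imposes escalating delays after wrong guesses and can erase the device after ten failures — those are precisely the protections the FBI wants Apple to switch off.

```python
from itertools import product

# Hypothetical sketch of a brute-force passcode search. On an actual
# iPhone this would fail: iOS adds escalating delays after wrong guesses
# and can wipe the device after ten failures -- the safeguards the court
# order asks Apple to bypass. `check_passcode` is a stand-in, not a real API.

SECRET = "7415"  # stand-in for the phone's unknown passcode

def check_passcode(guess: str) -> bool:
    return guess == SECRET

def brute_force() -> str:
    # Try every 4-digit combination, "0000" through "9999".
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if check_passcode(guess):
            return guess
    raise ValueError("passcode not found")

print(brute_force())  # finds "7415" after at most 10,000 tries
```

Without the delay and auto-erase safeguards, a computer can churn through all 10,000 four-digit codes in a fraction of a second — which is the whole point of the government’s request.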
It’s important to note that this particular court order doesn’t ask Apple to just go ahead and smash all its encryption on all devices with a hammer at once.
“There is no threat to mass surveillance here,” Lance James, a cybersecurity expert and chief scientist at NBC News partner Flashpoint, said in an email. “It was a reasonable search warrant request no different than a warrant to the free webmail services or Facebooks asking for data. You’re not giving them your keys to all your data, you’re only giving them the very specific data of the account that was requested.”
“Forensically speaking and legally speaking, the judge asked for reasonable assistance on unlocking this specific phone,” James said. “Even if that requires them to modify the firmware with a key they have, they don’t have to give that software to the FBI.”
Lots of law enforcement agencies, from the FBI to the New York Police Department, have said they want a way to peek in on what’s going on in these encrypted communications if they get a court order.
What does Apple want?
In January, Apple said that more than a billion of its devices had been active around the world in just the previous three months. Making sure all those people feel like the NSA isn’t staring at them through their iPhone camera is an important part of Apple’s business.
In a letter addressed to customers and published on Tuesday, Tim Cook said his company opposed the court order and placed the issue of this particular case in the larger frame of the cryptography debate.
“The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data,” Cook wrote. “The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”
What does the cybersecurity community think?
This new court order presents some wrinkles for the ongoing conversation. But there’s been a lot of chatter about the broader encryption issue in the tech community over the past year.
As encryption became a hot topic again after the terrorist attacks in Paris and San Bernardino at the end of last year, a flurry of reports from top cybersecurity and code experts has come out outlining the difficulties in trying to either stop the spread of encryption or build a backdoor into existing devices.
Two separate studies released in February by Harvard University’s Berkman Center for Internet & Society have addressed the crypto debate, and both urged a degree of caution in allowing increased government access.
The first study took aim at the “going dark” question, and led researchers to conclude that we are likely not headed for a future in which police don’t have the tools they need to catch bad guys. The second found that, even if encryption were weakened or halted in the United States, there are literally hundreds of encryption products, many of them free, available from other countries that criminals or terrorists could easily install and use.
Fifteen top cybersecurity and cryptography experts authored a report released in June 2015 that said weakening encryption or providing other workarounds for law enforcement would create new security challenges for users.
This fight could drag on for a while. The two sides are fairly well dug in, and the argument has been framed in stark terms that pit the threat of an invasive Big Brother against the specter of violent extremists.
The Electronic Frontier Foundation, a nonprofit digital rights group, said it will support Apple, warning that, “Even if you trust the U.S. government, once this master key is created, governments around the world will surely demand that Apple undermine the security of their citizens as well.”
And they probably won’t be the only group to step up in Apple’s defense. Close to 200 individuals, companies, and other groups — including a former CIA analyst, the ACLU, and Amnesty International — signed a letter asking President Barack Obama to oppose any encryption “back door.”