Here’s why the FBI forcing Apple to break into an iPhone is a big deal – USA TODAY
When U.S. Magistrate Sheri Pym ruled that Apple must help the FBI break into an iPhone belonging to one of the killers in the San Bernardino, Calif., shootings, the tech world shuddered.
Why? The battle over encryption “backdoors” has been longstanding in Silicon Valley, where a company’s success can be made or broken by its ability to protect customer data.
The issue came into the spotlight after Edward Snowden disclosed the extent to which technology and phone companies were letting the U.S. federal government spy on data being transmitted through their networks.
Since Snowden’s revelations, Facebook, Apple and Twitter have unilaterally said they will not create such backdoors.
So here’s the “backdoor” the FBI wants: Right now, iPhone users have the option to set a security feature that only allows a certain number of tries to guess the correct passcode to unlock the phone before all the data on the iPhone is deleted. It’s a security measure Apple put in place to keep important data out of the wrong hands.
Federal prosecutors looking for more information behind the San Bernardino shootings don’t know the phone’s passcode. If they guess incorrectly too many times, the data they hope to find will be deleted.
That’s why the FBI wants Apple to disable the security feature. With it disabled, agents could make unlimited passcode guesses until they hit the right one.
What does Apple have to say about this? The company hasn’t commented yet today, but back in December, Apple CEO Tim Cook defended the company’s use of encryption on its mobile devices in a broad interview with 60 Minutes, saying users should not have to trade privacy for national security. In the interview, Cook stood by the company’s refusal to turn over users’ encrypted texts and messages.
“There’s likely health information, there’s financial information,” Cook said, describing a user’s iPhone. “There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in.”
Cook says Apple cooperates with law enforcement requests but can’t access encrypted information on users’ smartphones. According to a page on Apple’s website detailing government requests, encrypted data is tied to the device’s passcode.
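“Tied to the device’s passcode” typically means the data-encryption key is derived from the passcode combined with a secret unique to the hardware, so no one, Apple included, can recreate the key without the code. The sketch below shows the general idea using standard PBKDF2 key stretching; it is an assumption-laden illustration, not Apple’s actual scheme, and `derive_key` and the device-secret handling are invented for the example.

```python
import hashlib
import os

def derive_key(passcode: str, device_secret: bytes) -> bytes:
    # PBKDF2 stretches the short passcode into a 32-byte key; the
    # per-device secret acts as the salt, so the same passcode yields
    # different keys on different devices.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), device_secret, 100_000
    )

device_secret = os.urandom(32)  # in a real phone, fused into the hardware
key = derive_key("7741", device_secret)  # hypothetical passcode
```

Because the key never exists apart from the passcode-plus-secret combination, handing over the phone, or even Apple’s own servers, doesn’t expose the data.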
Cook also dismissed the notion that iPhone users should swap privacy for security. “We’re America. We should have both.”
And the lawmakers? Well, they are torn. A key House Democrat, Rep. Adam Schiff, D-Calif., says Congress shouldn’t force tech companies to build encryption backdoors. Congress is struggling with how to handle the complex issue, especially after last November’s attacks in Paris, where investigators believe some of the terrorists used encrypted phone apps to communicate via the “Dark Web.” On the other side, Senate Intelligence Committee Chairman Richard Burr, R-N.C., and Vice Chair Dianne Feinstein, D-Calif., want to require tech companies to provide a backdoor into encrypted communications when law enforcement officials obtain a court order to investigate a specific person.
What now? This could push tech companies to give users access to unbreakable encryption. To some extent, it’s already happening. Companies like Apple and Google, responding to consumer demands for privacy, have developed smartphones and other devices with encryption so strong that even the companies can’t break it.