The Apple vs. FBI controversy going on right now is quite a techno-political drama. The FBI wants a terrorist's phone unlocked to read its contents. Simple enough, right? Unless it's an Apple iPhone! This has triggered one of the most important legal battles over the future of security: digital security vs. US national security. Strange? Let's have a look at what's at stake.
Feb 16: US Magistrate Judge Sheri Pym ordered Apple to build a custom version of its iOS software to unlock an iPhone 5C used by Syed Farook, one of the two terrorists who gunned down 14 people at a party in San Bernardino, California, in December. Apple refused, arguing that the order goes too far and that bypassing the passcode means creating a "backdoor" in its iOS mobile operating system that could be used to access every other iPhone.
Apple chief executive Tim Cook issued a letter on the Apple homepage entitled "A Message to Our Customers." The FBI had asked Apple "to build a backdoor to the iPhone," Cook wrote. "The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control. Ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect."
iPhone 5C Challenge
Introduced in 2013, iPhone 5C was Apple’s lowest-priced iPhone. Farook had the least expensive model: an 8GB version. Data is stored on a memory chip that’s soldered to the phone’s motherboard. It’s locked with a passcode. The FBI doesn’t have the code, and neither does Apple. The passcode is stored only on the device itself. Because of Apple’s built-in security, you have up to 10 tries to enter a passcode. After that, the iPhone wipes itself — that is, removes all the data stored on the device.
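The protections described above can be sketched in a few lines of code. This is an illustrative model only: the class name and the delay schedule are assumptions for the sketch, not Apple's actual implementation.

```python
class PasscodeGuard:
    """Illustrative model of the iPhone's passcode protections:
    escalating delays between failed attempts and a full wipe
    after 10 failures. Delay values are assumptions."""

    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False

    def delay_seconds(self) -> int:
        # No delay for the first few tries, then escalating waits
        # (illustrative schedule, not Apple's exact one).
        return 0 if self.failures < 5 else 60 * (self.failures - 4)

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False  # the data is already gone
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.MAX_ATTEMPTS:
            self.wiped = True  # device erases its stored data
        return False


guard = PasscodeGuard("1234")
for _ in range(10):
    guard.try_unlock("0000")        # ten wrong guesses...
print(guard.wiped)                  # True: the device has wiped itself
print(guard.try_unlock("1234"))     # False: even the right code is useless now
```

These two features together are what make casual guessing hopeless, and they are exactly what the court order asks Apple to switch off.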
iPhones running 2014's iOS 8 software or the newer iOS 9 protect their data with 256-bit AES (Advanced Encryption Standard) encryption, the same standard the US government approves for protecting its own sensitive data. Against a brute-force attack on the encryption itself, it could take years (in practice, far longer) to recover data by attacking the iPhone's memory chip directly.
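This asymmetry is the whole point of the FBI's request: attacking the 256-bit AES key directly is hopeless, but guessing the short passcode is fast once the delays and auto-wipe are gone. A rough back-of-the-envelope calculation, assuming roughly 80 ms per passcode attempt (the approximate hardware key-derivation cost Apple has described in its security documentation; treat the exact figure as an assumption):

```python
# Rough brute-force estimates for numeric passcodes, assuming the
# escalating delays and auto-wipe are disabled as the order requests.
SECONDS_PER_ATTEMPT = 0.08  # assumption: ~80 ms hardware key derivation


def worst_case_hours(digits: int) -> float:
    """Time to exhaust every numeric passcode of the given length."""
    return (10 ** digits) * SECONDS_PER_ATTEMPT / 3600


print(f"4-digit passcode: ~{worst_case_hours(4) * 60:.0f} minutes")
print(f"6-digit passcode: ~{worst_case_hours(6):.0f} hours")
```

A 4-digit code falls in minutes and a 6-digit one in about a day, while the 2^256 AES keyspace remains untouchable. The passcode, not the cipher, is the weak link.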
The iPhone is designed to run only iOS software created by Apple. For the phone to recognize that the software was made by Apple, the company must cryptographically sign each release with its private key. Even if the FBI built a new version of iOS itself, it would not have Apple's crucial signature.
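Conceptually, the check works like textbook digital-signature verification: the device recomputes a digest of the software and compares it with the digest recovered from the signature using Apple's public key. A toy sketch using tiny textbook RSA numbers (p=61, q=53; illustrative only, with no real security — real iOS uses full-size keys rooted in the boot ROM):

```python
import hashlib

# Textbook RSA toy key: n = 61 * 53 = 3233, e = 17, d = 2753.
N, E = 3233, 17
D = 2753  # private exponent: in reality, only Apple would hold this


def digest(firmware: bytes) -> int:
    # Hash the software and reduce into the toy key's range.
    return int.from_bytes(hashlib.sha256(firmware).digest(), "big") % N


def sign(firmware: bytes) -> int:
    # Only the holder of the private exponent D can compute this.
    return pow(digest(firmware), D, N)


def verify(firmware: bytes, signature: int) -> bool:
    # Anyone with the public key (N, E) can check a signature.
    return pow(signature, E, N) == digest(firmware)


blob = b"iOS build"
sig = sign(blob)
print(verify(blob, sig))            # True: signed by the key holder
print(verify(blob, (sig + 1) % N))  # False: a forged signature is rejected
```

Because signing requires the private exponent, anyone without Apple's key can verify software but cannot produce a signature the phone will accept, which is why the order compels Apple itself rather than asking the FBI to write the tool.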
The Court Order
The court order asks Apple to create a new, custom version of iOS that runs only on this specific iPhone and that makes three changes to the software. The first two changes would bypass or disable the auto-wipe function and the delay that limits how quickly new passcodes can be entered. The court also asks Apple to add a way to attach a cable or wirelessly connect to the iPhone so the FBI can automatically enter passcodes. That way, the FBI can use a supercomputer to bombard the phone with passcode guesses until it finds the right one.
“Backdoors,” in security parlance, are vulnerabilities used to access an otherwise closed system. Though the order does not require the creation of a backdoor per se, it does require Apple to disable core security features, which would allow the FBI to quickly and easily hack the phone. That is functionally a backdoor. If any black-hat hacker, foreign intelligence agency, or criminal syndicate got hold of this tool, they could exploit it for their own nefarious purposes.
Smartphones & a 227-year-old law
The government is invoking the All Writs Act, signed into law by President George Washington in 1789, to get Apple to change its software. Courts have read the Act to let them order people or businesses not party to a case to help execute court orders. Apple's defenders cite a 1977 Supreme Court case (United States v. New York Telephone Co.) holding that the government can't place "unreasonable burdens" on a third party it conscripts to assist law enforcement.
Over time, use of the All Writs Act has been more or less limited to extraordinary situations that no other law, statute, or provision covers. The shooter's locked iPhone is certainly such a situation, which explains why a law from 1789 is at play in a case about smartphones. In fact, Apple has reportedly complied with previous demands under the All Writs Act at least 70 times.
Apple already gave the FBI data that was backed up from Farook's phone to the company's iCloud online storage service. But Apple was able to give the FBI backups only through October 19, when Farook apparently stopped backing up the phone.
Apple had another possible solution: If the FBI placed Farook’s phone near a known Wi-Fi network (like the one at his home or his workplace), it might automatically create a new iCloud backup with the missing information. That idea was foiled when investigators reset Farook’s iCloud password. Apple also said it had been working with the government since the initial requests came in, recommending four different ways to recover the data without building the backdoor.
Apple informed the government and the Court that it would seek relief from the Court's Feb 16 order compelling it to assist agents in the search. On Feb 18 the Court held a telephonic status conference in the matter and heard the positions of the government and Apple. On Feb 19 the government filed a motion to compel Apple to comply with the Feb 16 order.
The Court will hear Apple's application for relief and the government's motion to compel on March 22; Apple must file its application for relief and opposition to the government's motion no later than Feb 26.
Patriotism vs. Privacy
The FBI and the Department of Justice say it’s about making sure Americans aren’t in jeopardy, about fighting terrorists who are using increasingly sophisticated communication tools, and about a reasonable request to gain evidence from a single iPhone.
Apple says the fight is about security and privacy for everyone, about the US government trying to compel a public company, using a 227-year-old law, to compromise its most important products, and about setting a “dangerous precedent” that gives the US authority to ask it and other businesses to change their products in the future.
Apple’s CEO Tim Cook argues that once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks.
Federal prosecutors claim Apple has less-than-pure motives in fighting the order: in their filing, they argue that Apple's refusal to comply, "despite the technical feasibility of doing so," appears to be based on "concern for its business model and public brand marketing strategy." Apple executives, meanwhile, vowed not to back down and accused the government of trying to use public pressure against the company, reiterating that their position is in the best interest of Apple's customers and the country, because implementing the order would necessarily place at risk the security of millions of other devices and the people who use them.
Beyond a single case
A chilling Precedent – Not only would the order require a company to create a new vulnerability potentially affecting millions of device users, it would also set a dangerous legal precedent. The next time an intelligence agency tries to undermine consumer device security by forcing a company to build new flaws into its own security protocols, the government will have a supportive case to cite.
The Brand Image – It is believed to be technologically possible for Apple to do what the court has demanded, but complying would be a devastating blow to its image around the world. Having seen the extent of government surveillance, many tech companies are under pressure to show customers that they aren't handing their data to the government. Apple is in a strong position to make that case, and doing so might get people to buy phones from Apple instead of the competition.
The Law – The fundamental question here isn’t whether the FBI gets access to this particular phone, it’s whether a catch-all law from 1789 can be used to effectively conscript technology companies into producing hacking tools and spyware for the government.
The Trade – The issue is not just about individual freedom and liberty but also about the very future of the U.S. technology industry. This is not just a law-enforcement matter; it is also a global trade issue that could erode the United States' competitive edge. Silicon Valley's competitors know that customers are up for grabs if the U.S. government weakens the quality and security of high-technology products by demanding intrusive access.
Many tech companies, including Facebook, Google, LinkedIn, Microsoft, Twitter, and Yahoo, have come out strongly in support of Apple and Cook. While technology companies have resisted government demands before, Apple's letter is one of the industry's most forceful pushbacks against a court order. These demands, the companies argue, would create a chilling precedent and obstruct their efforts to secure their products.
A narrow majority of Americans sides with the FBI in its fight to get Apple Inc. to help unlock the iPhone, according to a new survey. About 51% of Americans agree with the FBI, while 41% side with Apple, according to a poll of about 1,000 adults conducted this week by online-survey company SurveyMonkey Inc.
This week, rallies are scheduled in 30 cities outside Apple stores by consumers supporting the company. And surely more politicians will call for Apple to find a compromise with officials.
Apple's fight against the FBI is anything but simple, and no one knows how it will turn out. Experts expect the case to go all the way to the Supreme Court. Lawmakers have repeatedly rejected efforts to limit companies' ability to deploy encrypted communications, but they are likely to revisit the issue now. In the most basic sense, the debate is about privacy versus security, with valid arguments on each side.
What’s your take on this? Do share in the comments below…