A polarizing legal debate that’s engulfed the nation has almost everyone talking.
Should Apple be forced to help the FBI unlock a phone belonging to a terrorist? The arguments are simple enough, but the ramifications and precedent that they set could undermine trust at the foundations of Silicon Valley, one of the largest industries in the world.
The FBI scores a game-changing win over tech firms in the ongoing encryption dispute.
US magistrate judge Sheri Pym ruled Tuesday that the iPhone and iPad maker must provide a tool that would allow federal agents to beat a security feature that erases the phone after a number of failed unlocking attempts, according to the AP.
The court ruling did not order Apple to break the encryption, but said it should offer “reasonable technical assistance” to law enforcement.
The iPhone 5c was a work phone used by Syed Farook, who along with his wife, Tashfeen Malik, murdered 14 people in San Bernardino, California in December 2015.
Federal agents don’t know the passcode to the phone, and risk erasing all its data with too many wrong guesses. But Apple doesn’t have access to the passcode either: the company deliberately locked itself out of the security chain so that law enforcement could not demand it hand over customer data.
Apple’s bid to shut itself out of the encryption loop was precisely to avoid the kind of ethical dilemma that would force it into handing over customer data to the authorities.
More than 94 percent of all iPhones and iPads run iOS 8 or later, and can therefore be encrypted.
Apple chief executive Tim Cook said in an open letter hours after the ruling that it “opposes” the order because it has “implications far beyond the legal case at hand.”
Simply put: if Apple can be forced to hack one iPhone, where will it end?
The case is ever-changing and developing over time. We’ve collated as many questions as we can, and will update over the next few hours. If you have a specific question, send an email, or leave a comment below.
Here’s what you need to know.
What is Apple specifically being asked to do?
Apple can’t break the encryption on the iPhone (or its other products), so the FBI has instead asked the company to disable certain security features so that its agents can unlock the iPhone.
The FBI wants to create a special version of the iPhone’s software that only works on the recovered device. Apple has to sign it with its secret keys in order to install it on the subject’s iPhone. This custom version will “bypass or disable the auto-erase function” so it will not wipe the phone after a number of failed passcode guesses.
Apple must also modify the software so that the subject’s iPhone will not “purposefully introduce any additional delay between passcode attempts beyond what is incurred by Apple hardware.” That hardware delay is currently about 80 milliseconds, which limits the FBI to roughly 12 passcode guesses each second. Farook reportedly used a four-digit passcode, says the BBC, which at that rate could take just minutes to crack. And instead of forcing someone to type in passcodes manually, Apple must “enable the FBI to submit passcodes” to the subject’s iPhone from an FBI device.
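Those timing estimates are easy to sanity-check. A minimal sketch, assuming only the roughly 80 millisecond per-attempt hardware delay reported above, with no software delays and no auto-erase:

```python
# Estimate worst-case brute-force time for an iPhone passcode,
# assuming only the ~80 ms per-attempt delay incurred by Apple
# hardware (the delay the court order says must not be exceeded).

ATTEMPT_DELAY_S = 0.080  # ~80 milliseconds per guess

def worst_case_seconds(digits: int) -> float:
    """Time to try every numeric passcode of the given length."""
    keyspace = 10 ** digits  # e.g. 10,000 combinations for 4 digits
    return keyspace * ATTEMPT_DELAY_S

# A 4-digit passcode: 10,000 guesses at ~12.5 guesses/second.
print(f"4 digits: {worst_case_seconds(4) / 60:.1f} minutes")
# A 6-digit passcode takes 100x longer.
print(f"6 digits: {worst_case_seconds(6) / 3600:.1f} hours")
```

The four-digit case comes out to roughly 13 minutes, which is why the reporting describes the phone as crackable “in minutes” once the delays and the auto-erase feature are out of the way.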
The FBI will ship the iPhone to Apple, so that the company’s proprietary code and secret keys never leave its campus.
What kind of iPhone is subject to this order?
Farook’s phone was an iPhone 5c, running the latest version of the mobile software, iOS 9. The phone belonged to the county he worked for, San Bernardino Dept. of Public Health, which has given the government permission to search the phone.
The problem is that, because the phone is encrypted, the government can’t search it.
What is the legal basis for the FBI’s court order? What law was used?
Apple is essentially being forced to punch a hole in the security of its own product.
The judge invoked a little-known law dating back almost 230 years. The All Writs Act gives a court the “authority to issue [orders] that are not otherwise covered by statute,” so long as the request is not impossible.
Apple has, however, said it has the “technical capability” to extract data from about one in ten iPhones.
Forcing Apple to reverse its encryption, the argument goes, would be “substantially burdensome,” but asking it to remove the feature that erases the phone after ten failed passcode attempts is not.
The government’s invocation of the All Writs Act could set, in Cook’s words, a “dangerous precedent” down the line. That’s because the government’s position is that “coding is not burdensome,” according to Andrew Crocker, a staff attorney at the Electronic Frontier Foundation.
“The scope of authority under the [All Writs Act] is just very unclear as applied to the Apple case,” said Orin Kerr, a professor of law, in the Washington Post. “This case is like a crazy-hard law school exam hypothetical in which a professor gives students an unanswerable problem just to see how they do.”
Kerr has examined the case in greater depth in the Post.
Surely the NSA can crack the iPhone. Why hasn’t it? Is there some ulterior motive behind this legal move?
Some believe that the National Security Agency (NSA) can probably crack the iPhone. The agency, embroiled in mass surveillance programs in recent years, has reportedly hacked into companies’ networks to steal secret codes in order for its spies to get access to people’s phone calls, messages, and even their smartphones.
Apple said the FBI’s demands will set a “dangerous precedent.” That’s the key: the argument is that the FBI could do this itself if it really wanted to, but the government is “desperate to establish” the legal case, said Christopher Soghoian, principal technologist at the American Civil Liberties Union.
The ramifications, and the precedent the case sets, could undermine trust at the foundations of Silicon Valley, hamper growth, and force foreign companies to look elsewhere.
Is Apple being asked to bypass or break the iPhone’s encryption?
It comes down to semantics. Technically, no: at no point has anyone suggested that Apple’s use of encryption, or the cryptography itself, is insecure.
The court order does not demand that Apple bypass the encryption, because Apple can’t. Instead, the company has been asked to remove a feature so that the FBI can attempt as many passcode entries as it wants. That ability to brute-force passcodes without limit could itself be considered a significant flaw in the phone’s security.
How does the iPhone’s passcode-protected encryption work?
It’s relatively simple: If you have a passcode on your iPhone running iOS 8 or later, the contents of your phone are scrambled. When you enter your four or six-digit passcode, it immediately unlocks your phone.
The passcode is coupled with a key embedded in the phone’s hardware, in a component called the “secure enclave.” Because it’s part of the actual hardware, the key can’t be modified.
Security researcher Dan Guido, who has been extensively cited on this case, explained this in a bit more detail on his blog:
“When you enter a passcode on your iOS device, this passcode is ‘tangled’ with a key embedded in the [secure enclave] to unlock the phone. Think of this like the 2-key system used to launch a nuclear weapon: the passcode alone gets you nowhere. Therefore, you must cooperate with the secure enclave to break the encryption. The secure enclave keeps its own counter of incorrect passcode attempts and gets slower and slower at responding with each failed attempt, all the way up to 1 hour between requests.”
He said that even a customized version of iOS “cannot influence the behavior of the Secure Enclave,” meaning any iPhone that has a secure enclave can’t just be modified by Apple.
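The “tangling” Guido describes can be illustrated with a toy key-derivation sketch. This is not Apple’s actual algorithm, and the device secret here is a made-up placeholder; PBKDF2 merely stands in for the idea that the passcode alone, without the key baked into that specific phone’s hardware, is useless:

```python
# Toy illustration of passcode "tangling": the encryption key is
# derived from BOTH the user's passcode and a secret unique to the
# device, so neither is useful on its own. This is NOT Apple's real
# scheme; PBKDF2 stands in for the hardware-entangled derivation.
import hashlib

# Hypothetical per-device hardware secret (Apple's real UID key
# never leaves the silicon and cannot be read out like this).
DEVICE_UID = bytes.fromhex("00" * 32)

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    # The device secret acts as the salt: the same passcode on a
    # different device yields a completely different key.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

key_a = derive_key("1234", DEVICE_UID)
key_b = derive_key("1234", bytes.fromhex("ff" * 32))  # same passcode, other device
assert key_a != key_b  # the passcode alone gets you nowhere
```

Because the derivation is anchored to the phone itself, an attacker can’t copy the encrypted data off the device and guess passcodes at full speed on a data center’s worth of machines; every guess has to go through that one phone’s hardware, which is exactly where the delays and the auto-erase counter live.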
The FBI wants to unlock an iPhone 5c, which doesn’t have a “secure enclave.” Can Apple comply with this court order?
Yes. According to Guido, the FBI’s requests are “technically feasible” in this case, because Apple is able to modify the iPhone 5c’s software to remove the security features.
Guido noted on his blog:
“On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry. In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI.”
Apple has not claimed that it lacks the technical means to comply.
What about other iPhones? Is it possible to unlock other, newer iPhones?
A senior Apple executive speaking to the media on background (reporters were asked not to name executives or quote them directly) said Apple is fighting for all its iPhones, not just the terrorist’s phone.
“The custom software tool the FBI has ordered it to develop in order to crack into a dead terrorist’s iPhone 5c would be effective on every type of iPhone currently being sold,” reports Motherboard, one of the news outlets on the call.
Apple executives said that the request was “unduly burdensome” — its main argument against carrying out the order — and that it could take weeks or months to carry out.
It’s worth noting that Apple can bypass the passcode on devices running software prior to iOS 8, with or without a court order.
If this sets a legal precedent, other companies could be forced to perform similar actions. Who else in the tech industry supports Apple?
At first, Silicon Valley was muted. It wasn’t clear why. Some were worried they might make themselves targets, or lose government contracts down the line.
Sundar Pichai, chief executive of Google, called in a series of tweets on Wednesday for “a thoughtful and open discussion on this important issue.” Pichai stopped short of demanding an end to the FBI’s offensive, but did say that hacking of devices could set a “troubling precedent.”
Some saw it as a voice of support, whereas others thought it was a weak statement.
Jan Koum, chief executive of WhatsApp, published a post on Facebook (which owns WhatsApp) in support of Apple’s stance. “We must not allow this dangerous precedent to be set. Today our freedom and our liberty is at stake,” he said.
Twitter boss Jack Dorsey said on Twitter that he supported Cook’s decision, tweeting: “We stand with @tim_cook and Apple (and thank him for his leadership)!”
Other companies associated with the Reform Government Surveillance coalition, which includes Microsoft and Yahoo — two firms also implicated in the PRISM surveillance program — offered tepid support.
“RGS companies remain committed to providing law enforcement with the help it needs while protecting the security of their customers and their customers’ information,” the statement read.
Republican presidential frontrunner Donald Trump called for “common sense” to prevail and for Apple to work with the FBI. Trump said he “100 percent” agreed with the courts. “But to think that Apple won’t allow us to get into her cell phone, who do they think they are? No, we have to open it up,” he said.
No presidential candidate has yet endorsed or spoken out in favor of Apple’s move.
The FBI says it’s not impossible, and the court has issued an order. So why is Apple refusing to comply with the court order?
Cook said in an open letter published on Apple’s website that the court’s demands “would undeniably create a backdoor” for the FBI.
Apple argues that introducing a backdoor into the iPhone wouldn’t just make Farook’s phone insecure, it would make every iPhone weaker. As pointed out by The Guardian, the argument that Apple is somehow “helping” the terrorists isn’t fair. Because encryption (and other technologies) is inherently agnostic, Apple cannot pick and choose whom it protects: either everyone’s privacy is protected, or no one’s.
Cook said the FBI had “asked us for something we simply do not have, and something we consider too dangerous to create.” It would open a Pandora’s box for iPhone security.
Why did Apple begin to roll out passcode-protected encryption in the first place?
Some argue the US government itself sparked Apple’s move to encrypt its devices in the first place.
The move to add encryption was in part a response to accusations that the company was complicit in the PRISM surveillance program, leaked by whistleblower Edward Snowden — a claim the company strenuously denies. Apple set itself apart from the rest of the crowd by bolstering its encryption in such a way that it is impossible for the company itself to decrypt the data.
Cook said in an interview with PBS’ Charlie Rose at the time that if the government laid a warrant at its door, “We don’t have a key. The door is closed.”
Apple announced the switch to encryption the day iOS 8 was released, in September 2014, likely to preempt any government pushback.
Edward Snowden said in a tweet following the court ruling that the FBI was “creating a world where citizens rely on Apple to defend their rights, rather than the other way around.”
What’s stopping other countries and repressive regimes, like Russia and China, making similar demands?
The US won’t be the only country wanting this power. If the US can have it, why can’t Russia, or China, or any other major global powerhouse? Because Apple is headquartered in the US, it has to abide by US law. But it must also adhere to the laws of every country in which it operates. That can get tricky very quickly.
Sen. Ron Wyden (D-OR), a member of the Senate Intelligence Committee and staunch privacy advocate, said the move could easily “snowball” around the world. “Why in the world would our government want to give repressive regimes in Russia and China a blueprint for forcing American companies to create a backdoor?” he added.
China could impose rules forcing Apple to hand over encryption keys — or some backdoor technology that the US has demanded — or it could stop the company from operating in China. That could be a massive blow to the company, whose mainland China revenue accounted for almost half of its global revenue as of its first fiscal quarter.
Apple told reporters that “no other country in the world has asked them to do what DOJ seeks.”
But it’s not just oppressive nations. The UK has a draft surveillance bill in its parliament, which if it passes, could demand the same “secret backdoors” that the FBI sought. (Vice’s Motherboard has more on this.)
Can Apple appeal this case?
Apple has until February 26 to respond to the court order. A hearing is expected on March 22, according to Reuters. If Apple were to challenge the order (which is expected), it will appeal to the Ninth Circuit appeals court.
It’s possible this case may go all the way to the Supreme Court, but only if the government “loses big” at the appeals court, Nate Cardozo, a staff attorney at the EFF, said in a tweet.