Presumably any "backdoor" is a security hole that can be accessed on any phone. So far, this request asks that Apple remove only the part of one phone's security that 1) destroys that phone's data after 10 failed passcode attempts, and 2) slows the automated entry of passcodes into that phone. As it stands, this case is specific to one phone and thus not about a general backdoor.
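To be concrete about what those two protections do, here is a minimal illustrative sketch in Swift (not Apple's actual code; the names are made up, and the delay schedule is roughly what Apple has documented publicly):

    import Foundation

    // Illustrative only -- not Apple's implementation. Two protections the order targets:
    // escalating delays between passcode attempts, and an optional wipe after 10 failures.
    struct PasscodeGuard {
        private(set) var failedAttempts = 0

        // Assumed delay schedule, roughly matching Apple's public documentation.
        func delayBeforeNextAttempt() -> TimeInterval {
            switch failedAttempts {
            case 0..<5: return 0        // no delay for the first few tries
            case 5:     return 60       // 1 minute
            case 6:     return 300      // 5 minutes
            case 7, 8:  return 900      // 15 minutes
            default:    return 3600     // 1 hour
            }
        }

        mutating func recordFailure(wipeDevice: () -> Void) {
            failedAttempts += 1
            if failedAttempts >= 10 {
                wipeDevice()            // the optional "Erase Data" setting
            }
        }
    }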

As the case stands, unless Apple can show how compliance weakens security on other phones, or that compliance is technically impossible, I don't see how they can refuse. If it is impossible, their refusal in this case may lead to a ruling that requires them to change iOS to comply with such requests in the future. But if they can comply now, only the security on one phone will be diminished. It's strongly in the company's interest to comply.

If Apple does not (cannot) comply, and subsequently they are ordered to change iOS to comply with future requests, would this be a "backdoor"? Yes. I don't see how Apple can add this "feature" to iOS and assure that only authorized legal authorities could activate it in the future.

Is this case about changing iOS to comply with future requests of this kind? Not yet.



Precedent is a serious issue in the U.S. While the end result of this case might not be an immediate, radical policy shift, it would certainly lay the groundwork for one in the future.

You give the devil an inch...

As others have said, this is an attempt to demonize encryption in the eyes of the public.

I truly do not know what to think of the matter, since Apple seems to enjoy a suspicious amount of support from the judicial branch. Tim's letter does seem to fly in the face of our government's desired narrative. Hopefully he gets his vehicle inspected frequently and keeps his closet free of skeletons.


If Apple creates new software that removes a part of one iPhone's security, that software can potentially be used on all iPhones. It's no different from what you said: "I don't see how Apple can add this "feature" to iOS [or in this case, create a tool to use on the current version of iOS] and assure that only authorized legal authorities could activate it in the future."


How is that different from "If I hand over this iCloud data, I'll need to create software to download iCloud data from a specific user, and that software can potentially be used on any iCloud account"? How is this not applicable to any warrant that requires technical ability to comply?


Apple doesn't make the same promises about encryption and privacy on their iCloud service that they do for their iPhones; that's the only difference. If you store information in iCloud and the FBI comes asking for it, Apple will hand it over. By the very nature of iCloud, Apple has access to your data. Not so with your iPhone. Apple says your device is encrypted and they cannot access it without your consent, end of story. Forcing them to create tools that invalidate this promise on one iPhone means creating tools that can invalidate the promise on any iPhone.


Saying you'll break the law before you do it does not make it legal. "We told our customers we won't do something" is not a legal defense.

By the very nature of the iPhone 5C, Apple can put whatever software they want on it with physical access.

Also, note that the phone is owned by the government. They provided it to the terrorist, who worked for the government. So there's no promise broken to customers, because the customer is the one asking for this.


I'm not saying anything about the (il)legality of Apple refusing the FBI's request, or whether their iPhone promises constitute a legal defense. At this point it's neither legal nor illegal, and the courts will take up this case soon (right now, Apple is arguing that precedent is on their side. From the NYT: "In a 1977 case involving the New York Telephone Company, the Supreme Court said the government could not compel a third party that is not involved in a crime to assist law enforcement if doing so would place “unreasonable burdens” on it.").

I've only been responding to the claim that this is only about "one iPhone", as if there would be no impact on all iPhones as a result.


Apple lost in court. They may win on appeal, or not, but as of now they've lost. I think that "it may or may not be legal" is not correct.

Re one iPhone: I responded that the same could be said about any warrant. The actual order requires them only to modify one iPhone. Whether that proves they can do it for others doesn't matter. If they need to do it in this case, they need to do it whenever the government has a warrant. That's not a slippery slope, and it doesn't affect any devices without a warrant.


The legality is still in question, then. If they can still win on appeal, it's not definitively illegal.

As for the rest, I only pointed out the difference between this request and an iCloud request, which you asked about (rhetorically, I know). This iPhone request plausibly places an undue burden on Apple that the iCloud requests do not, so it's different (including from a legal standpoint). If they eventually lose and the Supreme Court says "No it's not an undue burden, now go hack the phone!" then fine, the highest court in the land will have declared your analogy sound. Right now, that's very much in dispute.


1. This is quickly getting into the philosophy of law. Every case can be struck down by either appeal or a later case overturning precedent. I think such cases should be thought of as "it was illegal, but now it has been changed by the court". So once someone's lost, we call their actions illegal until they appeal and win.

2. You said that complying would open the door to doing it for other phones. In that regard, complying with an iCloud request also opens the door to complying with future iCloud requests. The fact that it creates the ability (for Apple) to do it later has no effect on legalities.

There may be a difference on undue burdens, but that's a different point. Your point would still be invalid. "Creating new software for FBiOS is an undue burden, but creating new software to download a user's iCloud data isn't" is a different argument than "creating new software means we can use it again later". The latter is the argument you made, and it doesn't differentiate between iCloud and FBiOS.

If we're concerned the FBI will take the software and use it without a warrant, Apple was given the option to do everything on their own premises and just give the FBI the data/unlocked phone when done.


> You said that complying would open the door to doing it for other phones. In that regard, complying with an iCloud request also opens the door to complying with future iCloud requests.

There is one crucial difference. In the iCloud case, the government must ask Apple any time they want to get iCloud user data. Each time, Apple can verify that they have a court order before doing the work. So it's "stateless" in that the first request doesn't "open the door" for later requests.

In the iPhone case, the government is asking for an OS (signed by Apple) that can be flashed onto any device. This new OS would have a giant backdoor that disables all the protections of an iPhone. We all know there is no way to prevent this OS from being used elsewhere, for other uses. This is not stateless -- once Apple creates this OS, there is no going back; all phones are now insecure.


I went into this elsewhere.

Every update to any iOS device (well, any since the iPhone 3GS) requires a signature unique to that device. Since iOS 5, it also includes a nonce generated on-device at the time of upgrade, so you can't even replay the signature; it needs to be signed at the time you install the new version.

This is why you can't downgrade to earlier versions of iOS after Apple stops signing them.
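To make that concrete, here is a rough illustrative sketch in Swift (not Apple's actual TSS/SHSH protocol; the types and function are made-up names): the device accepts an update only if Apple's signature covers both the device's unique ID and a nonce generated for that specific restore, so a signed image can't be replayed on another phone or at a later time.

    import Foundation

    // Illustrative sketch only. The point is that the signature binds the
    // firmware image to one specific device and one specific restore session.
    struct SignedUpdate {
        let firmwareHash: Data
        let deviceID: String      // e.g. the ECID burned into the phone
        let nonce: Data           // generated on-device for this restore
        let appleSignature: Data
    }

    func acceptUpdate(_ update: SignedUpdate,
                      myDeviceID: String,
                      myCurrentNonce: Data,
                      verify: (Data, Data) -> Bool) -> Bool {
        // The signature must verify over the exact (firmware, device, nonce) tuple...
        let payload = update.firmwareHash + Data(update.deviceID.utf8) + update.nonce
        guard verify(payload, update.appleSignature) else { return false }
        // ...and that tuple must match this device and this restore session,
        // so a ticket signed for another phone (or an old session) is rejected.
        return update.deviceID == myDeviceID && update.nonce == myCurrentNonce
    }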

Anyway, I hope it's clear now why that was wrong.


> Also, note that the phone is owned by the government.

There's the irony in this situation.

The reason nation states actually purchase iPhones is BECAUSE of the security they provide.


He wasn't working in a security role, so that would not have been a concern for whoever made that decision.


Other government agencies that do actually have people in security roles will be less inclined to issue iPhones.

This case sets precedent, that's the whole point.


But Apple is only able to do anything at all because it's an old phone without security features present in later models.

Also, any other manufacturers' phones would be trivially hacked in the same case, because the firmware can be overwritten without needing to be signed. Apple is pretty much the only one that requires signing.


I really don't understand how anyone can claim "this case is specific to this phone" like that means anything at all. Of course the case is specific to this phone. And the next case will be specific to the next phone. How could any case not be "specific" to the case at hand? It's just a tautology.

Can the government compel Apple to create new functionality to enable cracking their own phones' security? Any reasonable discussion on the topic must consider: if the answer is "yes", then why is the answer yes in this case, and perhaps no in another? What (if any) are the specific conditions under which they can be forced to assist? What legal precedent is being set, and is there anything that makes this case special?

Is the answer "yes, Apple must assist" because the subject of the investigation is a mass murderer? As far as I know, there is no such law. As I understand it, the specific crime or facts of the case are completely irrelevant to the legal argument the FBI is presenting. There is encrypted data on a phone that they think could be useful in investigating a federal crime of some kind. Nothing more.

The legal justification is crucially important. The FBI is arguing Apple must comply as a matter of "All Writs" and if they succeed in this line of argument, then from what I understand it clearly follows that any time the government wants access to encrypted data, and has either permission of the device owner or a warrant, then they can compel Apple to assist in recovering that data.

So I'm fairly convinced this case is absolutely not specific to "one phone". At most this case is "specific" to compelling the development of custom software and firmware after the fact to bypass a device's existing security measures and access otherwise encrypted data. If they are forced to comply in this case, either explain why they can't be forced to comply in the next federal case, or stop claiming that the security on only "one phone" will be diminished.

This is no better than the word wrangling the NSA does around "collecting" data. The idea that it's not a "backdoor" just because it is deployed in real-time on a per-phone basis versus being pre-deployed across all phones is balderdash. This is a distinction without a difference. How the backdoor is deployed doesn't make it any less of a backdoor to the person who has the capability. And of course iPhones clearly have the capability of deploying said backdoor over-the-air just as easily as over the lightning cable.

That the firmware image might have to be re-signed by Apple's code-signing key because it includes a hard-coded identifier for a specific device is equally irrelevant, unless you can explain why the government can't compel Apple to generate these firmware builds on-demand whenever they want them.
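As a rough illustration of how thin that per-device binding is (hypothetical Swift; "authorizedECID" is a made-up placeholder, not a real identifier): the only phone-specific piece of such a tool would be a hard-coded identifier check, so producing the build for the next phone is a matter of changing one constant and re-signing.

    // Hypothetical illustration only. The sole per-phone element of such a build
    // would be a check like this against a hard-coded device identifier.
    let authorizedECID = "0xDEADBEEF"   // placeholder, not a real ECID

    func mayRunOnThisDevice(currentECID: String) -> Bool {
        return currentECID == authorizedECID
    }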


If for each phone to be opened, the government must first get a search warrant, and second approach Apple with phone in hand and request that only that phone be cracked... this does not constitute a backdoor. This is just due process.

If Apple does not comply, the courts might 1) reject Apple's claim they cannot open the phone and respond to that (e.g. do nothing or jail Cook for contempt), or 2) accept Apple's claim and respond to that (e.g. do nothing or demand that Apple develop a means to crack only this phone, given that model and version of iOS.)

As additional requests to crack other phones arise in future cases, this new Apple service is likely to be requested again, with new court order in hand. But adding such a capability within the services that Apple Corp. provides to law enforcement does not constitute a "backdoor", at least not in a technical sense. It becomes a backdoor only if it's accessible to others without following legal due process. That's the theory anyway.

In practice, would this create a vulnerability that could be abused outside due process? Probably yes. But what's more likely and more dangerous is that the US government would continue to say, "Fuck due process", and once the tech exists, they would hack it themselves and then invite every law enforcer down to dog catcher to circumvent due process and abuse the system, the way it has with the Stingray program among numerous others.

So yes, I agree that this would be a very bad precedent. But I think it's imprecise to call it a technical backdoor. It's more a pre-malicious legal loophole, like so many of the practices adopted by US law enforcement since 9/11.


I think we need to be very precise and consistent with the language. A "backdoor" is a coded method to allow bypassing encryption without authorization. That the backdoor may only be exploited after legal due process doesn't make it any less of a backdoor. A "backdoor" by definition includes some attempt at making it non-publicly accessible (otherwise it would be a front door!)

Today, there is a vulnerability that exists on the 5C. If Apple writes the code to enable them to exploit this vulnerability, then they have created the backdoor. Reasonable people may then disagree over whether the backdoor will remain secure from unlawful use, or whether we can even trust that the system Apple and the government set up together will ensure the backdoor is only used lawfully.

The biggest problem I think many people are missing is even if the backdoor is only ever used with "legal due process", the established legal precedent for getting Apple to create the backdoor would now be "All Writs". It's right there in the name of the law -- now we have a precedent that says, any time the government can show legal due process to want a backdoor created in a device which might possibly assist in some federal investigation, the manufacturer would be required to create such a backdoor for the government. Well, fuck.

It seems to me the only logical conclusion is that every electronic device you own will be backdoored to have the capability to spy on you. Your car, your phone, your security system, your thermostat, your entire Internet of Things, now all just investigative tools at the government's disposal. I mean, this is already true for all cloud services (see CALEA), but there was at least some hope it would not extend to our own personal property.

Now maybe you think Apple actually is capable of creating a backdoor while effectively controlling access. The iPhone is perhaps the most well-funded attempt in history to create a truly secure consumer electronic device, and clearly they failed to create a secure 5C, and possibly even failed to fully secure the latest 6S if rumors are true that they can re-flash the secure enclave on a locked device without cycling the encryption keys. Apple ships and patches vulnerabilities just like the rest of us humans. So, I personally wouldn't trust even them. But how about every other device manufacturer on the planet?

I really hope we don't wake up in a few years in this nightmare scenario, but one thing you can be absolutely certain of is that this is exactly the game plan. It started with the Clipper chip, and the careful planning and execution over the last several years to set up this scenario and win this capability is obvious. There are public statements and documented proof that certain officials have been specifically planning to leverage the next available terrorist attack toward these ends. Apple already provided an unencrypted iCloud backup of the device in question! Apple asked for the order to decrypt the device to be put under seal, and the government objected! This is the fight the government has been waiting for.

As Senator Frank Church spoke of the NSA, "I know the capacity that is there to make tyranny total in America, and we must see to it that this agency and all agencies that possess this technology operate within the law and under proper supervision, so that we never cross over that abyss. That is the abyss from which there is no return." I believe the capability for the government to compel creation of backdoors in our personal devices is a bridge into this abyss. It is a capability far too powerful to be entrusted to those who believe they are doing the just and Godly work of Protecting the United States of America. In their furious pursuit of ever greater surveillance power, I believe these civil servants are not only recklessly endangering our security and privacy, but paving the road to the destruction of our civil liberties and perhaps even our democracy itself.

"The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." Not every search can be authorized by a warrant, it must also be reasonable. For the safety and security of our citizens, for the preservation of liberty, backdoors to our personal devices should be deemed prima facie unreasonable in any case.

Sadly I have very little hope the Supreme Court will lean this way, and I can only see that our security and privacy will suffer greatly as these backdoors are deployed and ultimately exploited. I have seen Pandora's box, and this is it.



