If, for each phone to be opened, the government must first get a search warrant and then approach Apple with the phone in hand to request that only that phone be cracked... this does not constitute a backdoor. This is just due process.
If Apple does not comply, the courts might 1) reject Apple's claim that it cannot open the phone and respond to that (e.g. do nothing, or jail Cook for contempt), or 2) accept Apple's claim and respond to that (e.g. do nothing, or demand that Apple develop a means to crack only this phone, given that model and version of iOS).
As additional requests to crack other phones arise in future cases, this new Apple service is likely to be requested again, with a new court order in hand. But adding such a capability to the services that Apple Corp. provides to law enforcement does not constitute a "backdoor", at least not in a technical sense. It becomes a backdoor only if it's accessible to others without following legal due process. That's the theory, anyway.
In practice, would this create a vulnerability that could be abused outside due process? Probably yes. But what's more likely and more dangerous is that the US government would continue to say, "Fuck due process," and once the tech exists, hack it themselves and then invite every law enforcer down to the dog catcher to circumvent due process and abuse the system, the way it has with the Stingray program, among numerous others.
So yes, I agree that this would be a very bad precedent. But I think it's imprecise to call it a technical backdoor. It's more a pre-malicious legal loophole, like so many of the practices adopted by US law enforcement since 9/11.
I think we need to be very precise and consistent with the language. A "backdoor" is a coded method of bypassing encryption without authorization. That the backdoor may only be exploited after legal due process doesn't make it any less of a backdoor. A "backdoor" by definition includes some attempt at making it non-publicly accessible (otherwise it would be a front door!)
Today, there is a vulnerability that exists on the 5C. If Apple writes the code that enables them to exploit this vulnerability, then they have created the backdoor. Reasonable people may then disagree about whether the backdoor will remain secure from unlawful use, or whether we can even trust the system that Apple and the Government set up together to ensure the backdoor is only used lawfully.
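To make the 5C point concrete, here's a minimal sketch of why the brute-force protections are the whole ballgame. This is purely illustrative Python, not Apple's actual key-derivation code; the names and constants are assumptions. The idea is that on the 5C the attempt counter, escalating delays, and auto-wipe live in signed firmware rather than in a Secure Enclave, so firmware with those protections removed reduces the problem to trying every short passcode (the real derivation is also entangled with a hardware UID, which is why the guessing has to run on the device itself):

    # Illustrative only -- NOT Apple's implementation. Assumes a simplified model
    # where the data-protection key is derived by stretching the passcode with a
    # device-unique secret, and the only brute-force protections (attempt counter,
    # delays, auto-wipe) live in software that Apple could be compelled to replace.
    import hashlib
    import itertools
    import os

    DEVICE_UID = os.urandom(32)      # stand-in for the per-device hardware key
    WORK_FACTOR = 50_000             # stand-in KDF cost (~tens of ms per guess)

    def derive_key(passcode: str) -> bytes:
        """Stretch the passcode together with the device UID into a key."""
        return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, WORK_FACTOR)

    # Enrollment: the user picks a 4-digit passcode; the phone keeps only a verifier.
    stored_verifier = derive_key("7294")

    # With the retry limit, delays, and wipe patched out of the firmware, all
    # 10,000 four-digit passcodes can be tried in minutes.
    for guess in ("".join(d) for d in itertools.product("0123456789", repeat=4)):
        if derive_key(guess) == stored_verifier:
            print("Recovered passcode:", guess)
            break

Once the counter and delays are gone, the only remaining brakes are the passcode length and the per-guess work factor, which is why the government's request centers on removing those protections rather than on breaking the encryption itself.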
The biggest problem I think many people are missing is that even if the backdoor is only ever used with "legal due process", the legal precedent established for getting Apple to create the backdoor would now be "All Writs". It's right there in the name of the law -- now we have a precedent that says any time the government can show legal due process and wants a backdoor created in a device that might possibly assist in some federal investigation, the manufacturer would be required to create such a backdoor for the government. Well, fuck.
It seems to me the only logical conclusion is that every electronic device you own will be backdoored to have the capability to spy on you. Your car, your phone, your security system, your thermostat, your entire Internet of Things, now all just investigative tools at the government's disposal. I mean, this is already true for all cloud services (see CALEA), but there was at least some hope it would not extend to our own personal property.
Now maybe you think Apple actually is capable of creating a backdoor while effectively controlling access. The iPhone is perhaps the best-funded attempt in history to create a truly secure consumer electronic device, and clearly they failed to create a secure 5C, and possibly even failed to fully secure the latest 6S, if rumors are true that they can re-flash the Secure Enclave on a locked device without cycling the encryption keys. Apple ships and patches vulnerabilities just like the rest of us humans. So, I personally wouldn't trust even them. But how about every other device manufacturer on the planet?
I really hope we don't wake up in a few years in this nightmare scenario, but one thing you can be absolutely certain of is that this is exactly the game plan. It started with the Clipper chip, and the careful planning and execution over the last several years to set up this scenario and try to win this capability is obvious. There are public statements and documented proof that certain officials have been specifically planning to leverage the next available terrorist attack towards these ends. Apple already provided an unencrypted iCloud backup of the device in question! Apple asked for the order to decrypt the device to be put under seal, and the government objected! This is the fight the government has been waiting for.
As Senator Frank Church said of the NSA, "I know the capacity that is there to make tyranny total in America, and we must see to it that this agency and all agencies that possess this technology operate within the law and under proper supervision, so that we never cross over that abyss. That is the abyss from which there is no return." I believe the capability for the government to compel the creation of backdoors in our personal devices is a bridge into this abyss. It is a capability far too powerful to be entrusted to those who believe they are doing the just and Godly work of Protecting the United States of America. In their furious pursuit of ever greater surveillance power, I believe these civil servants are not only recklessly endangering our security and privacy, but paving the road to the destruction of our civil liberties and perhaps even our democracy itself.
"The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." Not every search can be authorized by a warrant, it must also be reasonable. For the safety and security of our citizens, for the preservation of liberty, backdoors to our personal devices should be deemed prima facie unreasonable in any case.
Sadly I have very little hope the Supreme Court will lean this way, and I can only see that our security and privacy will suffer greatly as these backdoors are deployed and ultimately exploited. I have seen Pandora's box, and this is it.