> What the court is ordering Apple to do, security experts say, does not require the company to crack its own encryption, which the company says it cannot do in any case. Instead, the order requires Apple to create a piece of software that takes advantage of a capability that Apple alone possesses to modify the permanently installed “firmware” on iPhones and iPads, changing it so that investigators can try unlimited guesses at the terror suspect’s PIN code with high-powered computers. Once investigators get the PIN, they get the data.
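To put rough numbers on why removing the limits matters, here is a back-of-the-envelope sketch in Python. The ~80 ms per attempt is an assumption (the commonly cited cost of the hardware-entangled key derivation), not a figure from the article:

```python
# Rough arithmetic only: assuming ~80 ms per hardware-bound passcode
# attempt (an assumed figure, not from the article), this is what
# "unlimited guesses" buys an attacker.
ATTEMPT_COST_S = 0.080  # hypothetical cost per guess, in seconds

for digits in (4, 6):
    guesses = 10 ** digits
    hours = guesses * ATTEMPT_COST_S / 3600
    print(f"{digits}-digit PIN: {guesses:,} guesses, worst case ~{hours:.1f} h")

# 4 digits: 10,000 guesses, ~0.2 h; 6 digits: 1,000,000 guesses, ~22.2 h.
# With auto-erase and escalating delays intact, an attacker never gets
# anywhere near that many attempts.
```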
I don't think there's much difference between a backdoor and that. A backdoor can be "just a vulnerability", and that's what the FBI is asking Apple to create - a vulnerability in its security system.
It's kind of like saying "we don't want Apple to break its AES-GCM encryption, we just want it to replace it with RC4." Or "we only want Apple to support export crypto protocols as well, so we can downgrade to them when we do our attacks".
Whether we call it a "backdoor" or "vulnerability" or "just don't make it that secure" thing, the end result is the same. The FBI wants Apple to weaken its security, and that weaker security can and will be exploited by malicious actors, too (even if you're assuming it won't be abused by the FBI and the police itself, which of course it will be).
Presumably any "backdoor" is a security hole that can be accessed on any phone. So far, this request asks that Apple remove only the part of one phone's security that 1) destroys that phone's data after 10 failed tries, and 2) slows the automated entry of passcodes into that phone. So far, this case is specific to one phone and thus not about a general backdoor.
As the case stands, unless Apple can show how compliance weakens security on other phones, I don't see how they can refuse to comply, unless it's technically impossible for them to do. If it is impossible, their refusal in this case may lead to a ruling that requires them to change iOS to comply with such requests in the future. But if they can comply now, only the security on one phone will be diminished. It's strongly in the company's interest to comply.
If Apple does not (cannot) comply, and subsequently they are ordered to change iOS to comply with future requests, would this be a "backdoor"? Yes. I don't see how Apple can add this "feature" to iOS and assure that only authorized legal authorities could activate it in the future.
Is this case about changing iOS to comply with future requests of this kind? Not yet.
Precedent is a serious issue in the U.S. While the end result of this case might not be an immediate, radical policy shift, it would certainly lay the groundwork for that in the future.
You give the devil an inch...
As others have said, this is an attempt to demonize encryption in the eyes of the public.
I truly do not know what to think of the matter, since Apple seems to enjoy a suspicious amount of support from the judicial branch. Tim's letter does seem to fly in the face of our government's desired narrative. Hopefully he gets his vehicle inspected frequently and keeps his closet free of skeletons.
If Apple creates new software that removes a part of one iPhone's security, that software can potentially be used on all iPhones. It's no different from what you said: "I don't see how Apple can add this "feature" to iOS [or in this case, create a tool to use on the current version of iOS] and assure that only authorized legal authorities could activate it in the future."
How is that different from "If I hand over this iCloud data, I'll need to create software to download iCloud data from a specific user, and that software can potentially be used on any iCloud account"? How is this not applicable to any warrant that requires technical ability to comply?
Apple doesn't make the same promises about encryption and privacy on their iCloud service that they do their iPhones, that's the only difference. If you store information in iCloud and the FBI comes asking for it, Apple will hand it over. By the very nature of iCloud, Apple has access to your data. Not so with your iPhone. Apple says your device is encrypted and they cannot access it without your consent, end of story. Forcing them to create tools that invalidate this promise on one iPhone means creating tools that can invalidate the promise on any iPhone.
Saying you'll break the law before you do it does not make it legal. "We told our customers we won't do something" is not a legal defense.
By the very nature of iPhone 5C, Apple can put whatever software they want on it with physical access.
Also, note that the phone is owned by the government. They provided it to the terrorist, who worked for the government. So there's no promise broken to customers, because it's the customer themself who's asking for this.
I'm not saying anything about the (il)legality of Apple refusing the FBI's request, or whether their iPhone promises constitute a legal defense. At this point it's neither legal nor illegal, and the courts will take up this case soon (right now, Apple is arguing that precedent is on their side. From the NYT: "In a 1977 case involving the New York Telephone Company, the Supreme Court said the government could not compel a third party that is not involved in a crime to assist law enforcement if doing so would place “unreasonable burdens” on it.").
I've only been responding to the claim that this is only about "one iPhone", as if there would be no impact on all iPhones as a result.
Apple lost in court. They may win on appeal, or not, but as of now they've lost. I think that "it may or may not be legal" is not correct.
Re one iPhone: I responded that the same could be said about any warrant. The actual order requires them only to modify one iPhone. Whether that proves they can do it for others doesn't matter. If they need to do it in this case, they need to do it whenever the government has a warrant. That's not a slippery slope, and it doesn't affect any non-warranted devices.
The legality is still in question, then. If they can still win on appeal, it's not definitively illegal.
As for the rest, I only pointed out the difference between this request and an iCloud request, which you asked about (rhetorically, I know). This iPhone request plausibly places an undue burden on Apple that the iCloud requests do not, so it's different (including from a legal standpoint). If they eventually lose and the Supreme Court says "No it's not an undue burden, now go hack the phone!" then fine, the highest court in the land will have declared your analogy sound. Right now, that's very much in dispute.
1. This is quickly getting into the philosophy of law. Every case can be struck down by either appeal or a later case overturning precedent. I think such cases should be thought of as "it was illegal, but now it has been changed by the court". So once someone's lost, we call their actions illegal until they appeal and win.
2. You said that complying would open the door to doing it for other phones. In that regard, complying with an iCloud request also opens the door to complying with future iCloud requests. The fact that it creates the ability (for Apple) to do it later has no effect on legalities.
There may be a difference on undue burdens, but that's a different point. Your point would still be invalid. "Creating new software for FBiOS is an undue burden, but creating new software to download a user's iCloud data isn't" is a different argument than "creating new software means we can use it again later". The latter is the argument you made, and it doesn't differentiate between iCloud and FBiOS.
If we're concerned the FBI will take the software and use it without a warrant, Apple was given the option to do everything on their own premises and just give the FBI the data/unlocked phone when done.
> You said that complying would open the door to doing it for other phones. In that regard, complying with an iCloud request also opens the door to complying with future iCloud requests.
There is one crucial difference. In the iCloud case, the government must ask Apple any time they want to get iCloud user data. Each time, Apple can verify that they have a court order before doing the work. So it's "stateless" in that the first request doesn't "open the door" for later requests.
In the iPhone case, the government is asking for an OS (signed by Apple) that can be flashed onto any device. This new OS would have a giant backdoor that disables all the protections of an iPhone. We all know there is no way to prevent this OS from being used elsewhere, for other uses. This is not stateless -- once Apple creates this OS, there is no going back, all phones are now insecure.
Every update to any iOS device (well, any since iPhone 3GS) requires a signature unique to that device. Since iOS 5, it also includes a nonce generated on device at the time of upgrade, so you can't even replay the signature, it needs to be signed at the time you install the new version.
This is why you can't downgrade to earlier versions of iOS after Apple stops signing them.
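For illustration, here is a simplified sketch of that per-device, per-install signing flow. The HMAC is only a stand-in for Apple's real public-key signature, and the names (ECID, nonce, ticket) are borrowed loosely from how the process is usually described, so treat this as a conceptual sketch rather than the actual protocol:

```python
import hashlib
import hmac
import os

SIGNING_KEY = os.urandom(32)  # stand-in for Apple's private signing key

def sign_ticket(ecid: str, nonce: bytes, build_hash: bytes) -> bytes:
    """Signing-server side: the approval is bound to one device and one install."""
    return hmac.new(SIGNING_KEY, ecid.encode() + nonce + build_hash,
                    hashlib.sha256).digest()

def device_accepts(ticket: bytes, ecid: str, current_nonce: bytes,
                   build_hash: bytes) -> bool:
    """Device side: a ticket captured earlier (or issued for another phone)
    fails, because the freshly generated nonce no longer matches."""
    expected = sign_ticket(ecid, current_nonce, build_hash)
    return hmac.compare_digest(ticket, expected)

nonce = os.urandom(20)
ticket = sign_ticket("ECID-1234", nonce, b"ios-build-hash")
print(device_accepts(ticket, "ECID-1234", nonce, b"ios-build-hash"))           # True
print(device_accepts(ticket, "ECID-1234", os.urandom(20), b"ios-build-hash"))  # False: new nonce, no replay
```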
But Apple is only able to do anything at all because it's an old phone without security features present in later models.
Also, any other manufacturers' phones would be trivially hacked in the same case, because the firmware can be overwritten without needing to be signed. Apple is pretty much the only one that requires signing.
I really don't understand how anyone can claim "this case is specific to this phone" like that means anything at all. Of course the case is specific to this phone. And the next case will be specific to the next phone. How could any case not be "specific" to the case at hand? It's just a tautology.
Can the government compel Apple to create new functionality to enable cracking their own phones' security? Any reasonable discussion on the topic must consider, if the answer is "yes", why the answer is yes in this case and perhaps no in any other case. What (if any) are the specific conditions under which they can be forced to assist? What is the legal precedent being set, and is there anything that makes this case special?
Is the answer "yes, Apple must assist" because the subject of the investigation is a mass murderer? As far as I know, there is no such law. As I understand it, the specific crime or facts of the case are completely irrelevant to the legal argument the FBI is presenting. There is encrypted data on a phone that they think could be useful in investigating a federal crime of some kind. Nothing more.
The legal justification is crucially important. The FBI is arguing Apple must comply as a matter of "All Writs" and if they succeed in this line of argument, then from what I understand it clearly follows that any time the government wants access to encrypted data, and has either permission of the device owner or a warrant, then they can compel Apple to assist in recovering that data.
So I'm fairly convinced this case is absolutely not specific to "one phone". At most this case is "specific" to compelling the development of custom software and firmware after-the-fact in order to bypass the existing security measures of a device in order to access otherwise encrypted data. If they are forced to comply in this case, either explain why they can't be forced to comply in the next federal case, or stop claiming that the security on only "one phone" will be diminished.
This is no better than the word wrangling the NSA does around "collecting" data. The idea that it's not a "backdoor" just because it is deployed in real-time on a per-phone basis versus being pre-deployed across all phones is balderdash. This is a distinction without a difference. How the backdoor is deployed doesn't make it any less of a backdoor to the person who has the capability. And of course iPhones clearly have the capability of deploying said backdoor over-the-air just as easily as over the lightning cable.
That the firmware image might have to be re-signed by Apple's code-signing key because it includes a hard-coded identifier for a specific device is equally irrelevant, unless you can explain why the government can't compel Apple to generate these firmware builds on-demand whenever they want them.
If for each phone to be opened, the government must first get a search warrant, and second approach Apple with phone in hand and request that only that phone be cracked... this does not constitute a backdoor. This is just due process.
If Apple does not comply, the courts might 1) reject Apple's claim they cannot open the phone and respond to that (e.g. do nothing or jail Cook for contempt), or 2) accept Apple's claim and respond to that (e.g. do nothing or demand that Apple develop a means to crack only this phone, given that model and version of iOS.)
As additional requests to crack other phones arise in future cases, this new Apple service is likely to be requested again, with new court order in hand. But adding such a capability within the services that Apple Corp. provides to law enforcement does not constitute a "backdoor", at least not in a technical sense. It becomes a backdoor only if it's accessible to others without following legal due process. That's the theory anyway.
In practice, would this create a vulnerability that could be abused outside due process? Probably yes. But what's more likely and more dangerous is that the US government would continue to say, "Fuck due process", and once the tech exists, they would hack it themselves and then invite every law enforcer down to dog catcher to circumvent due process and abuse the system, the way it has with the Stingray program among numerous others.
So yes, I agree that this would be a very bad precedent. But I think it's imprecise to call it a technical backdoor. It's more a pre-malicious legal loophole, like so many of the practices adopted by US law enforcement since 9/11.
I think we need to be very precise and consistent with the language. A "backdoor" is a coded method to allow bypassing encryption without authorization. That the backdoor may only be exploited after legal due process doesn't make it any less of a backdoor. A "backdoor" by definition includes some attempt at making it non-publicly accessible (otherwise it would be a front door!)
Today, there is a vulnerability that exists on the 5C. If Apple writes the code to enable them to exploit this vulnerability, then they have created the backdoor. Reasonable people may then disagree if the backdoor will remain secure from unlawful use, or if we can even trust the system that Apple and the Government setup together will ensure the backdoor is only used lawfully.
The biggest problem I think many people are missing is even if the backdoor is only ever used with "legal due process", the established legal precedent for getting Apple to create the backdoor would now be "All Writs". It's right there in the name of the law -- now we have a precedent that says, any time the government can show legal due process to want a backdoor created in a device which might possibly assist in some federal investigation, the manufacturer would be required to create such a backdoor for the government. Well, fuck.
It seems to me the only logical conclusion is that every electronic device you own will be backdoored to have the capability to spy on you. Your car, your phone, your security system, your thermostat, your entire Internet of Things, now all just investigative tools at the government's disposal. I mean, this is already true for all cloud services (see CALEA), but there was at least some hope it would not extend to our own personal property.
Now maybe you think Apple actually is capable of creating a backdoor while effectively controlling access. The iPhone is perhaps the most well funded attempt in history to create a truly secure consumer electronic device, and clearly they failed to create a secure 5C, and possibly even failed to fully secure the latest 6S if rumors are true that they can re-flash the secure enclave on a locked device without cycling the encryption keys. Apple ships and patches vulnerabilities just like the rest of us humans. So, I personally wouldn't trust even them. But how about every other device manufacturer on the planet?
I really hope we don't wake up in a few years in this nightmare scenario, but one thing you can be absolutely certain of is that this is exactly the game plan. It started with the Clipper chip, and the careful planning and execution over the last several years to set up this scenario and to try to win this capability is obvious. There are public statements and documented proof that certain officials have been specifically planning to leverage the next available terrorist attack towards these ends. Apple already provided an unencrypted iCloud backup of the device in question! Apple asked for the order to decrypt the device to be put under seal, and the government objected! This is the fight the government has been waiting for.
As Senator Frank Church spoke of the NSA, "I know the capacity that is there to make tyranny total in America, and we must see to it that this agency and all agencies that possess this technology operate within the law and under proper supervision, so that we never cross over that abyss. That is the abyss from which there is no return." I believe the capability for the government to compel creation of backdoors in our personal devices is a bridge into this abyss. It is a capability far too powerful to be controlled from those that believe they are doing the just and Godly work of Protecting the United States of America. In their furious pursuit of ever greater surveillance power, I believe these civil servants are not only recklessly endangering our security and privacy, but paving the road to the destruction of our civil liberties and perhaps even our democracy itself.
"The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." Not every search can be authorized by a warrant, it must also be reasonable. For the safety and security of our citizens, for the preservation of liberty, backdoors to our personal devices should be deemed prima facie unreasonable in any case.
Sadly I have very little hope the Supreme Court will lean this way, and I can only see that our security and privacy will suffer greatly as these backdoors are deployed and ultimately exploited. I have seen Pandora's box, and this is it.
Apple put themselves in that position by being the only entity able to sign software on their systems. This is a wide open backdoor by design.
If they provided iOS devices with the capability to have their own master keys, it would have been moot. The iPhone would only be as secure as the private key of its owner, and Apple could not do anything to help the FBI.
I'm worried that the FBI could respond to the latter with "Well, that just means that you're not allowed to build something like that."
That, personally, is far more terrifying than Apple giving the FBI one phone. Because it's a very short road from "You're not allowed to build completely unbreakable security" to "You're not allowed to build anything that could hinder FBI investigation efforts and must build a backdoor."
The Supreme Court is going to have to hear this stuff, and I'm doubtful that the rulings are going to be friendly.
Maybe there's a different piece of legal justification elsewhere, but in this particular case it rests on the "All Writs" act. That act does not say that you cannot create a system that the government can't get into. It says that courts are allowed to compel you to help the government in a legal investigation. The extent to which that act allows a judge to compel Apple to assist is what is in question.
IANAL, but I suspect it would be a much larger jump to try to use this statute to ban the creation of an encryption system (as opposed to helping to defeat one). If they wanted to do that they would almost certainly have to go through congress or find a different statute.
Also, the court isn't asking, it's demanding, and it's demanding that Apple create a vulnerability in a system designed to thwart such vulnerabilities. I'm not sure it's even possible.
What if Apple didn't fight this yet failed to create a working vulnerability? After all, the phone has protections against its firmware being replaced without the passcode!
If Apple were to fail, would they be held in contempt of court?
This is why I have contempt for our courts-- way too many judges who are never punished for their tyranny.
Another issue, tangential and not widely discussed, is the very fact that a court, at a federal agency's behest, is ordering a private company to do highly technical and difficult work, at its own expense. They have not demonstrated that Apple committed a crime, and yet they demand that Apple set up an internal project and commit employees and resources to, essentially, do the FBI's work for it.
There has to be a violation of the Constitution in there somewhere. The government cannot compel private companies or individuals to surrender private property (in this case, intellectual property), it cannot restrict freedom of speech (in this case, software is an expression of speech), and perhaps there is also a tie-in to the Commerce Clause.
In other words, at a certain point the U.S. government's power should be and must be limited. Unlimited power is dangerous and surely would violate the vision and foundational philosophy behind the Constitution. In this case, unlimited power means that a law enforcement agency can justify nearly any kind of forcible action with the vague reasons of "national security" or "criminal justice".
I'm curious what Apple would charge for this "service". If I were CEO, I'd request ten billion dollars. A million or so for the time and manhours, and $9.999 billion for the damage to the company's reputation, stock price, etc., that this will cause -- breaking of a promise that "even Apple can't retrieve your data". Suppose Samsung, a non-American company, jumps on this and says, their phones are truly non-crackable and Samsung would not be able to do it, even if they installed a hacked OS to work around the login failure limit. Boom. Apple has just lost millions of sales. Our heavy handed government hard at work, damaging America's best companies.
> Knake said that the Justice Department’s narrowly crafted request shows both that FBI technical experts possess a deep understanding of the way Apple’s security systems work and that they have identified potential vulnerabilities that can provide access to data the company has previously said it can’t get.
I assume the actual request is more technical than this, because the overview they gave here just describes the things you would want to do if you knew nothing about the encryption and wanted to brute-force it: reduce password-attempt timeouts, allow automating the password attempts, and don't melt down after too many failures.
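For concreteness, here is a purely hypothetical sketch of those three behaviours as code. This is not Apple's implementation, and the delay schedule and helper names are made up; it only illustrates the kind of checks a custom signed build could skip:

```python
import time

failed_attempts = 0
ESCALATING_DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # assumed schedule (seconds)

def check_against_secure_enclave(passcode: str) -> bool:
    return passcode == "1234"          # stand-in for the real key derivation

def wipe_encryption_keys() -> None:
    print("file-system key discarded -- device effectively erased")  # stand-in

def try_passcode(passcode: str, submitted_electronically: bool) -> bool:
    global failed_attempts
    if submitted_electronically:                 # (2) FBI wants electronic submission allowed
        return False                             #     (today: manual, on-screen entry only)
    delay = ESCALATING_DELAYS.get(failed_attempts)
    if delay:                                    # (3) FBI wants the escalating delays removed
        time.sleep(delay)
    if check_against_secure_enclave(passcode):
        failed_attempts = 0
        return True
    failed_attempts += 1
    if failed_attempts >= 10:                    # (1) FBI wants the 10-failure auto-erase disabled
        wipe_encryption_keys()
    return False
```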
That's why a lot of people (myself included) are so cynical about the request. We all know the intelligence community has experience with side channel attacks, decapping processors, etc. But they've apparently decided not to use them in this case, "in the interest of time."
If they really wanted that data, they could get it with their current capabilities. They don't really want that data; they want the legal precedent to compel companies to subvert their own security mechanisms, and they want to intimidate one of their harshest critics (Cook). That's part of their broader strategy: first establish that you can be compelled to break security you built. The next logical question is whether they can compel you to not build security mechanisms you cannot break in the first place. That's the legal question the FBI really wants to ask.
Another option the FBI has is to issue a search warrant for Apple's code signing key. Then they could flash the device themselves. That's what they did to Lavabit and it has a lot more legal standing than this All Writs Act thing.
I do not understand why no one is talking about this possibility. This is a much more terrifying option, the precedent already exists (lavabit), and it would enable a much broader application of different versions of FBiOS.
If the FBI were to do this, would it be possible for Apple to counter it in the future by splitting the key across multiple entities, some in non-US jurisdictions, in such a manner that all of them would have to agree to sign anything?
It's kind of an interesting idea. Doesn't seem like it would work for RSA cryptography, though -- in that model, the private key is derived from a pair of prime numbers, and the public key includes their product (the modulus). It's hard to produce both factors from the product, but it's easy to produce one factor if you know the product and the other factor.
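A toy example of why splitting the key by handing each party one prime factor wouldn't help, assuming the textbook RSA setup described above (the numbers are small, well-known primes chosen for illustration only):

```python
# Given the public modulus and either prime factor, the other factor
# falls out by simple division, so "one share per party" hides nothing.
p, q = 65537, 2147483647     # small known primes standing in for real key factors
n = p * q                    # public modulus everyone can see

recovered_q = n // p         # one leaked share recovers the other instantly
assert n % p == 0 and recovered_q == q
print(recovered_q)

# Schemes that genuinely split signing authority (e.g. threshold
# signatures) split the private exponent or use different math entirely,
# rather than handing out the factors.
```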
On the other side, the risk to Apple is Congress deciding the answer to the question posed by the FBI. There are members of Congress who believe service providers should comply with court orders regarding access to encrypted information, and further, the public by and large subscribes to that view. Now, Apple's business proposition, and the position of many technologists, though not all, is that Apple shouldn't be compelled to comply because it will weaken privacy for all, not just that of the subject of a criminal investigation. However, long term, there is reason to believe that if not the USG, other large markets will ask Apple to comply in order to have access to those markets. As noted before, India[1] as well as some Middle East countries forced BBM to comply as a condition of doing business.
In the latest Ctrl-Walt-Delete, Nilay (an ex-lawyer) makes a great and chilling point: it's entirely possible that the FBI can easily get into the phone with the help of the NSA, but they're choosing to make a public request in this case to set precedent, since so many facts are on their side. Terrorism, a simple request, giving Apple full control, etc.
Not just in the San Bernardino case. There are a lot of perfectly reasonable cases where you might need to get into a locked phone that do not involve terrorism.
Examples include investigating the murder of the phone's owner, accessing financial information of a parent who has passed away, or recovering corporate passwords from the phone of an ex-employee who cannot be located or refuses to give them up.
It shouldn't be in their favor, since even the most basic of background checks would have shown her to be a raving lunatic who hated Americans and was highly supportive of terrorists. They failed us through utterly gross negligence, and somehow the public is on their side.
I actually think that either outcome (win or lose) has a lot of benefits for the government. If they win, they have the precedent and legal access, so it makes their lives easier.
If they lose and the media circus leads people to strongly believe that the iPhone is 100% secure while it really isn't - regaining the trust that was lost due to Snowden - this is really useful from a surveillance point of view, for blocking access by foreign countries, and for helping iPhone usage grow, especially among "important" users, be they legal or illegal.
But just looking at what seems more valuable for the government, the second option seems far more valuable. So I wonder, why does everybody automatically accept the first motive?
> If they lose and the media circus leads people to strongly believe that the iPhone is 100% secure
I don't think this is it. IMO if they lose they will ask Congress for a law to require backdoors, because they have already shown that not having such a law helps terrorists.
It's not just about phone adoption - it's about the total pattern.
If you don't trust the iPhone, maybe, for your important communications, you'll only use Qubes + Whonix + Tor? Write your own software in addition (as a rich terrorist org)? Use a machine not connected to the internet and pass files around? Be more careful in what you communicate over the phone? Etc.
If they really wanted the data, they could turn to any one of the countless Beltway bandits that have sprung up in response to the massive influx of contracting dollars the various three letter agencies have been pumping into the DC/NoVA/Ft Meade areas.
This has nothing to do with capability, and everything to do with precedent
I'm not sure how that would change things. If the phone in question had been a BlackBerry, wouldn't the FBI be making the same demands of that company?
Also, RIM actually folds to these types of requests.
"RIM's carefully worded statements about BlackBerry security are designed to make their customers feel better, while giving the company ample room to screw them" - Bruce Schneier
> The U.A.E. wanted RIM to locate servers in the country, where it had legal jurisdiction over them; RIM had offered access to the data of 3,000 clients instead, the person said.
If it has the access to offer then it seems to support the point.
Particularly:
> RIM respects both the regulatory requirements of government and the security and privacy needs of corporations and consumers. While RIM does not disclose confidential regulatory discussions that take place with any government, RIM assures its customers that it is committed to continue delivering highly secure and innovative products that satisfy the needs of both customers and governments.
What happens when those needs are diametrically opposed?
Well, this article seemed to suggest the FBI might know of vulnerabilities in the iPhone, which would presumably not be on the BlackBerry. Although the BlackBerry could conceivably have its own vulnerabilities too.
The Clipper chip initiative[1] from the Clinton era completely failed, for two reasons: one, the technology was proven to be flawed, and two, privacy advocates shot it down.
It seems as though all the debates and analysis on this topic have already occurred. Yet, here we are again: a law enforcement agency demanding special privileged access to privately owned consumer electronics because it might contain useful crime fighting information.
It seems to me that the U.S. needs to have a broader discussion about what levels of government surveillance and intrusiveness into private lives we are comfortable with.
The outside threat of terrorism is now the club being wielded to force the issue, but is there really any evidence that this type of increased access helps? We had the Boston Marathon attack, in which two brothers immigrated from Chechnya, a known breeding ground for some of the most brutal terrorists in the world, the Russians actually phoned to warn us about them, and nothing was done.
Similarly, there was chatter in 2000-2001 about an attack involving passenger jets, reported by Israeli and German intelligence agencies. Yet, nothing was done. One would have thought it common sense to scrutinize foreign nationals, especially from Muslim countries with a lot of hostility toward the U.S. among the populace, who were involved in aviation. Reportedly, the Israelis even were monitoring a couple of the 9/11 hijackers in the U.S. at one point.
Should we not be streamlining our intelligence bureaucracies to avoid another Marathon fiasco, before sacrificing what little remains of our privacy on the altar of national security?
> It seems as though all the debates and analysis on this topic have already occurred.
I don't know if I buy that framing of it. We've already had this debate: hundreds of years ago. We decided that the government can't search your stuff at random, but if it gets a warrant, it can, and is entitled to reasonable cooperation in doing so (breaking locks, drilling into safes, breaking open safe deposit boxes, etc).
I think you can just as easily say that privacy advocates are the ones trying to reignite a settled debate. They want phones treated differently than how other property is treated. This is privacy advocates trying to shift the Overton window: where a search pursuant to a valid court order somehow becomes a privacy violation.
The difference in all of your examples is that the government actually cannot compel assistance with the things you described. The government itself can break locks, drill safes, break open deposit boxes, of course. But to date, they haven't tried to call Brinks and force them to come explain how to break into a safe.
If I were able to somehow develop a file cabinet that was impenetrable, the government would not be able to come compel me to break it open for them. They are allowed to try to break it open, but they cannot compel me to help them.
(CALEA requires that systems must be built so that the government can snoop. Cell phones aren't covered by CALEA (the network is)).
The power of the courts to compel cooperation in civil and criminal investigations predates the Constitution. Courts routinely order banks to drill into safe deposit boxes, for example. They can also force your accountant to hand over records about you and testify against you. The limiting principle is whether the cooperation imposes an "unreasonable burden" on the third party.
The key differences now are that our devices' activities are more akin to sending a letter through the legally protected US mail, rather than disclosing secrets to untrusted third parties like cell carriers (no reasonable person, upon being shown a text messaging app, would think they were texting their carrier and not their friend). Thus, we expect a much higher standard of proof and accountability for accessing that information than has been applied of late.
Second, our devices are becoming effectively cybernetic extensions of our minds, and this cognitive intimacy definitely warrants a new discussion of boundaries that is not burdened by awkward analogies to historic traditions of the courts.
Finally, by whatever basis you want to argue (multiple amendments in the bill of rights, etc.), we should not be forced to deny the effectiveness of mathematics. We should have the right to use whatever algorithmic and physical security mechanisms we desire to guard ourselves against hackers, thieves, and foreign spies. If that slows the local spies down too, they'll just have to find some other way to do their jobs, and reconsider whether the scope and scale of their activities was appropriate to begin with.
> Second, our devices are becoming effectively cybernetic extensions of our minds, and this cognitive intimacy definitely warrants a new discussion of boundaries that is not burdened by awkward analogies to historic traditions of the courts.
I guess I just don't believe this.[1] Courts have always been able to compel say your friend or girlfriend to reveal your darkest secrets--stuff that you'd never even write down--but your text messages should be treated differently? It's not like cognitive intimacy didn't exist before electronic devices.
[1] As far as sci-fi philosophy goes, my go-to is WWHIS (What Would Happen in Star Trek). You think the Federation can't compel someone to assist in decrypting a computer?
Don't mix the separate points I was making. Text messages are like mail, other device use is like thought, imagination, or memory. I also notice you didn't mention spouses, which are treated differently from friends, and did not address the third point about encryption.
I'm also not making an argument from sci-fi, I'm talking about reality today. Star Trek (especially TNG) has a lot of great morality plays and valuable lessons can be learned from it, but the vast majority of the action we see takes place on a military-like vessel and can't inform us about how to structure a total population. But this is all beside the point that we are experiencing the melding of our minds and our devices right now.
> other device use is like thought, imagination, or memory
And I'm saying I don't buy that. My phone isn't an "extension of my brain" (that's what I'm calling sci-fi philosophy). It's a replacement for my phone, calendar/notebook, and camera, which are all things that have been subject to search with a warrant.
> I also notice you didn't mention spouses, which are treated differently from friends
But not for reasons relevant to your argument. Spousal privilege exists not to protect privacy but to protect marriage. The rationale is that you shouldn't turn spouses on each other. It dates to a time when spouses were considered the same legal person.
I didn't address your point about encryption because I don't disagree with it. But banning encryption isn't directly at issue here.
Star Trek is always relevant. It imagines a future society where people achieve prosperity through a very powerful, sometimes fallible, but basically benevolent government. It's an ode to the righteous power of Institutions. It's a compelling alternative to the anarchic leanings of much of modern sci-fi.
And I'm saying I don't buy that. My phone isn't an "extension of my brain" (that's what I'm calling sci-fi philosophy).
It may not be true for you, but you don't have to buy it for it to be compelling to others. I'm saying that when I and presumably many others use technology, it's not perceived as an external device, but rather a direct extension of our minds and senses. There's no skeuomorphic substitution taking place in our minds.
It's very vaguely analogous to (though much stronger than) a car enthusiast saying that they feel one with the car. A similar concept applies to many other tools humans use as well; people may say "ow!" and grimace in pain when they damage their tools, even though their own flesh wasn't physically harmed.
I use those analogies to suggest that a tool used not by the body but by the mind can have an even stronger integration with the core identity of a person, a connection that must be taken into account by any laws seeking to regulate such tools.
Regarding spouses, I'd make the argument that in a society that doesn't treat a marriage as a single person, privacy is one of the necessary substitutes that society needs to maintain its sanity and achieve the same end of building stable, productive households. So the historic rationale for spousal secrecy should not prevent us from extending the concept to other interactions, perhaps even with non-human parties (like our devices).
I'm glad we can agree about the banning of encryption. Maybe it's not the direct issue in the Apple case we are discussing, but it is a part of the rhetoric being used by (parts of) the government, and does relate to forcing a company to make it easy to break a device's encryption.
---
Star Trek is always relevant. It imagines a future society where people achieve prosperity through a very powerful, sometimes fallible, but basically benevolent government. It's an ode to the righteous power of Institutions. It's a compelling alternative to the anarchic leanings of much of modern sci-fi.
It also only works in a post-scarcity, free energy environment. There's often a recurring theme of what happens when a government is corrupt or is infiltrated by outside powers (represented for the sake of fiction as brain-eating parasites). And I'd say Kirk at least has a pretty strong anarchic leaning :-). But it's mostly entertainment, and doesn't provide a blueprint for achieving such a society. I really like Star Trek, but I think in this conversation it's more distracting than it is enlightening.
> But it's mostly entertainment, and doesn't provide a blueprint for achieving such a society.
I think it does. How did the society become post-scarcity and free energy? Teams of scientists working in Federation research labs, Federation programs that make the fruits of that technology available to everyone.
Bear in mind the technological basis for its post-scarcity society (cheap and efficient conversion of matter into energy and back) is impossible in our universe. It may not even be post-scarcity. We're shown the lives of the elite, but there are Federation colonies that practice traditional agriculture, mining colonies, trade in goods, starvation and chaos due to technological breakdown and sometimes a massacre or two. Certainly the Original Series didn't seem post scarcity, merely advanced (from the point of view of the 1960s.)
And as far as privacy goes, it probably depends on the series, but in the most Utopian ideal of the Federation, would anyone even use encryption? Surely the desire for secrecy would be considered a form of atavism, something humans would have evolved beyond?
But peering into your cybernetic mind extension would be more akin to compelling you to testify against yourself, which is constitutionally prohibited.
Hi, who did you vote for in the previous US presidential election? Primary? To what god[s] do you pray?
According to you, you see no problem in being compelled by court order in revealing this information. Because terrorism.
I don't believe that any government has the right to compel you to reveal your heart or private thoughts.
Another commenter said that you believe that torture can be justified - so maybe you really do believe that you can be compelled to finally acknowledge that 2 + 2 = 5, and that this is an acceptable outcome.
Your profile indicates that you're a lawyer - so if the court orders you to turn over confidential client files, you have no problem with that?
But that's only "reasonable" to ask of a bank because banks are in the business of renting out time-limited vault space and thus evicting customers. The customer knows the bank is hard and crunchy on the outside, but sees that once inside the vault their safe-deposit box is only behind sheet-metal walls.
It's not only physically fairly easy, but the bank has the equipment on-site and does the lock-drilling routinely.
Apple is not in the business of cracking hardware security modules. And imagine if they make a mistake under intense media scrutiny and the data is lost forever - they'd take the heat and their reputation would be tarnished forever. It's absolutely unreasonable to ask Apple to take such risk, at great cost, in an area they aren't skilled in or practiced at.
> This is privacy advocates trying to shift the Overton window: where a search pursuant to a valid court order somehow becomes a privacy violation.
If one were to characterize a side's entire message as being a rhetorical trick, I think it would be as easy to say almost exactly the same thing in reverse. You're suggesting that the government can compel pretty much anyone to do anything already so we might as well just codify it and move along.
Privacy advocates are thinking of privacy, not the law. By that I mean, there's a certain amount of liberty people are willing to entrust to the government in trade for protection but many people see the cost to their liberties going up without any protection on the table. Monitoring someone used to take a 24/7 staff and now it's the default setting. There's simply, factually, less privacy.
You keep talking of the delicate and time-tested balance, etc. But consider this - (valid) government rules at the will of the people. The people don't think in terms of law, they think in terms of their needs and wants. Privacy is a dial - safety is a dial - government surveillance is a consequence.
All precedent bows to the people, and to deny the will of the people because the constitution, statute, and precedent (may) not yet be up to date with that will entirely misses the point of law and government. It doesn't matter if government has been doing something for 1500 years - if it's not providing a value to the people, it's not a power the people need to grant.
There's absolutely no reason not to walk the government's powers back if it can't justify needing them.
> It's absolutely unreasonable to ask Apple to take such risk, at great cost, in an area they aren't skilled in or practiced at.
We're not talking about some sort of sophisticated crack here. We're talking about commenting out a couple of lines of code and recompiling the OS. The task Apple is being asked to do is technically very very simple.
We haven't had this exact same debate. The founders of our country didn't really envision us carrying our most personal details in our pockets. This is a new conversation, and it should be treated accordingly.
> ...a law enforcement agency demanding special privileged access to privately owned consumer electronics because it might contain useful crime fighting information.
I believe the device is actually owned by the (cooperative) employer, the San Bernardino County Department of Public Health. The county is happy to provide the device and its contents, but Syed (or county policy) locked it down.
> It seems to me that the U.S. needs to have a broader discussion about what levels of government surveillance and intrusiveness into private lives we are comfortable with.
Actually I don't really want to have that discussion, both because a lot of people who would be involved in the discussion probably don't respect privacy, and because the outcome of the discussion is moot anyway: privacy is paramount, regardless of what "most people" think.
Since the enemies of privacy have the entire resources of the state at their disposal - and I'm including the media in that - and since the average person has not proved up to the task of defending his own privacy, it seems that the only way we're going to have it is by making it a technological inevitability.
While we're at it, I'll assert the same of liberty in general. Contemporary nation-states have transitioned from simmering hostility to liberty and civil rights, to open and sneering contempt.
I wonder if these "national security" people sit around longing for the next non-white person terrorist attack in order to spring their plans into action.
EDIT for the downvoters, my point about non-white people is that terrorist attacks by white people, such as all the mass shootings, don't seem to trigger the grand plans that these national security types like to execute.
The security/intelligence establishment were just waiting for an opportunity to put them into action and that act of terrorism provided all the justification.
They are gross opportunists of the most obscene order.
They spend the entire time pushing their plans into execution (that's in plain sight). And when a terrorist attack happens, they probably just commemorate, adjust tactics, and follow on.
That's a hair's breadth away from the lunatic fringe. It's a simple logical leap to the conspiracy theory that the shootings are false-flag operations put in place by the powers that be to initiate legislation on encryption.
"My guess is you could spend a few million dollars and get a capability against Android, spend a little more and get a capability against the iPhone. For under $10 million, you might have capabilities that will work across the board". Go ahead, good luck - Apple
That's not an improvement. If Apple is necessarily in the loop, at least they have a chance to fight it in court. If the FBI can do it themselves, that's one less procedural speedbump.
Further, if Apple is necessarily in the loop Apple won't close whatever holes the government tools utilize. If Apple isn't involved there's a chance they'd close the holes even if by accident.
This may be a stupid question and obviously Apple would never do it, but if Apple decided they wanted to ignore these requests, and stopped selling Apple devices in the US until the government backed down, do you think public opinion would force the government to comply? Just wondering how much power such a huge company has.
This is way more in the public's view compared to when Qwest went through this, though. I don't believe the government has the balls to jail Tim Cook, CEO of one of the world's most valuable companies.
Indeed, I think any pressure on Cook will come in the form of sanctions against the company itself, forcing the board's hand into firing Cook or convincing him to back down on his stance. Perhaps a deeper investigation into offshore tax havens by the IRS than other companies are currently under? The IRS is a powerful weapon of intimidation in the government's arsenal, and has been wielded many times in the past.
Given that they're a US corporation, ignoring the requests could result in Tim Cook (and others) in handcuffs. Ignoring the government is not a safe option.
That's where it would start. Then we'd see if public opinion was enough to rescue them. And I doubt that public opinion would be strongly enough in favor of Apple to save them. There's a lot of people who love Apple, true; but there are a lot of people who love national security, too. (And yes, I am aware that there is not an inherent contradiction between Apple's stance and national security, but I'm not sure that enough people understand that to make public opinion come down hard on Apple's side.)
And his case feels like a carbon copy of Lavabit, where he couldn't defend himself (literally!!!)
But Nacchio was prevented from bringing up any of this defense during his jury trial — the evidence needed to support it was deemed classified and the judge in his case refused his requests to use it.
You aren't likely to ever get an open source baseband, so first off you're going to have to have a platform with very clean separation and little or no DMA.
Then it's going to need a hardware key separation something like what apple is doing with the enclave approach.
My hope has been that we could pressure the government to open up more radio bands for public use, then create an open source baseband that communicates on those channels. This would have the side benefit of eliminating our reliance on the large carriers for our communications.
There is some spectrum in the 600MHz range the government has been talking about repurposing, but frustratingly they seem more interested in giving that to corporations than letting the public use it.
Then you can engineer the over the air protocol to be anonymous. Tower operators could be paid by bitcoin.
A truly secure cell phone network is possible. The problem is that certain people don't want us to have secure systems, and we're somewhat reliant on the government to afford us that opportunity.
I would imagine that if you focused on how it would help out poor Americans by offering them something cheaper, you'd get more traction than if you focused on privacy.
The problem with an open source baseband isn't so much the availability of bandwidth (although that is an issue) but maintaining control of TX and RX power and shape. You'll never get regulatory approval for a device that is both user modifiable and capable of incorrect signalling (at least, not without some other controls, see e.g. HAM licensing)
No need. The only thing Apple must do is change the iOS key management.
1. Once you buy the phone, you (via iTunes) create an RSA key pair. Put the public key in the phone. That key is fixed, and the bootloader uses it to verify loaded updates.
2. iOS updates come to you signed by Apple; you must re-sign them with your iTunes key before they can be loaded.
So you obtain the ability to sign your own software on your own device.
In that case, no amount of Apple assistance can help the FBI unless they obtain your private key.
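A rough sketch of that flow, with HMACs standing in for the real RSA signatures (apple_key and owner_key are made-up names, and the real scheme would verify with public keys rather than shared secrets):

```python
import hashlib
import hmac
import os

apple_key = os.urandom(32)   # Apple's signing secret (stand-in)
owner_key = os.urandom(32)   # created at activation, known only to the owner

def sign(key: bytes, blob: bytes) -> bytes:
    return hmac.new(key, blob, hashlib.sha256).digest()

def bootloader_accepts(update: bytes, apple_sig: bytes, owner_sig: bytes) -> bool:
    # The phone trusts BOTH keys: an update loads only if Apple built it
    # AND the owner re-signed it, so Apple alone cannot push new firmware.
    return (hmac.compare_digest(apple_sig, sign(apple_key, update)) and
            hmac.compare_digest(owner_sig, sign(owner_key, update)))

update = b"ios-update-image"
print(bootloader_accepts(update, sign(apple_key, update), sign(owner_key, update)))  # True
print(bootloader_accepts(update, sign(apple_key, update), os.urandom(32)))           # False: no owner consent
```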
That would require Apple to give up total control of the software ecosystem, which they're not willing to do. iOS is their kingdom. This is another factor in play here. Apple could make devices "yours" and therefore un-hackable in this way, but to do that they'd have to surrender their own master key. Apple wants to keep their master key but not hand it over to the feds since they want a market reputation as the Swiss Bank of device vendors, so they have to go to the mat on this. If they lose either they lose the security/privacy crown or they lose the app store walled garden.
There are IMHO two reasons for Apple's platform fascism. The more self-interested one is that they want app store profits. The second more user-focused one is that they really want to keep iOS from turning into the shitware and malware disaster that Windows has become. On one hand Apple's walled garden keeps out certain forms of innovation and lets them dictate the terms, but on the other hand it lets them exclude trash like Superfish or Comodo's "security" junk.
If you've lost that key, you can't get in anyway. That's what a factory-reset update (which would not include the old decryption key) is for. That scenario - updating with removal of the previous configuration - is pretty much baked in for both the case you mentioned and the reselling/refurbishing circumstances.
It's very difficult to bootstrap something like that and also make it updatable. If the key can be changed, then some software X can update it, and that software itself must verify the key. You have X updating Y and Y being ultimately responsible for updating X. Not so simple.
Given that nothing like that is out there as far as I know, you'd need a more specified model to prove it's even possible. It's not obvious from the proposal above.
- Loader A stores a (PubKey, NextLoader) pair and has the ability to blank (via actual blanking or deletion of an encryption key) the entire device.
- Loader A provides a new-key(PubKey, NextLoader) method which blanks the remainder of the device.
- Loader A also provides an update-loader(NextLoader) method that doesn't blank the device, but also doesn't update the PubKey. Before accepting the new NextLoader, it verifies it against the stored pub key.
Could also allow things like updating the PubKey if the update is signed by the previous PubKey.
The spec presumes the only access is via the loader A API; the PubKey would really need to be stored somewhere safe (HSM, TPM, etc.) to discourage direct hardware access.
Probably could also use the PubKey to encrypt the NextLoader. (A rough sketch of this loader follows below.)
New phones ship either:
- without a pubkey or next-loader, and require initial provisioning to do anything (fairly inconvenient)
- ship with a flag set that prevents updating the loader, i.e., requires updating the key (less secure; probably need to specify further how this occurs so the security hazard from un-updated phones isn't too great).
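A minimal sketch of that loader model, with a fake verify() standing in for a real signature check and "blanking" modelled as dropping the user-data dictionary; names and structure are illustrative only:

```python
def verify(pub_key: str, blob: bytes, signature: str) -> bool:
    return signature == f"{pub_key}:{hash(blob)}"   # stand-in for a real signature check

def make_signature(pub_key: str, blob: bytes) -> str:
    return f"{pub_key}:{hash(blob)}"                # pretend the key holder signed this

class LoaderA:
    def __init__(self, pub_key: str, next_loader: bytes):
        self.pub_key = pub_key          # loader A itself is never modified in this model
        self.next_loader = next_loader
        self.user_data = {"photos": "..."}

    def new_key(self, pub_key: str, next_loader: bytes) -> None:
        """Anyone can install a new trusted key -- at the cost of a full wipe."""
        self.user_data = {}             # blank the device (or drop its disk encryption key)
        self.pub_key = pub_key
        self.next_loader = next_loader

    def update_loader(self, next_loader: bytes, signature: str) -> bool:
        """Update the next stage without wiping, but only if it is signed
        by the key this device already trusts."""
        if not verify(self.pub_key, next_loader, signature):
            return False
        self.next_loader = next_loader
        return True

phone = LoaderA("owner-pubkey", b"stage2-v1")
ok = phone.update_loader(b"stage2-v2", make_signature("owner-pubkey", b"stage2-v2"))
print(ok, bool(phone.user_data))        # True True  -- updated, data kept
phone.new_key("fbi-pubkey", b"fbios")   # always possible, but the data is gone
print(bool(phone.user_data))            # False
```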
It doesn't. The model I presented presumes that it is unchangeable.
> anyone with physical access for five minutes can permanently brick the phone, without opening it
Pick one:
- you can always get your phone working, even if you forget your key & only you can apply updates to any software on the phone. Anyone can replace the loader (but this wipes all other data).
- you can always get your phone working, even if you forget your key & only a third party can provide new versions of the loader.
- if you forget your key, your phone is permanently bricked.
> Also, how can loader A modify itself?
It can't.
In general though, it's fairly straightforward to have code copy itself into RAM and run from there while overwriting its source. The problem is that this opens up the potential to brick the phone (just like any method that allows updating the loader).
To avoid bricking in all cases, one _must_ assume that there is some un-replaceable software (or hardware mechanism to start software).
Lots of things become possible once one is willing to decap ICs to get at the internals.
I'd expect security-conscious parts (i.e., all of the theoretical "loader A") would need to run in SRAM (i.e., RAM that is in the same IC and thus harder to get at than external DRAM chips) or use some other mechanism.
At that point, it becomes a question of physical hardening within the ICs. Some manufacturers have done things like put metal layers over fuses (to prevent them from being changed); I'd imagine the same could be done (at some cost) for a larger area of the chip. I'd imagine HSMs (hardware security modules) and TPMs (though these aren't as good) probably implement some of that. There also exist some chips targeted towards security purposes (not aware of any processors off hand) that could be used.
Assuming this was real, Apple would store the private key in Keychain[1]. Keychain is generally encrypted with your login password, and can have an ACL to only allow iTunes.app as an allowed application without further user prompting.
> You don't see them going after servers, because the servers are far more vulnerable.
Really? Every kind of server? I'm sorry, but that's a ridiculous statement.
A device running proprietary software that governments have physical access to (after it was confiscated) is less vulnerable than (possibly) your own device running open source software that nobody except you has physical access to?
Governments have seized servers, governments have hacked servers, governments have had colocation companies provide them with access to servers, and governments have gotten warrants to enter a private residence, fuck with the computer, and then leave without the target ever knowing. The NSA has probably gotten backdoors/vulnerabilities built into server chipsets/motherboards.
You are naive if you think any device is secure against a determined and well funded actor.
We are talking about relative security here. Do you really think that keeping stuff on your phone is more secure than keeping it elsewhere?
Let me remind you that, if I'm not mistaken, in the US law enforcement can search your phone without any sort of warrant. Let's assume that you keep your stuff just on your own PC at home; then that would at least require a warrant.
> Do you really think that keeping stuff on your phone is more secure than keeping it elsewhere?
I think it certainly could be and probably is. Apple seems to be taking phone security pretty darn seriously with combined hardware/software approaches.
Furthermore, my servers sit in a datacenter or my apartment that I rely on somebody else to look after. Really though, you could probably bribe/threaten/warrant your way into those places easily enough and just pull RAM/HDDs to your heart's content. On the flip side, my phone stays in my pocket or next to my bed. If you want my phone, you have to arrest me/steal it from me/whatever.
> US law enforcement can search your phone without any sort of warrant
During a lawful arrest, and the search has to be documented and relevant to the arrest. Furthermore, that assumes that they know my password (which I'm not currently obligated to give to them).
The statement "servers are more vulnerable [than phones]" doesn't mean "every server is more vulnerable than every phone". It's a more general point: trusting any given server is a greater risk than trusting any given phone. The fact that you can harden a server against attacks doesn't mean your data is safer stored in the cloud, because, on the whole, people don't do much more than the minimum. Phone manufacturers, by contrast, do more than the minimum.
Nope. And in any case that's one area in which "security through obscurity" can be useful. Presumably you have your phone on your person, but the fact that you have a server somewhere has to be determined.
And if we are being completely paranoid, then you can have some form of Dead man's switch or "self-destruct" option. You have a right to make a phone call, right?
Not at all. That particular project uses https://github.com/GoogleChrome/chrome-app-samples/tree/mast... in Chrome. It is a server, and it runs in Chrome. I've made an extension that works as an API so that external clients can connect to it to access the DOM and change parts of a webpage.
It'd be stupid, but there's no reason why you couldn't use a system like that, running in Chrome, listening on an IP address, to do pretty much anything a "real" server does. The user wouldn't know. It's just a server, or "the cloud".
It depends on what kind of data it is, but sure, Google DC is fine, as long as you use software/crypto that is considered to be secure and encryption keys are truly yours.
But I'd say to look at how people hiding their money from their governments are doing it. Or look at Snowden. Let the law help you, even if it is the law of another country.
That's not enough if the machine can be tampered with and your data or keys copied while in transit or use. You can hide as long as the government doesn't know who you are. When they do, it's game over for you.
Just in case I'm missing it, the story is that there's a National Security Council "Decision Memo" defining a strategy, but that memo has not been leaked?
Simple question: what prevents the FBI from removing the components from the phone and using software they themselves wrote to drive the hardware crypto and decode the data they want?
It can't be that difficult, if you have FBI-class resources and some help from the NSA, to lift the components and make them work on a copy of the encrypted data.
If I understood it correctly, the UID is still not directly readable even though the actual computation happens in the same CPU, not inside a secondary secure environment.
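Right; per the publicly documented design, the passcode-derived key is tangled with a device-unique UID fused into the silicon, so offline brute force against a copy of the flash has nothing to grind against. A toy illustration in C, with a deliberately meaningless stand-in for the KDF (nothing here is Apple's actual scheme):

    /* Toy illustration: if the wrapping key depends on both the PIN and a UID
     * that never leaves the chip, an attacker with only the encrypted data
     * cannot brute-force the four-digit PIN offline. */
    #include <stdint.h>
    #include <stdio.h>

    static uint64_t toy_kdf(uint32_t pin, uint64_t uid)
    {
        uint64_t k = uid ^ ((uint64_t)pin * 0x9E3779B97F4A7C15ULL);
        for (int i = 0; i < 1000; i++)           /* stand-in for an expensive KDF */
            k = (k ^ (k >> 31)) * 0xD6E8FEB86659FD93ULL;
        return k;
    }

    int main(void)
    {
        const uint64_t device_uid = 0x0123456789ABCDEFULL; /* fused in hardware, never exported */
        const uint32_t real_pin   = 4821;
        const uint64_t target_key = toy_kdf(real_pin, device_uid);

        /* Brute forcing the PIN only works because we also hold the UID. */
        for (uint32_t pin = 0; pin <= 9999; pin++) {
            if (toy_kdf(pin, device_uid) == target_key) {
                printf("PIN recovered: %04u\n", (unsigned)pin);
                break;
            }
        }
        return 0;
    }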
I think this issue is important and I hope Apple prevails over the FBI. However, I'm also left feeling that they're subject to this request only because of what amounts to a security flaw in their own devices.
How/when can I run a phone OS that simply isn't subject to such known flaws and corporate manipulation? What are my options?
Sigh, I guess it's time to enable Click-to-play again[1]. I wish there were a way to just automatically pause on load, without needing to completely disable flash.
Firefox supports loading plugins like flash on demand on the Addons settings page. Additionally, Firefox has a setting "media.autoplay.enabled" to prevent HTML5 media from playing automatically. However, some websites assume autoplay succeeded and behave wrongly. For example, YouTube's paused/play button state is backwards.
My Facebook feed is full of people ragging on Trump for being on the wrong side of this issue, but they are silent about Obama:
"In a secret meeting convened by the White House around Thanksgiving, senior national security officials ordered agencies across the U.S. government to find ways to counter encryption software and gain access to the most heavily protected user data on the most secure consumer devices, including Apple Inc.’s iPhone, the marquee product of one of America’s most valuable companies, according to two people familiar with the decision."
In 2007-8, Obama the candidate railed against warrantless surveillance as allowed under the Patriot Act.[1]
Yet, when he came into power, he changed his position and supported all of the NSA's programs. It was only after the Snowden revelations that he said "we need to have a national dialog" and then pushed through a law that slightly narrowed the scope of surveillance but apparently left the core of the programs intact. To our knowledge, the NSA still has a tap on AT&T's Atlantic hub that can scan billions of packets an hour. The NSA still is as capable as before, only perhaps a bit more circumspect about it. Little has changed.
Obama already gave up this fight. He came into office promising to fight it, then did the exact opposite. I guess I'm saying: we already know where he stands, so he's not interesting anymore.