
The main argument here is that we are afraid governments can demand, through the courts, that Apple add other hashes beyond the ones they start with.

What is preventing those same governments from going through the same legal process to demand that Apple implement such a system from scratch?



Elections. If people start demanding maximum security, it can become a popular opinion. That can definitely happen, but at the moment installing listening devices on everyone is unpopular.

It's useful to remember that organisations are made of people with individual aspirations. Some would really like listening devices installed on everyone so that they can do their jobs better (usually the intelligence people, the police, etc.); others will want it for less "acceptable" reasons. But at the end of the day, the careers of the people who can allow or disallow it will be determined by the general population in a very public manner.


I would assume that elections aren't a guaranteed option in this thought experiment.


Of course, you can have coups, you can have wars, etc. That's why people should be vigilant about keeping democracy intact: go vote and maintain a healthy level of interest in the political process.


> Elections.

The problem: at least 90% of the population doesn't care about privacy, and scandals (see Facebook, etc.) have no influence on this whatsoever, so awareness campaigns are unlikely to work either.


Generally the US government can't compel work to be done through subpoenas. Adding a hash is probably not considered work, but writing a bunch of code is. Typically the government can only compel parties to divulge information - such as "use your existing systems to tell us which users have this image on their phone".
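The hash-vs-code distinction can be sketched in a few lines. This is purely illustrative: the function names are made up, and it uses a plain SHA-256 digest rather than the perceptual hash (NeuralHash) Apple actually proposed, so treat it as a toy model, not Apple's design.

```python
# Toy model of "use your existing system to find users with this image".
# Assumption: the matcher and hash scheme below are hypothetical, not Apple's.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    """Hash an image's bytes. Real systems would use a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_matches(photos: dict, known_hashes: set) -> list:
    """The 'system': return the names of photos whose hash is on the list."""
    return [name for name, data in photos.items()
            if image_hash(data) in known_hashes]

# Compelling Apple to "add a hash" is just a data change -- one new
# entry in known_hashes, no new code:
known_hashes = {image_hash(b"original-target-image")}
known_hashes.add(image_hash(b"court-ordered-addition"))

photos = {"a.jpg": b"holiday snap", "b.jpg": b"court-ordered-addition"}
print(flag_matches(photos, known_hashes))  # ['b.jpg']
```

The point of the sketch: once `flag_matches` exists, a court order only has to touch the data (`known_hashes`); if it doesn't exist, the order has to compel someone to design, write, test, and ship all of the code above, which is the "work" courts generally can't demand.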

Forcing citizens to perform work would be reserved for something that's part of a punishment for a crime conviction, such as prison labor. This would be "design, implement, test, release a brand new system to tell us which users have this image on their phone."

There's some academic debate over whether the All Writs Act can compel work/action/speech, but that's the only avenue I'm aware of, and it's on pretty shaky ground for something like this.

If the system doesn't exist, Apple could fight an All Writs Act-type request. If the system does exist, they probably can't effectively fight a FISA order or subpoena to add some hashes.

If they had never invented it, they might have been able to tell China "it's not technically feasible" and potentially convince Chinese leadership that it's even true. But at this point that game is up, and China could definitely compel Apple to finalize and release this.

https://www.justsecurity.org/29634/readers-guide-magistrate-...

https://www.newyorker.com/news/amy-davidson/a-dangerous-all-...

https://arstechnica.com/tech-policy/2016/03/feds-used-1789-l...

https://thefederalist.com/2016/02/19/cut-the-crap-apple-and-...


In this kind of situation I'd say the more realistic "risk" isn't subpoenas or court orders or whatever, at least not standing alone. Instead you just have the government mandate, through positive law, that vendors provide this kind of capability. Then later legal processes like subpoenas or other orders simply use the capabilities you mandated.

Think basically CALEA, which required telecom providers to change their systems to better enable law enforcement wiretaps, as existing methods weren't quite keeping up with telecom digitization. You don't need to use the wiretap orders themselves to make AT&T build in your access points.

Content matching by government mandate for various things feels kind of inevitable... It's already out there in terms of plans for requiring it in the copyright sphere, and obviously for CSAM.

Anyway, I don't know that whether Apple actually deploys this system makes that much of a difference in this sense: the idea is already out there. Which is basically what you said.


Yeah, it's legislation that Apple is trying to get ahead of here. Recall the conversation about government-mandated backdoors last year, where Graham threatened Apple that either it would do something or legislation would [1].

There are also the emails that came out in which Apple acknowledged they think they have a CSAM problem in iCloud Photos and don't want it there [2].

I think a lot of people are missing the larger playing field when discussing this issue. Apple is going to do something either by choice or by force.

[1] https://9to5mac.com/2020/02/21/backdoor-to-encryption/

[2] https://appleinsider.com/articles/21/08/20/apple-exec-said-i...


Super interesting. But what about the case where the hypothetical government develops the system themselves and requires Apple to add it to their product? Of course Apple could refuse and possibly be blocked from operating in that market. I just find the whole situation fascinating.


Whistleblowers



