
I find it laughable whenever someone says "this is the last straw" because it just shows how incredibly misinformed they are.

Yes, backdooring E2E encryption in general is a bad idea. However, consider two things:

* iCloud Photos was never E2E encrypted in the first place. They can already scan your photos all they want server-side, and they have been scanning for CSAM since 2019, while Google has been scanning for it since 2009. Yes, if iCloud Photos were to become E2E encrypted, leaving in a backdoor like this could be bad, but it's still the lesser of two evils. Would you rather they keep photos non-E2E forever and have even more unfettered access to them than a "backdoor" allows? It does NOT scan photos that are not uploaded to the cloud, despite being on-device. And it's important to note the threshold and manual human review system put in place before the authorities receive any notification at all.

* For iMessage, all this entails is warning children under 18 about explicit content, and optionally notifying parents if the child is under 13 and the parent opted in. (I don't think it even sends the photo itself to the parents, but that's not explicitly clarified anywhere.) At no point do Apple or the authorities learn the contents of E2E encrypted iMessages. (Also worth noting: if you use iCloud Backup, your messages are no longer E2E encrypted in the backup, as Apple holds the keys to that. This was true even before the new system was introduced.)



> It does NOT scan photos that are not uploaded to the cloud, despite being on-device.

Yet. Once it's on the device, it's a MUCH smaller step to use it in other ways. It's certainly easier for governments to argue that they should be able to force it to be used arbitrarily... you know, for the children/terrorists/etc.

> And it's important to note the threshold and manual human review system put in place before the authorities receive any notification at all.

Until it's not. Once again, once it's in place, it's a lot easier for malevolent actors (governments) to force it to be used other ways.

This is a back door. Plain and simple. The fact that it's not _currently_ going to be used for evil (depending on your definition of evil) does not mean it won't be in the near future. Back doors are bad. How many times does this need to be said?


> Yet. Once it's on the device, it's a MUCH smaller step to use it in other ways

We crossed this bridge a long time ago. Apple already has on-device neural nets processing every one of your on-device photos. That's what powers Spotlight search and "photo memories".

The simple fact of the matter is that this isn't the top of some slippery slope; it's halfway down one. A slope we started down when we figured out how to put powerful neural nets on mobile devices in people's pockets.

> Until it's not. Once again, once it's in place, it's a lot easier for malevolent actors (governments) to force it to be used other ways.

Which is why Apple's current solution makes it cryptographically impossible to decrypt photos until a large enough number of suspect photos have been uploaded.
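
To make that threshold property concrete, here is a minimal sketch of Shamir secret sharing in Python. It illustrates only the general technique, not Apple's actual construction (which layers threshold secret sharing on top of a private set intersection protocol); the secret stands in for a per-account decryption key that the server can reconstruct only after `threshold` shares, one per matching photo, have been uploaded.

    # Minimal Shamir secret-sharing sketch (illustrative only, not Apple's scheme).
    # The secret plays the role of a per-account decryption key; the server gets
    # one share per matching photo and can rebuild the key only at the threshold.
    import random

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy ~127-bit secret

    def make_shares(secret: int, threshold: int, num_shares: int):
        """Split `secret` into points on a random degree-(threshold-1) polynomial."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        return [
            (x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, num_shares + 1)
        ]

    def recover_secret(shares):
        """Lagrange interpolation at x = 0; needs at least `threshold` shares."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    key = random.randrange(PRIME)  # per-account key, never sent to the server
    shares = make_shares(key, threshold=30, num_shares=100)

    assert recover_secret(shares[:30]) == key  # at the threshold: key recoverable
    assert recover_secret(shares[:29]) != key  # below it: wrong value, with
                                               # overwhelming probability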


The key difference, of course, is that when the neural network classifies certain types of content, it doesn't forward it to a centralized server "for review".


And depending on that review you could find yourself on the other end of some "questioning" from law enforcement.

Yes, you might laugh and say that won't happen, but on-device scanning is the first step.

In less trustworthy countries it's not that farfetched to imagine what this can be used for.

So Apple must back down now or face the consequences in the form of loss of reputation and eventually loss of sales.


> Yet

I keep seeing this jump. There's no evidence this will happen. Apple can already technically do anything they want to compromise the security of your device in the next software update, so could Google or Samsung or any other company. But when in Apple's history have they done this? There is zero reason to believe this is the next step other than speculation and fear mongering.


> Apple can already technically do anything they want to compromise the security of your device in the next software update

But they're making it easier for governments to come along and force them to do more. Or even for themselves, but I tend to think they're less of an issue.

I know "it's a slippery slope" gets overused... but if you keep taking baby slips down that slope, it only gets slipperier. You should avoid taking as many of those steps as possible.


Anyone can imagine a hypothetical future feature and oppose it. What if Apple one day replaces all my music with Best of ABBA? That would be terrible, but they haven't done or proposed it, so why argue about it?


Because that's not what's being argued here. Nobody in power cares enough to mass-load ABBA onto your phone. But there are very powerful nation states that care, more than they care about anything else, about maintaining power at any cost.


Could anyone have imagined law enforcement using coronavirus contact-tracing data for other purposes?

Because that actually happened, and in a democratic country even.

So it's not hard to imagine what less democratic countries could demand of Apple.

https://www.abc.net.au/news/2021-06-29/queensland-coronaviru...


> But they're making it easier for governments to come along and force them to do more. Or even for themselves, but I tend to think they're less of an issue.

It is as easy as it has always been. The only problem is that this might give them new ideas. As most politicians are probably non-tech people, they don't know what is possible.

For a tech person, functionality like this (on-device scanning and flagging) is super trivial to add. Antivirus engines have existed for decades.
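
To see why that is "super trivial" in the mechanical sense (the hard parts of Apple's design are the perceptual NeuralHash and the privacy layer, not the matching), here is a bare-bones Python sketch of the kind of hash-based flagging loop antivirus engines have run for decades. The blocklist entry and the scanned directory are hypothetical placeholders.

    # Bare-bones on-device hash-matching sketch. The real system matches a
    # perceptual NeuralHash under a blinded protocol; this is just plain SHA-256
    # against a hypothetical local blocklist, antivirus-style.
    import hashlib
    from pathlib import Path

    # Hypothetical database of known-bad hashes (hex-encoded SHA-256).
    BLOCKLIST = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def scan(directory: Path) -> list[Path]:
        """Return every file under `directory` whose hash is in BLOCKLIST."""
        return [p for p in directory.rglob("*")
                if p.is_file() and sha256_of(p) in BLOCKLIST]

    if __name__ == "__main__":
        for flagged in scan(Path.home() / "Pictures"):
            print("flagged:", flagged)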


> Would you rather they keep photos non-E2E forever and have even more unfettered access to them than a "backdoor" allows? It does NOT scan photos that are not uploaded to the cloud, despite being on-device.

Yes, I'd rather they do this. The fact that they're implementing on-device checks doesn't suggest to me that they will be deploying E2E encryption. It suggests to me that they will be expanding on-device scanning to all content in the future.

If they were going to make iCloud E2E encrypted, it would be a clear win to announce this at the same time as deploying on device scanning.


Their PR did not handle this well. If you look at the spec, a new encryption level has been added, which allows access by Apple only if the CSAM hash threshold is reached. It is E2EE with a backdoor now.


Unless you have a public reference, I really doubt this is the case.

Because they’d also need to be announcing that you can no longer reset your iCloud password and recover to a new device. And I’ve not seen anything that suggests this.

So I suspect it is encrypted at rest, with a key known to Apple as before as well as this CSAM approach.


There is a public reference on Apple's site[1].

Citing the final phrase of the paper as a TL;DR of their system:

> Apple is able to learn the relevant image information only once the account has more than a threshold number of CSAM matches, and even then, only for the matching images.

This applies only to images, so you can still reset your password. Technically, there are two layers of encryption on images: regular server-side encryption and this "E2EE-like" encryption, which allows access once CSAM matches reach a specific threshold (a rough sketch follows the link below).

[1]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...
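
A rough way to picture those two layers, sketched here with the third-party `cryptography` package purely for illustration (Apple's actual construction uses PSI plus threshold secret sharing, not Fernet): the inner key lives on the device, the outer key is the ordinary server-side key the provider already holds.

    # Sketch of "two layers of encryption on images": an inner, device-held key
    # (the "E2EE-like" layer) wrapped by the regular server-side encryption.
    # Illustration only; not Apple's actual scheme.
    from cryptography.fernet import Fernet

    device_key = Fernet.generate_key()  # generated on device, unknown to the server
    server_key = Fernet.generate_key()  # ordinary at-rest key held by the provider

    photo = b"raw image bytes"

    inner = Fernet(device_key).encrypt(photo)  # opaque to the server...
    outer = Fernet(server_key).encrypt(inner)  # ...then encrypted at rest as usual

    # The server can always strip the outer layer:
    assert Fernet(server_key).decrypt(outer) == inner
    # The inner layer opens only with the device key, e.g. reconstructed from
    # enough matching-image shares in a threshold scheme like the one above:
    assert Fernet(device_key).decrypt(inner) == photo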


This document contains the following:

> As part of setup, the device generates an encryption key for the user account, unknown to Apple.

The question is: how is this key generated? Can it be re-derived from information Apple has? If not, how will Apple handle cases where the user loses or breaks their device?

Is it derived from the iCloud password? Currently Apple can reset your iCloud password and restore access to your images. Will Apple no longer be able to do this in the future?

It’s really unclear to me, and I’d want explicit answers to these questions personally.


This seems to be explained in the white paper for the PSI system[1]. A lot of math is involved, but on page 30 there is a mention that different devices can be used. I am not the one who can explain it well.

[1]: https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...


Sure, different devices can be used if they share the same key, as stated in the document.

But it’s still not clear how that key is derived. It’s not clear, as implemented that Apple do not hold a master key to decrypt all data (as they do currently).

In fact, if the key is randomly generated, you have only one device (as many users do), and you lose that device, do you lose all your data? Even if you have your iCloud password?

It doesn’t make sense. It would be a massive change to how iCloud currently operates and is used. And I find this extremely unlikely.

Right now, you can browse your photos online. Is that functionality going away?

There are seemingly many open questions. But given that there’s no clear statement from Apple, I’m inclined to believe that they retain the ability to decrypt all data.


Most likely you won't be able to browse your photos online anymore, unless they add some kind of method to export keys from the device(s). I speculate that it is possible to lose all of your data if you lose all of your devices. There might be an option to create a local backup of the device keys, so it would not be a dead end.


Given the lack of an explicit announcement this seems very unlikely.

I don't think Apple are stupid; it would have been a clear PR win if they had said "we're adding E2EE".

Given no explicit statement, and how drastically it changes the nature of their service, I don’t think your speculation is justified.


The problem is that this was not supposed to be properly released yet. A misleading leak caused them to hurry. As for E2EE, it is not speculation, because it is literally in the papers I linked.


It's not a backdoor if it's a public part of the system / protocol.


Hmm... that is technically correct.


Exactly. This has all been done for 10 years in various forms.



