I was using iCloud Photo Library before this was announced.
None of my photos have been scanned, nor ever will be unless I choose for them to be. I don’t have to do anything to achieve this. They won’t scan anything unless I decide to go ahead.
That is the very meaning of opt-in.
Opt-out typically means that someone will go ahead with something unless you decline. This is not that.
I do agree that if I don’t want on-device scanning in future, I will need to choose another cloud photo service, but in the meantime, nothing will be scanned without me taking positive action to initiate it.
Encryption does not help; Apple is still responsible.
If Apple intends to let the user store photos in iCloud (or send them by iMessage) encrypted, they either have to keep the keys, so they can decrypt and scan the photos, or keep the user from uploading incriminating content.
Apple found a third way: they only get to reconstruct the keys if the user uploads enough pictures that trigger alarms to cross a threshold.
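The "third way" above is essentially threshold secret sharing: each alarming upload releases one share of a decryption key, and only once enough shares accumulate can the key be rebuilt. A minimal sketch using Shamir's scheme, assuming a simplified model of the design (all names and parameters here are illustrative, not Apple's actual protocol):

```python
import random

PRIME = 2**127 - 1  # prime field modulus

def split_secret(secret, threshold, num_shares):
    """Split `secret` into shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    shares = []
    for x in range(1, num_shares + 1):
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stand-in for a per-account decryption key
shares = split_secret(key, threshold=3, num_shares=5)
assert reconstruct(shares[:3]) == key   # at or above threshold: key recovered
assert reconstruct(shares[:2]) != key   # below threshold: nothing useful
```

Below the threshold, the shares reveal essentially nothing about the key, which is the property that lets the provider hold shares without being able to decrypt.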
The US isn't the only government with a stake in that. And countries like China, Saudi Arabia, and the Emirates have a lot of leverage, financially and diplomatically. Heck, Facebook bowed to Myanmar just to get the users there.
Every cloud infrastructure operator is required to do that. Turning a blind eye does not remove the duty; you must actively pursue it. Encryption would trigger a flood of new laws.
Tarsnap exists, so either this is legal when done right, or Tarsnap is a dead man walking, and I haven't heard anything to that effect from any credible source.
I guess that service falls slightly outside the scope of active scanning, because it is a general backup service, not a cloud service specifically for photo sharing and storage.
And that is my point: by tying oneself to the mast, denying oneself the ability to steer toward the sweet, sweet sound of user data, it becomes possible to sail straight past the sirens.
Today this is less about physically tying down management and putting wax in the crew's ears, and more about technically and legally making oneself unable to touch the juicy customer data.
Encryption itself is not illegal, but it might make it harder to comply with other legal requirements. I had heard this many times, so I read the whole law (curse me). § 2258A(f) specifically says there is no blanket requirement to continuously search for CSAM. However, if NCMEC shares specific information about visual depictions and asks that redistribution be stopped, the provider is required to comply in some cases. For example, if NCMEC shares hashes, matching content must be stopped. Stopping that data requires searching for it, and complying with that under E2E encryption is not possible.
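The hash-based duty described above reduces to comparing each upload's hash against a blocklist. A hedged sketch, using plain SHA-256 for illustration (real systems use perceptual hashes such as PhotoDNA, and the function names here are made up):

```python
import hashlib

def should_block(upload_bytes: bytes, known_hashes: set) -> bool:
    """Return True if the upload matches a known hash and must be stopped."""
    digest = hashlib.sha256(upload_bytes).hexdigest()
    return digest in known_hashes

# Hypothetical blocklist, e.g. hashes shared with the provider.
blocklist = {hashlib.sha256(b"reported content").hexdigest()}

assert should_block(b"reported content", blocklist)       # match: stop it
assert not should_block(b"innocent photo", blocklist)     # no match: pass
```

The catch is exactly the point made above: under E2E encryption the server only ever sees ciphertext, so this server-side comparison cannot run at all.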
Ones they know of…
> What other technical approach are people advocating for?
Apple already has a technical solution: encryption.