
> any company storing images on their infrastructure in the US must report pedophilic images to the US government

Ones they know of…

> What other technical approach are people advocating for?

Apple already has a technical solution, encryption.



> Apple already has a technical solution, encryption.

How does encryption help prevent porn being sent to pre-teens?


That’s a completely different feature than the one we’re discussing. These things were announced together, but they are not the same.

Nobody is objecting to opt-in clientside content filtering.


Both of the features involve opt-in client side content filtering.

The only objections are to that.


The CSAM scanning is not opt-in.

Sure, you could stop using iCloud. That’s opt-out.


That’s not correct. This applies only to iCloud Photo Library, not to iCloud as a whole.

iCloud Photo Library is an optional feature, and there are numerous alternatives.


Doesn’t matter, it’s still opt-out if you were using iCloud Photo Library before these features were announced.

It’s really ridiculous to try to call this “opt-in”.


I was using iCloud Photo Library before this was announced.

None of my photos have been scanned, nor ever will be unless I choose for them to be. I don’t have to do anything to achieve this. They won’t scan anything unless I decide to go ahead.

That is the very meaning of opt-in.

Opt-out typically means that someone will go ahead with something unless you decline. This is not that.

I do agree that if I don’t want on-device scanning in future, I will need to choose another cloud photo service, but in the meantime, nothing will be scanned without me taking positive action to initiate it.


> Opt-out typically means that someone will go ahead with something unless you decline. This is not that

That’s exactly what this is. If you use iCloud Photos your pictures will be scanned unless you explicitly disable iCloud Photos.

How is that not opt-out? You never get asked if you’d like to opt-in to have your images scanned for CSAM.


Encryption does not help; Apple is still responsible. If Apple intends to let users store photos in iCloud (or send them via iMessage) encrypted, they either have to keep the keys, so they can decrypt and scan the photos, or have to keep the user from uploading incriminating content. Apple found a third way: they only get to reconstruct the keys if the user uploads too many pictures that trigger alarms.
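The "third way" in that last sentence is a threshold scheme: the server can reconstruct a decryption key only once enough matching uploads accumulate. A minimal sketch of that idea using Shamir secret sharing (the field size, parameters, and names here are illustrative assumptions, not Apple's actual protocol, which also involves private set intersection):

```python
import random

P = 2**61 - 1  # a Mersenne prime; real systems use much larger fields

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = 123456789          # stands in for the account's decryption key
shares = make_shares(key, threshold=3, count=10)
assert recover(shares[:3]) == key   # any 3 shares reconstruct the key
assert recover(shares[5:8]) == key
```

With fewer than `threshold` shares the interpolation yields an unrelated value, which is why a handful of matches below the threshold reveals nothing.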


Source? I am not aware of a law in the US that requires Apple to actively scan images, or to store them unencrypted (or keep copies of the keys).


The US isn't the only government with a stake in this. And countries like China, Saudi Arabia, and the Emirates have a lot of leverage, financial and diplomatic. Heck, Facebook bowed to Myanmar just to get the users there.


Every cloud infrastructure provider is required to do that. Turning a blind eye does not take the duty away; you must actively pursue it. Encryption would just set off a flood of new laws.

https://www.govinfo.gov/app/details/USCODE-2011-title18/USCO...


Tarsnap exists, so either it is legal when done right, or Tarsnap is a dead man walking, and I haven't heard anything to that effect from any credible source.


I guess that service falls somewhat outside the scope of active scanning, because it is a general backup service, not a cloud service specifically for storing and sharing photos.


And that is my point: by tying oneself to the mast, denying oneself the ability to navigate toward the sweet, sweet sound of user data, it becomes possible to sail straight past the sirens.

Today this is less about physically tying up management and putting wax in the crew's ears, and more about making oneself technically and legally unable to touch the juicy, juicy customer data.


Those laws do not exist (yet?). You can’t justify this as a compliance measure for legislation that does not exist.


Yes, but current laws also restrict storing images as E2E encrypted, so there is a dilemma?


Where are you getting this from? That’s simply not true. It’s perfectly legal to “store images as E2E encrypted”


Encryption itself is not illegal, but it might make it harder to comply with other legal requirements. I have heard this many times, and now I have read the whole law (curse me). Section 2258A(f) in particular says there is no requirement to proactively search for CSAM material. However, if NCMEC shares information about specific visual depictions and asks the provider to stop their redistribution, the provider is required to comply in some cases, for example when NCMEC shares hashes that should be blocked. To be able to block that data, a search is required, and complying with that under E2E encryption is not possible.
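The tension described above can be sketched as a plain server-side hash-list check. The blocklist contents and the XOR "encryption" below are made-up stand-ins, not anything NCMEC or Apple actually uses:

```python
import hashlib

# Hypothetical blocklist: a stand-in for hash values of known material
# that a clearinghouse might share with a provider.
blocked = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def must_block(file_bytes: bytes) -> bool:
    """Server-side check: does this file's hash appear on the blocklist?"""
    return hashlib.sha256(file_bytes).hexdigest() in blocked

# A plaintext upload matches, but once the client encrypts end to end,
# the server only ever hashes ciphertext, which will not match the list.
ciphertext = bytes(b ^ 0x5A for b in b"known-bad-image-bytes")  # toy cipher
assert must_block(b"known-bad-image-bytes") is True
assert must_block(ciphertext) is False
```

This is why on-device matching before encryption is the only place such a check can run once the server holds only ciphertext.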


Please show where those legal requirements have been applied to E2E encrypted files.



