
There's a huge gap between "we will scan our servers for illegal content" and "your device will scan your photos for illegal content" no matter the context. The latter makes the user's device disloyal to its owner.


The choice was between "we will upload your pictures unencrypted and do with them as we like, including scanning them for CSAM" vs. "we will upload your pictures encrypted and keep them encrypted, but will check beforehand, on your device only, that there is no known CSAM among them".
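
To make the second option concrete, here is a drastically simplified sketch of the general idea: check a photo against a list of known hashes on the device, before anything is uploaded. Apple's actual proposal differed in important ways (a perceptual NeuralHash rather than an exact hash, encrypted "safety vouchers" uploaded alongside every photo, and a match threshold before anything is flagged), so every name and detail below is an illustrative assumption, not Apple's API.

    import Foundation
    import CryptoKit

    // Simplified sketch: check a photo against a list of known hashes on the
    // device, before upload. All names here are illustrative, not Apple's API.

    let knownHashes: Set<String> = []   // hypothetical vendor-supplied hash list

    func matchesKnownContent(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }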


> we will upload your pictures unencrypted and do with them as we like

Curious, I did not realize Apple sent themselves a copy of all my data, even if I have no cloud account and don't share or upload anything. Is that true?


No. The entire discussion only applies to images being uploaded (or about to be uploaded) to iCloud. By default in iOS all pictures are saved locally only (so the whole CSAM scanning discussion would not have applied anyway), but that tends to fill up a phone pretty quickly.

With the (optional) iCloud account, you can (optionally) activate iCloud Photos to have your photo library backed up in the cloud and shared among all your devices, with only thumbnails and metadata stored locally and the full-resolution pictures downloaded on demand.

These are always encrypted: either Apple holds the keys ("Standard Data Protection"), so that they're recoverable when the user loses their phone or password, or they're end-to-end encrypted ("Advanced Data Protection") if the user so chooses, and thus irrecoverable.

It seems to me that in the latter case images are not scanned at all (neither on device nor in the cloud), and it's unclear to me whether they're scanned in the former case.

https://support.apple.com/en-us/102651
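
The practical difference between the two modes is who holds the decryption keys, which is also what decides whether any scanning could happen server-side at all. A toy model of that distinction (just an illustration of the linked support page, not Apple's implementation):

    // Toy model of the key-custody difference; not Apple's code.
    enum ICloudProtection {
        case standard   // Apple escrows the photo keys: recoverable, and decryptable in the cloud
        case advanced   // end-to-end encrypted: only the user's devices hold the keys
    }

    func providerCanDecryptPhotos(_ mode: ICloudProtection) -> Bool {
        switch mode {
        case .standard: return true    // server-side scanning is at least technically possible
        case .advanced: return false   // any scanning would have to happen on the device
        }
    }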


Apple doesn't do this. But other service providers do (Dropbox, Google, etc).

Other service providers can scan for CSAM from the cloud, but Apple cannot. So Apple might be one of the largest CSAM hosts in the world, due to this 'feature'.


> Other service providers can scan for CSAM from the cloud

I thought the topic was on-device scanning? The great-grandparent claim seemed to be that Apple had to choose between automatically uploading photos encrypted and not scanning them, vs. automatically uploading photos unencrypted and scanning them. The option for "just don't upload stuff at all, and don't scan it either" was conspicuously absent from the list of choices.

Why, do other phone manufacturers do this auto-upload-and-scan without asking?


I think FabHK is saying that Apple planned to offer iCloud users the choice of unencrypted storage with server-side scanning, or encrypted storage with client-side scanning. It was only meant to be for things uploaded to iCloud, but deploying such technologies for any reason creates a risk of expansion.

Apple itself has other options, of course. It could offer encrypted or unencrypted storage without any kind of scanning, but has made the choice that it wants to actively check for CSAM in media stored on its servers.


And introduces avenues for state actors to force the scanning of other material.

This was also at a time when Apple hadn't pushed out E2EE for iCloud, so it didn't even make sense.


This ship has pretty much sailed.

If you are storing your data in a large commercial vendor, assume a state actor is scanning it.


I'm shocked at the number of people I've seen on my local news getting arrested for it lately, and it all comes from the same kind of starting tip:

"$service_provider sent a tip to NCMEC" or "uploaded a file matching a hash known to NCMEC", with the provider ranging from Gmail and Google Drive to iCloud and a few others.

https://www.missingkids.org/cybertiplinedata
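
The linked CyberTipline data is the output of a provider-side pipeline that is conceptually simple: hash what users upload, compare against known-hash lists shared with providers, and file a report on a match. The sketch below is a stand-in under loose assumptions (exact SHA-256 instead of PhotoDNA-style perceptual hashing, and made-up types instead of any real reporting API):

    import Foundation
    import CryptoKit

    struct CyberTipReport {               // illustrative type, not NCMEC's schema
        let fileHash: String
        let accountID: String
        let timestamp: Date
    }

    let knownHashList: Set<String> = []   // hypothetical list shared with providers

    // Returns a report to file if an uploaded file matches a known hash.
    func scanUpload(_ data: Data, accountID: String) -> CyberTipReport? {
        let hex = SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
        guard knownHashList.contains(hex) else { return nil }
        return CyberTipReport(fileHash: hex, accountID: accountID, timestamp: Date())
    }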

"In 2023, ESPs submitted 54.8 million images to the CyberTipline of which 22.4 million (41%) were unique. Of the 49.5 million videos reported by ESPs, 11.2 million (23%) were unique."


And, indeed, this is why we should not expect the process to stop. Nobody is rallying behind the rights of child abusers and those who traffic in child abuse material. Arguably, nor should they. The slippery slope argument only applies if the slope is slippery.

This is analogous to the police's use of genealogy and DNA data to narrow searches for murderers, whom they then collect evidence on by other means. There is risk there, but (at least in the US) you aren't going to find a lot of supporters of the anonymity of serial killers and child abusers.

There are counter-arguments to be made. Germany is skittish about mass data collection and analysis because of the perception that it enabled the Nazi war machine to micro-target its victims. The US has no such cultural narrative.


> And, indeed, this is why we should not expect the process to stop. Nobody is rallying behind the rights of child abusers and those who traffic in child abuse material. Arguably, nor should they.

I wouldn't be so sure.

When Apple was going to introduce on-device scanning, they actually proposed to do it in two places.

• When you uploaded images to your iCloud account, they proposed scanning them on your device first. This is the one that got by far the most attention.

• The second was to scan incoming messages on phones that had parental controls set up. The way that would have worked is:

1. If it detects a sexually explicit image, it blocks the message, alerts the child that it contains material their parents think might be harmful, and asks the child if they still want to see it. If the child says no, that is the end of the matter.

2. If the child says they do want to see it and the child is at least 13 years old, the message is unblocked and that is the end of the matter.

3. If the child says they do want to see it and the child is under 13, they are again reminded that their parents are concerned about the message, asked again if they want to view it, and told that their parents will be notified if they do view it. If the child says no, that is the end of the matter.

4. If the child says yes, the message is unblocked and the parents are notified.
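
Those four steps reduce to a small decision flow, roughly like this (the types and names are mine, not Apple's):

    enum MessageOutcome {
        case blocked
        case shown(parentsNotified: Bool)
    }

    // Steps 1-4 from the list above, as a single decision flow.
    func handleFlaggedImage(childAge: Int,
                            wantsToView: Bool,
                            confirmsAfterWarning: Bool) -> MessageOutcome {
        guard wantsToView else { return .blocked }                    // step 1
        if childAge >= 13 { return .shown(parentsNotified: false) }   // step 2
        guard confirmsAfterWarning else { return .blocked }           // step 3
        return .shown(parentsNotified: true)                          // step 4
    }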

This second one didn't get a lot of attention, probably because there isn't really much to object to. But I did see one objection from a fairly well-known internet rights group. They objected to #4 on the grounds that the person sending the sexual pictures to your under-13-year-old child addressed them to the child, not the parents, so it violates the sender's privacy for the parents to be notified.


If it's the EFF, I think they went out on a limb here in a way that not a lot of American parents would agree with. "People have the right to communicate privately without backdoors or censorship, *including when those people are minors*" (emphasis mine) is a controversial position. Arguably, not having that level of privacy is the curtailment of children's rights.


> The US has no such cultural narrative.

The cultural narrative is actually extremely popular with a 10% subset of the population that is essentially fundamentalist Christian and terrified of the government branding them with "the mark of the beast".

The problem is that their existence actually poisons the discussion, because these people are absurd loons who also blame the gays for hurricanes and think the Democrats eat babies.


Apple is already categorizing content on your device. Maybe they don't report what categories you have. But I know if I search for "cat" it will show me pictures of cats on my phone.
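
For what it's worth, the kind of on-device classification that makes a "cat" search work is exposed to apps through the public Vision framework. Photos' own pipeline is private, so this is only an approximation of what it does, and the 0.3 confidence threshold is an arbitrary choice:

    import Foundation
    import Vision

    // On-device image classification; no image data leaves the phone.
    func labels(forImageAt url: URL) throws -> [(String, Float)] {
        let request = VNClassifyImageRequest()
        let handler = VNImageRequestHandler(url: url)
        try handler.perform([request])
        return (request.results ?? [])
            .filter { $0.confidence > 0.3 }          // keep reasonably confident labels
            .map { ($0.identifier, $0.confidence) }  // e.g. ("cat", 0.92)
    }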


Yeah, it’s on by default and I’m not even sure how to turn off the visual lookup feature :/

Yet another reason why my next phone will be an Android.



