Such a Faustian bargain: giving up the ability to store child porn in the cloud in exchange for reliable backups and seamless access across devices... I can't believe people put up with this!
And giving up the ability to carry a gun in exchange for safe streets for everyone! Lunacy! /s
If it would 100% eliminate child pornography and not produce a residual erosion of personal privacy, no one would be arguing against it. It won't solve the problem. It will erode privacy. It's perfectly reasonable not to be OK with it.
Why does the elimination have to be 100% and the privacy erosion be 0 for it to be worthwhile?
To be clear I'm not saying that Apple are right to do this or that there aren't privacy concerns. What I am saying is that those thresholds seem unreasonable.
Because that's the only way an individual's value system wouldn't come into play. If the problem is anything less than completely fixed and the privacy erosion anything more than zero, then it just depends on how important privacy is to the person. I personally think privacy is super important, and it would take massive changes in how society and our government work before I was willing to give it up for just about anything.
Sure. It's a bad faith argument though. Nowhere in the world, with or without guns, has absolutely safe streets. So where on the sliding scale of safe it becomes worth it is subjective.
What information leaks from the system? That a user isn't storing child porn on Apple servers? That's literally the only information emitted by the system _unless_ you have actual child porn on your system.
One, software sucks and is full of security holes, so whether or not Apple intends to leak any other data is irrelevant, because other data will leak regardless.
Two, you're assuming this is the only thing they are going to intentionally do with your data. That will change. In an infinite future, things change. Every decision is eventually overturned, and it's luck of the draw which direction that change is in.
Three, the U.S. defence apparatus has proven time and again that they will violate the law and use whatever means necessary to spy on who they want, and this is another kitchen window for them to climb through.
> Can you point to a single instance where it was used for ulterior purposes?
That's the sick genius of parallel construction: without leaks, we'd never be able to know. (And yes, there are plenty of instances of abuse, brought to the public despite the risk whistleblowing entails.)
Ask yourself this: even if the "right" people are in control of this system now, could the "wrong" people ever gain control of it? If the answer is yes, maybe the system should not exist.
Then to be consistent, you cannot use closed-source, proprietary systems. That's the trade-off.
You _NEVER_ know what Apple is doing under the covers. This principle is well understood and has been espoused by RMS for over a decade.
But saying "well Apple could use this system for evil if they want..." is silly. Apple could ALWAYS use their system for evil if they wanted. This addition does nothing to improve that position.
> This system already exists on iCloud. It's been in use for 2 years.
Source? That is not true as far as I'm aware.
> Can you point to a single instance where it was used for ulterior purposes?
What do you expect them to do when they receive a national security letter? Shut down? Go to prison for decades? Of course not. They will comply, as they have always done when the technical means to do so have been available. Why do you think we would even be told about it in situations where it can be kept secret?
> What do you expect them to do when they receive a national security letter?
They can't do _anything_ with an NSL on your data because they don't have the encryption keys. Without this feature, they control the key. With this feature, you control the key, and they only gain access when more than N tokens cryptographically indicate hits on the CSAM database. Those tokens have to come from _your_ device, and they only give access to the data in question.
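To make the threshold mechanism concrete, here is a toy Shamir-style sketch in Python. It is nothing like Apple's actual code, and the prime, threshold, and key are invented for illustration; the idea is just that the account key is split into shares, one share rides along with each matching image, and the server can reconstruct the key only once it holds at least the threshold number of shares:

    import random

    PRIME = 2**127 - 1  # toy field size; a real system would differ

    def make_shares(secret, threshold, count):
        """Split `secret` into `count` shares; any `threshold` of them recover it."""
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, count + 1)]

    def recover(shares):
        """Lagrange interpolation at x=0; correct only with >= threshold shares."""
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    account_key = 0xC0FFEE  # stand-in for the per-account decryption key
    shares = make_shares(account_key, threshold=3, count=5)

    assert recover(shares[:2]) != account_key  # below threshold: garbage
    assert recover(shares[:3]) == account_key  # at threshold: key recovered

Below the threshold the interpolation yields an essentially random field element, which is why one or two stray matches give the server nothing.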
Everyone understands that Apple can do anything with unencrypted content, but now Apple can also identify encrypted content, because they make the phone hash it before it's encrypted. Apple is essentially lying when they say this can only be used to identify child abuse material, because it can clearly be used to identify whatever Apple tells it to identify. The system doesn't care whether it's used to target whistleblowers or child abusers. Apple doesn't even need to know what content they are made to target: the government could just give them a hash and demand to know who has a file corresponding to it, with zero risk that an Apple employee will leak anything the government wants to suppress.
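To see why the matching is content-agnostic, consider this deliberately toy Python sketch (SHA-256 stands in for a perceptual hash like NeuralHash, XOR stands in for real encryption, and every name here is made up): the match runs on plaintext before encryption, and nothing in the pipeline knows or cares what a target hash actually denotes:

    import hashlib
    import os

    # Whatever database the device is shipped; entries are opaque hashes.
    target_hashes = {
        hashlib.sha256(b"known abuse image bytes").hexdigest(),
        hashlib.sha256(b"leaked memo the government wants found").hexdigest(),
    }

    def xor_encrypt(plaintext: bytes, key: bytes) -> bytes:
        """Toy stand-in for real encryption; never use XOR in practice."""
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

    def upload(plaintext: bytes, user_key: bytes):
        # The check happens on plaintext, BEFORE encryption, so encrypting
        # afterwards does not hide the file from the matcher.
        flagged = hashlib.sha256(plaintext).hexdigest() in target_hashes
        return xor_encrypt(plaintext, user_key), flagged

    _, flagged = upload(b"leaked memo the government wants found", os.urandom(32))
    print(flagged)  # True: the server learns of the match without any key

Swap one entry in `target_hashes` and the exact same code hunts for whatever the new hash points at.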
There is no reason Apple can't give you E2E encryption without any backdoors or spying features. They should act on abuse when it's reported to them by someone who can give them the key, but otherwise not concern themselves with the content that's stored.
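For comparison, straightforward client-side encryption with a user-held key needs no scanning hooks at all. A minimal sketch using the third-party Python `cryptography` package (the function names are mine, not any real Apple API):

    from cryptography.fernet import Fernet

    user_key = Fernet.generate_key()  # lives only on the user's devices

    def encrypt_for_upload(plaintext: bytes) -> bytes:
        # The server stores an opaque blob; it never sees user_key.
        return Fernet(user_key).encrypt(plaintext)

    def handle_abuse_report(ciphertext: bytes, key_from_reporter: bytes) -> bytes:
        # Moderation happens only when someone who holds the key (e.g. the
        # recipient of abusive material) volunteers it with their report.
        return Fernet(key_from_reporter).decrypt(ciphertext)

    blob = encrypt_for_upload(b"private photo")
    assert handle_abuse_report(blob, user_key) == b"private photo"

The provider acts on what reporters can decrypt for them and stays blind to everything else.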