Seriously, who are they expecting to pay for that? AI vision detection run against _every image every person sends to anyone_, among other things, will get ridiculously expensive.
Half of the reason Microsoft is pushing "AI PCs" with special hardware is so they can move their spying on-device and cut the extra data-processing costs that things like automatic screenshot analysis every few seconds would otherwise require.
And they're pretty much experts on spying on users. They've been collecting so much data for so long that apparently they've found a way to utilize what they collect so the costs balance out in the end. Whether that's through government access, preferential antitrust treatment, or some actual financial method that directly affects the bottom line, I don't know. Somehow it's worthwhile for them. BUT -- when even Microsoft is looking for more efficient ways to spy on people, and forcing new hardware to support that effort, you know the data collection and analysis technique is definitely not ready to be made a legal mandate.
It doesn't make sense at all for some EU decision makers to decide it's acceptable for their citizens to bear the cost of so much data processing.
...wait, how much do large players in AI contribute to these politicians' campaigns? Or if not them, who is really pushing this? It seems like someone should really try following the money on this one.
Let me take the other side here. The Western world can't survive without sovereignty. I do realize it sounds bad that a few states would have such power. But make no mistake: if they don't do it, other actors will, and I think your interests align with a democratic state much more than with those other, crooked actors.
It's just a matter of lesser evil in my humble opinion.
I'd grant you that if voter opinion were not influenced by media reports. However, since that is pretty much impossible, no... there is actually no way the "overwhelming majority" will ever want that without being influenced...
You answered yourself: remember Apple’s implementation of CSAM detection.
We don’t own our devices anymore, and we now have very limited control over what is executed on them, so there is nothing stopping developers from running this legal spyware on the device. Our only option, if we don’t like what an app does, is not to use it.
I don't know how to help folks who didn't treat the Apple CSAM fiasco as a massive wake-up call to ditch the ecosystem.
We have Linux phones these days, plus CalyxOS and GrapheneOS. There really isn't a reason to give up on general computing (ignoring the proprietary baseband blobs).
Apple's backing down on that very sound initiative was a failure and a red flag indeed. They had the chance and the weight to pull it off and set a standard, but instead they basically opened the door for regulators to implement whatever spying laws they want.
Sadly, nothing else comes even close to Apple in terms of security and privacy, especially for someone who is not an infosec specialist and doesn't have time to read CVEs all day.
> Sadly, nothing else comes even close to Apple in terms of security and privacy, especially for someone who is not an infosec specialist and doesn't have time to read CVEs all day.
Even someone who is not an infosec specialist should be using something like GrapheneOS on their phone and something like Qubes as their desktop OS.
Apple isn't great at all, honestly, at least in terms of macOS security: they mostly benefit from not being worth the time to target.
While you make good points, I still wouldn't trust Apple not to scope-creep over time. Client-side scanning of hashes for CSAM presents the entry point they need to establish client-side scanning as a norm. It's the proverbial inch. Give it a year, or months even, and watch that grow to include scanning of text for terroristic threats, or of teens' chats for grooming, or depression, etc. Then watch that data become a gold mine for both the government and advertisers.
The slope is so slippery that it's not worth the risk, imo. It paves the way to reduce general computing even further, which is already quite restricted on Apple devices to begin with.
Apple's proposed algorithm was probably the best so far.
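For anyone unfamiliar with how that kind of system works at its core: a perceptual hash of each on-device photo is compared against a database of hashes of known illegal images, flagging only near-matches. Here's a toy sketch of just the matching step, assuming 64-bit perceptual hashes and a Hamming-distance threshold (Apple's actual design layered private set intersection and threshold secret sharing on top, so the device never learns the database and the server never sees non-matching hashes):

```python
# Toy illustration of perceptual-hash matching. This is NOT Apple's
# actual algorithm; it just shows the basic near-match idea being
# debated in this thread.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

def matches_database(photo_hash: int, db_hashes: set, threshold: int = 4) -> bool:
    """Flag a photo if its hash is within `threshold` bits of any known hash."""
    return any(hamming_distance(photo_hash, h) <= threshold for h in db_hashes)

# Hypothetical hash values, for illustration only.
db = {0xDEADBEEFCAFEF00D, 0x0123456789ABCDEF}
print(matches_database(0xDEADBEEFCAFEF00F, db))  # 1 bit off a known hash
print(matches_database(0xFFFFFFFFFFFFFFFF, db))  # far from both
```

The whole controversy is about where that comparison runs: once the matching loop executes on your own device, expanding `db` to cover new categories of content is purely a policy decision, not a technical one.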
The problem is not going away; we in tech are partly responsible, and we should promote good ways to deal with it. If we don't, a solution will be found anyway — it'll just be a bad one.