The average person isn't even aware that VLC is an illegal circumvention tool in the United States under the DMCA. I fail to see how restrictions on encryption software would have a different outcome. Encryption is built into so many products, it's part of daily life and most people don't even know they're using it.
> I fail to see how restrictions on encryption software would have a different outcome.
The enforcement of any law passed would of course be impossible.
The reason we fight it is the implications. As long as our government and people insist on access to such encrypted communications, our society won't be focused on the right ways to keep each other safe.
So, we should educate each other on the facts and prepare each other for a future where more and more criminals will communicate in hidden ways. On balance, we will be better off with encryption, because there are more good things that come with it than bad. But the bad things are something we can either choose to prepare for, or pretend they do not exist. I choose to prepare.
While they can't actually get rid of encryption, they can definitely make it so that actually secure cryptography is not the default in the devices and sites that the majority of people use. If Chrome, Amazon, Android and iPhone don't support strong encryption (at rest and/or in transit), then in almost all situations encryption will be weak.
This is particularly problematic when there are network effects (e.g. messaging, email, standards acceptance), because anything that becomes the standard is an immediate target.
Encryption is therefore also an enabler of the rights to freedom of expression, information and opinion, and also has an impact on the rights to freedom of peaceful assembly, association and other human rights.
I haven't been following the whole encryption debate in detail recently, but it seems there's one point which hasn't been discussed much: What about encryption which works against users' freedom, things like DRM, "trusted" boot and the whole idea of mandatory forced trust, etc.?
On the one hand, I'm against government surveillance. On the other hand, I'm also against DRM and user-hostile locked-down systems. Government surveillance, breaking DRM, jailbreaking, rooting, etc. all rely on cracks, the "imperfect" nature of security in some form or another. That's why I think this is a particularly perplexing issue, and bluntly saying "encryption is good and we should have more of it" is not seeing the whole perspective.
>What about encryption which works against users' freedom, things like DRM, "trusted" boot and the whole idea of mandatory forced trust, etc.?
Isn't this a "freedom from/freedom to" distinction?[0]
People should be free to use encryption. People should be free to use software to break DRM. People should be free to create, use and sell hardware which limits the behavior of software. Using hardware to hide keys from software, for instance, is a common DRM use case, but it has applications in security and privacy as well (SGX or Apple's enclave, for example). People should be free not to use DRM as well.
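The "hide keys from software" pattern above can be sketched in a few lines. This is an illustrative model only (all names here are hypothetical), not how SGX or Apple's Secure Enclave actually work; real enclaves enforce this boundary in silicon rather than with a Python class:

```python
# Illustrative sketch of the "keys hidden from software" pattern that
# hardware enclaves implement. The key is generated inside the boundary
# and only operations on it are exposed, never the key material itself.
import hashlib
import hmac
import os


class Enclave:
    """Holds a secret key and exposes operations on it, not the key."""

    def __init__(self) -> None:
        self._key = os.urandom(32)  # generated inside; never exported

    def sign(self, message: bytes) -> bytes:
        # Software outside the boundary can request a MAC over a message...
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, tag: bytes) -> bool:
        # ...and can verify one, but there is deliberately no get_key().
        return hmac.compare_digest(self.sign(message), tag)


enclave = Enclave()
tag = enclave.sign(b"unlock request")
assert enclave.verify(b"unlock request", tag)
assert not enclave.verify(b"forged request", tag)
```

The same interface serves both directions discussed above: the hidden key can prove a device belongs to the user (security/privacy), or prove that content is licensed (DRM). The hardware is neutral; the policy behind the key is not.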
The question, it seems to me, is whether technology should be used to enhance or degrade a user's agency and security. The technology, including hardware which limits software behavior, can be used in both directions.
Things like DRM can be fixed from the user demand side. If there are enough products on the market, then users can simply choose not to use products with DRM.
On the other hand, with issues like "banned encryption", if the government has its way, there won't be any choice at all for users.
Whether or not in the real world there are enough choices with regards to DRM and why or why not is a question for another time.
What is left to say about this? Does anyone have any new insights?
I want encryption so I can perform online transactions safely. Or at least safe enough that the financial services I use online can ensure the movement of my money.
If I want encryption to communicate online, then it all comes down to whom you trust. Who do you trust? Certainly not any provider of encrypted messaging services, since we can't audit their code, can't guarantee there isn't a TLA (Three Letter Acronym) plant working there, can't guarantee it's not a TLA front, can't guarantee the service provider isn't subject to an NSL, etc.
I'm not sure I'm convinced this is a human rights issue. How would we enforce the right? Rights that aren't enforceable aren't much use. Negative Rights[2] and all that.
The UN's 'The Universal Declaration of Human Rights'[1] is a nice document, but its existence hasn't made those things real. Adding another article, or interpreting the right to encryption into an existing article, does not, and cannot, ensure that right.
What am I trying to say? Probably something like: if there are people who are inclined to, and have the power to, infringe your right to privacy, you're probably pretty screwed.
> What is left to say about this? Does anyone have any new insights?
There is one thing that I'm really, really wondering about. Are large cities like London and New York basically screwed?
Let me explain: One large city with 10-20 million people is obviously far more vulnerable to terrorism than 10 smaller cities of 1-2 million each. Obviously Amnesty is correct that encryption is good for freedom of speech, human rights etc. Also, encryption is necessary for business/banking in a modern world. But let's not kid ourselves, terrorists can also hide behind encryption.
Terrorism is nothing new, and there will probably be more of it in the future. I'm not a law expert, but my impression is that the countries in Europe with the worst laws for privacy are the UK and France. Probably because that is where you find the largest cities: London and Paris. These people are scared/worried. I can see no other reason why they would enact laws like these.
So in the future, when there are more terrorism attacks, knee-jerk politicians (and their voters!) will probably want even more laws restricting encryption. This will make these places even worse for both human rights and business.
The industrial revolution basically created mega-cities. Is modern computer technology making them impractical?
I see your point. Much of this vulnerability is psychological, though. If you look at how many people have died from terrorism in the past, it's not many, at least compared to things like cancer, heart disease or traffic accidents. Still, I remember how most Americans were more than happy to give up their freedoms and human rights after September 11, in order to fight terrorism. Never mind that those freedoms and human rights were probably much of the reason why the US became great (compare it to say Argentina or Russia). Ironically, I think in order to preserve freedom during periods of terrorism, we would have to do away with democracy, since most voters are such morons.
Cool. I do think we should stay on offense about this.
The DOJ has said they won't let up seeking legislative and courtroom power to demand warranted access to encrypted data.
The problem with this is that as a society, we are not all focused on the right ways to keep each other safe. Of course, we as technologists know that no such law is enforceable given the existence of free and open source software. But the rest of society doesn't get that, and some portion of them are simply taking our word for it. Given another terrorist attack in which encrypted communications are somehow shown to be a factor, some of the public could swing the other way.
I believe it is our civic duty as technologists to educate each other about this issue in a respectful manner. We must not assume there is some nefarious government position. That hurts our ability to convince those people who do trust the government. And those are precisely the folks we want to convince. We're on the winning side now, and we should ride the wave as long as it is carrying us towards greater public understanding of encryption and technology. We'll be safer on balance and have a better IT industry if we do so.
When everything becomes a human right, you cheapen the entire concept. So many times I read in the news "this latest issue is a human right, and that thing, etc, etc".
Let's keep human rights to the basics.
If your goal is privacy as a human right, then make that the human right, and stop there.
Not the tool you use to accomplish it! Next a computer will become a human right, and then electricity, and a keyboard. Where does it end?
Encryption is a tool, not a goal in and of itself.
Yes and that's how Amnesty references it, as an enabler of other human rights:
Encryption is therefore also an enabler of the rights to freedom of expression, information and opinion, and also has an impact on the rights to freedom of peaceful assembly, association and other human rights.
Internet access (the Right to Internet) is also a tool, yet it is highly linked to other fundamental human rights(1) (the right to free speech, the right to development, the right to freedom of assembly), so encryption, imho, should be treated in the same manner, as an enabler of the right to privacy. What good are human rights if the tools to ensure them are banned?
Read it again. Encryption is a human rights ISSUE; it's not being pushed as a human right in itself. Encryption affects your human rights to privacy and free expression.