This fake crisis of "Going Dark" -- as if we hadn't been dark for all of time before the Internet -- is dangerous. The "scary" notion that the police won't be able to read everything about everyone is being recast by the Feds as if it really were a national crisis. Benjamin Wittes suggests on Lawfare that we make common carrier immunity conditional on a company being able to make all its data available in the clear to the government. This is dangerous ground.
While I oppose things like encryption backdoors, I think it's disingenuous to say this is a "fake crisis." The 4th amendment has always required balancing security and privacy--that's why the distinction between "unreasonable searches" and reasonable ones appears right there in the text. And society has always balanced those two interests with a simple mechanism: the police can only search with a warrant, but once they have a warrant their power to search is nearly limitless.
Thus, the government has always had a way to search the long-distance communications mechanisms of its day. Wiretapping telegraph or telephone lines was not deemed a violation of the 4th amendment until 1967. Even after that, a wiretap could still be authorized with a warrant.
Now, you have a pervasive mechanism of long-distance communications, and it's increasingly opaque to the government, even with a warrant. That's an unprecedented state of affairs.
What's unprecedented to an even greater degree is the scale at which mundane communications are now recorded. In the past, the bulk of private communications were opaque to the government, even with a warrant, because people exchanged private information mostly verbally, face to face.
Today many analogous conversations take place online, in instant messages, in email, and in other persistent media. Even when the content of a communication isn't recorded as an artifact of its medium of expression and transmission, there's almost always a record left behind to indicate that the communication took place, revealing associations and hinting at what was said.
So by emphasizing the legal history of searches of "long-distance communications", you've moved the goalposts by a mile. Changes in the way we communicate with one another, reflecting changes in communications technology (and changes in society), mean that governmental powers in searches and surveillance, powers which formerly applied only to relatively rare forms of communication, now apply very broadly. At the same time the government's cost of performing such searches and surveillance has plummeted, multiplying the power of police and spies to monitor our words and use them against us.
And maybe in such a way that creates the worst of both worlds. The sufficiently informed and conscientious covert actor can find some way to keep much of their communications obscured. This utterly freaks out state apparatuses that consider legibility a key goal. So they work harder to more broadly collect signals AND chip away at obscuring methods. Would-be coverts escalate. So does the state.
End result? It could well be that the motivated/educated can still keep enough of their secrets hidden, but communications for the mass of people who lack the reasons/resources to participate in the arms race end up largely transparent. And, well, even if the resulting panopticon falls short of realizing its original purpose, it's still bound to find a certain amount of utility one way or another -- and not necessarily a benign one, at least not without a high degree of accountability.
Prior to the Internet, cipher schemes existed that could delay or foil the best government analysts, and that took significant government effort even to attempt to crack. It was never illegal to use such a code and transmit the result in a letter, on a postcard, or over the phone. And a government warrant could not compel the decipherment of such a scheme, unless someone had possession of a physical key that could be seized.
The two differences now: we've made that technology available to non-experts, and we hope that no amount of government effort can crack the scheme (as opposed to obtaining the key).
I agree with your comment that it's not a "fake crisis", though. It's one with only one right answer, but it's a "crisis" in the sense that no possible path forward will make both parties happy, so we fundamentally need the government to either realize they're wrong or to lose. Government positions don't change easily, and governments do not like to lose.
While it bothers me to see terms like "common ground" used, as they imply that both positions have grounds worth considering, I do think one of the few paths that has a hope of success is to convince the government that there exists a position they can adopt that doesn't look like it goes back on their current stance.
It's mostly not about banning encryption though. The debate mostly centers on regulating the products and services companies can provide to facilitate such communications. If there had been a precursor -- companies promising to make it easy for your regular Joe Criminal to encrypt his postal mail -- it's conceivable that the government would have cracked down on those companies.
> It's one with only one right answer, but it's a "crisis" in the sense that no possible path forward will make both parties happy, so we fundamentally need the government to either realize they're wrong or to lose. Government positions don't change easily, and governments do not like to lose.
It's not the "government" versus "the people." It's a small group of people who strongly support surveillance, a small group who strongly oppose it, and a mushy middle that tends to lean towards whatever makes them feel safe. People in each group are represented within government, though for obvious reasons people in the first group tend to gravitate toward positions involving national security or defense.
> It's mostly not about banning encryption though. The debate mostly centers on regulating the products and services companies can provide to facilitate such communications.
Banning effective encryption, or banning commercial encryption, is still effectively banning encryption, or forcing it underground and casting suspicion on it.
> It's not the "government" versus "the people." It's a small group of people who strongly support surveillance, a small group who strongly oppose it, and a mushy middle that tends to lean towards whatever makes them feel safe. People in each group are represented within government, though for obvious reasons people in the first group tend to gravitate toward positions involving national security or defense.
The people within government who oppose backdoors have yet to be very vocal, or effective. I'd certainly love to see a large outpouring of support from government, to counter the level of support for the pro-backdoor position.
Many people in the tech community don't want to get behind those in government who oppose back doors, because those officials are not absolutist in their rhetoric about privacy. They want to assert that there is not even a debate to be had between security and privacy, and that privacy should always win. But when the vast majority of the actual voting public is concerned just as much about safety as about privacy, if not more, that's not a tenable position for elected and appointed officials to take.
Framing the issue as security/safety versus privacy would accept the rhetoric that keeping encryption legal and non-broken would reduce security/safety. (Which aligns with the similar rhetoric about pervasive surveillance.)
The comments by the NSA director didn't seem to have any significant effect on the general government message. As for the White House position (which I had not seen the announcement of, so thank you for the link), they said they won't seek legislation, but they hardly need to at this point; those in Congress seem more than happy to keep proposing such legislation, and I've seen no suggestions that the White House would veto it if passed.
> The debate mostly centers on regulating the products and services companies can provide to facilitate such communications.
You're calling them products but the relevant thing they want to regulate is still more speech.
If you want to communicate with your friends in code then you first have to communicate the code itself. In this context the code is code, but code is speech.
Code is not speech any more than electronic circuits are speech. That is to say: in unique contexts where the code is itself a means of expression, code may be speech,[1] but not when it is used to build something that enables communication.
[1] Bricks can also be expression in unique contexts. That doesn't mean that bricks are speech.
A brick or electronic circuit is speech when being used as a medium of expression. Code is speech all the time because it can't be anything else. It's pure information. You can't email someone a brick.
It happens we have machines that will turn that information into action, but the code isn't the machine or the action. It's just a type of speech that machines can understand too.
People are always wanting to regulate speech by combining it with a machine, but the machine and the speech are separate. They don't have any specific relationship. Apple makes a) a general purpose computer and b) computer software. But (modulo DRM/copyright) you could run that software on any general purpose computer and use that general purpose computer to run any software.
It's like trying to regulate what information you can print in a newspaper by claiming you're regulating the printing press.
It is speech. If you wanted to, you could even go find the source code and translate it into English in such a way that a relatively competent programmer could turn it back into code ("If the first bit in the byte is 1 then do .... otherwise do ....").
But it's clearly also a tool. I've never read the source code to the software I use to encrypt my hard drive. It's unlikely that I ever will. I just care that it does the job I want it to.
Trying to say that it's either one or the other is silly. It's both.
But just because it's speech doesn't mean that the government might not have an interest in regulating it. The first amendment is not absolute. I can imagine a great many prima facie arguments supporting the idea of regulating encryption software. The fact that code is speech is not, in and of itself, a defense against any of them.
As with most cases of constitutional law, it comes down to weighing competing interests. Failing to acknowledge these varying interests fails to acknowledge the actual question at hand.
> Trying to say that it's either one or the other is silly. It's both.
I'm not trying to say that it's one or the other. I'm trying to say that there is no part of it that isn't speech. There is not a part which is a tool and a distinct part which is speech. The whole of it is speech. All you're saying is that it's possible to use pure speech as a tool. But what of it?
You can't win by talking about balancing because encryption software is meta. You can use it to distribute it. If people who are breaking no law have the right to be able to communicate without government surveillance then the government would have to violate that right universally to enforce any rule restricting the distribution of software, because distributing software over a secure channel is indistinguishable from any other communication of the same size. It's hard to imagine anything that could justify that level of intrusion, and certainly not anything that has been proposed as a countervailing interest in this context.
Code qua written expression is speech. The government can't ban, for example, the publication of encryption code. But that doesn't mean the government loses the ability to regulate the operation of a product just because the operation is implemented using code rather than electronic circuits.
As to your printing press hypothetical, I don't think it applies. Say the back door is something like "must keep the decryption key around in case a warrant comes in." That doesn't entail any modification to the "speech" coming out of the device, does it? So how does it restrict speech?
> But that doesn't mean the government loses the ability to regulate the operation of a product just because the operation is implemented using code rather than electronic circuits.
Maybe it helps to better define what you mean as the product whose operation is to be regulated. If it's the hardware, it's a general purpose computer that can run any software. We quickly go to a bad place if you can't buy such a thing, e.g. side-loading on Android is prohibited, Raspberry Pi and RISC-V are prohibited, every device must prevent you from compiling a custom program and running it, etc.
But if it's the software then you're banning the publication of encryption code.
Obviously the confusion stems from Apple being the go-to example and Apple not only providing both the hardware and the software but also actually enforcing the kind of restrictions on what software the user can run that would be unreasonable as a requirement imposed by the government on the entire market. Apple could [try to] prevent you from using encryption software on an iPhone, but it makes little sense to require only them to do that if anyone can still run it on an Android phone or PC. But the alternative is that nobody can buy anything capable of running Debian or OpenBSD (or even Windows).
Moreover, the operation of the product is the thing carried out by the owner, not the manufacturer. Apple makes a nice machine and a big detailed list of things you can do with it but the user is the one choosing which buttons to press.
> As to your printing press hypothetical, I don't think it applies. Say the back door is something like "must keep the decryption key around in case a warrant comes in." That doesn't entail any modification to the "speech" coming out of the device, does it? So how does it restrict speech?
So there are two questions here: One is, can I have the code that doesn't keep the encryption key? If not then it restricts the speech you can receive and that of other people who want to give you that code (perhaps so you can use it to communicate more sensitive information with them).
Then there is the key itself. Requiring you to keep the key (or send it to Apple or Uncle Sam) is compelled speech. The key is also information, and the key + ciphertext is equivalent to the plaintext. It's equivalent to a requirement that you keep the plaintext of all your communications.
It would effectively be compelled written testimony before the fact. Either you keep everything written down or go to jail for not having it.
> It's equivalent to a requirement that you keep the plaintext of all your communications.
> It would effectively be compelled written testimony before the fact. Either you keep everything written down or go to jail for not having it.
While I agree with you, that particular argument won't necessarily succeed, considering https://en.wikipedia.org/wiki/Sarbanes%E2%80%93Oxley_Act and similar laws regarding reporting and information retention policies. You don't want the government equating policies to protect user communication as equivalent to destruction of evidence.
I fail to understand this point. Are you saying that because you have to transfer the tool (encryption software) over a medium, the tool is now speech?
I am pretty...unsympathetic...to this interpretation.
We could also say that Americans have always had a right to encrypt their long-distance communications, and the government is now questioning this right for what may be the first time.
As Eben Moglen observed in a closely related context, Americans have also always had the right to use languages or shorthands of their choice when communicating, many of which significantly (perhaps even entirely) hindered the government from understanding them, sometimes by design.
"While I oppose things like encryption backdoors, I think it's disingenuous to say this is a "fake crisis." The 4th amendment has always required balancing security and privacy--that's why the distinction between "unreasonable searches" and reasonable ones appears right there in the text."
That would be mildly interesting - if this were a 4th amendment issue. It's not.
It is a first amendment issue. If I choose to communicate with you with a (seemingly) random stream of numbers, that is protected by the 1A of the Bill of Rights. Just like a KKK rally.[1] Just like burning a cross.[2] Just like Piss Christ.[3]
I often wonder whether this is, in fact, a 2nd amendment issue. Consider: until very recently encryption technologies were considered a banned export (armaments).
The government has always been able to spy on people, but before the internet, it was hard, so there was a motivation for them not to waste time and resources on it without real justification. (Not to say that surveillance never happened for stupid or malicious reasons, but there was at least a practical cap.)
In the age of the internet, that protection has been lost, and the government has taken advantage of the increased ease of surveillance to try to monitor everyone, all the time. Unfortunately, there's no way to put the genie back in the bottle--we either put up with ubiquitous surveillance, or fight it; either way, the era of limited, focused surveillance is over.
Warrants have never enabled truly limitless powers of search. Aerial searches, blood tests, etc. are new inventions, and nothing says that technology always favors the searcher. "Unprecedented" just means "get used to it."
Interesting. What limitations on search warrants are you thinking of when you say that? What are some examples of things investigators wanted to search, or perhaps could search pursuant to a warrant, that were later found to be so beyond the pale that no warrant could authorize them?
Interesting. Why do you think the question is what is beyond the pale? Interesting. Or maybe "condescending" is the word. Math, physics, and reality have always limited search. Try executing a search warrant on a satellite. We don't require satellites to come down out of orbit for a search warrant. Similarly, turning math into contraband would be equally stupid.
Unlike this comment, mine was written in good faith. I thought you might have had an example of a set of circumstances in which a judicial search warrant was deemed, perhaps by some higher court, to have been unconstitutional owing to what it tried to search.
That's not the issue. The issue is whether the majesty of government trumps reality. In reality, math obscures secrets perfectly. The choice is whether to make that math illegal.
This isn't a unique case. Sovereign power is potent, but it isn't unlimited in theory or practice, and it isn't unchanging. Genetic modification, cheap aviation, robots, 3d printing, cryptocurrencies, etc. challenge sovereign power and related stakeholders. For good and ill.
It is doubtful that genetic engineering can be meaningfully regulated. That probably has consequences greater than perfect secret-keeping. Governments will have to get over it.
The reason you don't hear about that as much is that authoritarians and control freaks don't obsess on it. Or don't know that maybe they should, because you can encode a lot of information in dna and smuggle it inside a tiny insect.
The facts Wittes gives are way off. He links to "civil suits against the companies who provide service to ISIS and other bad guys". The civil suit concerns a Jordanian police officer who shot and killed an American military contractor. Other than this lawsuit, almost nothing links the officer to ISIS. If you read the lawsuit, they really have nothing linking this officer to ISIS either, other than a vague statement from a supposedly official ISIS source.
So the link between this military contractor's death and ISIS is very tenuous. That has to be established first, and there is almost nothing to establish it on -- he was a Jordanian police officer, not an ISIS militant. Then you have to show that Twitter ran afoul of material-support-of-terrorism laws. Those laws generally contemplate something along the lines of Jack Dorsey storing up Kalashnikov rifles in some back office of Twitter, to be shipped to the Taliban for assaults on American bases.
In 1929, the Republican (!) Secretary of State rolled up the old World War I spying operations, saying "Gentlemen do not read each other's mail." How far the US has fallen in freedom since the day Stimson uttered those words nearly a century ago.
> In 1929, the Republican (!) Secretary of State rolled up the old World War I spying operations, saying "Gentlemen do not read each other's mail." How far the US has fallen in freedom since the day Stimson uttered those words nearly a century ago.
He also turned it all back on, up to 11, as Secretary of War once things heated up in Europe.
For pretty much any political issue, the people who care are a minority (on each side), with the middle populated by people who don't care or don't know what's going on.
Those people on the sides who do care then shape the debate and, ultimately, the outcome of pretty much any democratic decision (what few of those are still left).
Even look at public wifi. Most non-tech-savvy people I know really have no problem using public wifi to conduct business and log in to important accounts.
They want the path of least resistance, which isn't good for security or privacy.
Something learned in my 30-year career as a low-level felon, spending fairly significant amounts of time (to me at least... a week in jail is no picnic!) locked up in both jails and prisons...
95% of cases are made via informants and not CSI-type investigations, so the whole "going dark" thing, I feel, isn't going to affect law enforcement the way a lot of people think it might.
I did notice, however, my last time through the system last year, that the State is now making a bunch of cases using cell tower information to put defendants near the crime scene around the approximate time of the incident.
Going dark has a significant effect on law enforcement. It'll force them to rely on active investigations instead of passively collecting data that they can review at their leisure.
I think this will reduce their ability to target people for political reasons.
"isn't going to affect law enforcement" - at least the way it is now (and has been forever), but it greatly affects the way law enforcement would _like things to be_.
And I'm not at all happy with the implications of what "they" want...
My kneejerk reaction to this is to panic about the opposite of what the concern here seems to be. Seems to me like ubiquitous surveillance via any and all available technology is a bigger threat than "going dark" would be. I know encryption can provide an obstacle for law enforcement but my inclination is to worry about privacy in society over potential lawbreakers.
It seems to me that all this report is saying is: "Hey government, don't worry about tech 'going dark', we will still have the ability to spy on people through their poorly implemented Internet of Things devices, services that won't use end to end encryption, metadata, and because software is still fragmented."
But they don't seem to even slightly condemn the simple fact that governments are turning into surveillance machines...
The submitted Berklett "Don't Panic" article has a "Bruce Schneier" link to his "Security or Surveillance?" article on lawfareblog.com. It's also good reading.
If we could only make them understand that forcing the good guys to not encrypt doesn't take encryption away from the bad guys. Legislators don't understand that encryption doesn't have to be made by Apple for the bad guys to have encryption. A shared key and XOR gives you unbreakable encryption. A high school comp sci. kid could implement that.
Incorrect: if the key is reused, the encryption is breakable. If you're just doing one round of XOR, it's pretty easy to recover the key given a known plaintext, in the same way that AES leaks patterns in ECB mode.
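A minimal sketch of that known-plaintext recovery in Python (the key, messages, and the assumption that the attacker knows the key length are all hypothetical):

    def xor_with_key(data: bytes, key: bytes) -> bytes:
        # The weak scheme under discussion: a short key repeated across the message.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    key = b"secret"                                    # the shared key
    msg1 = b"attack at dawn, bring the usual gear"     # known to the attacker
    msg2 = b"the second message falls with the first"  # target of the attack

    ct1 = xor_with_key(msg1, key)
    ct2 = xor_with_key(msg2, key)

    # Ciphertext XOR known plaintext reveals the repeating key stream.
    recovered = bytes(c ^ p for c, p in zip(ct1, msg1))[:len(key)]
    assert recovered == key
    print(xor_with_key(ct2, recovered))  # decrypts msg2 without ever seeing `key`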
For unbreakable encryption, I'd suggest XORing with the contents of /dev/random (assuming /dev/random is unknowable). Of course decryption may be an issue.
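For contrast, here's roughly what that looks like (a sketch, using os.urandom as a stand-in for /dev/random): the pad must be as long as the message and used exactly once, and both parties need a copy of it, which is exactly the decryption issue.

    import os

    def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
        # A true one-time pad: random key as long as the message, never reused.
        pad = os.urandom(len(plaintext))
        return bytes(p ^ k for p, k in zip(plaintext, pad)), pad

    ct, pad = otp_encrypt(b"meet at the usual place")
    pt = bytes(c ^ k for c, k in zip(ct, pad))  # decryption is the same XOR
    assert pt == b"meet at the usual place"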
Some people when faced with an encryption problem think 'I know, I'll use a non-repeating random text to XOR against the plain text', now they have two encryption problems.
Wrong. It's just not secure in all contexts, e.g. your example, where a known plaintext lets you recover the key, which then lets you decrypt anything else encrypted with the same key. You're discovering why plain XOR isn't widely used in real-world settings.
To summarize: end-to-end encryption is making online surveillance harder. However, standards for end-to-end encryption are still fragmented and companies have incentives to not adopt it. Besides that, connected sensors, 'the internet of things' and unencrypted metadata give new surveillance possibilities.
"legitimate government interests" should be enough to scare everyone. It's not for criminals, just like gun control is not about criminals. Think the worst, 'cause it don't get any better!
Why do we need to agree? That's how this works. You compromise a bit, and they don't budge. Then you compromise again. Repeat the process until all liberties are gone.
Also, why is this framed as "going dark"? That's the ignorant people's wording. Why doesn't the author call it what it is: key escrow, or back-dooring? Use the terms of the industry you are talking about.
> If you want to beat it, brand it. Obamacare, The Surge, War on Terror.
The Surge and the War On Terror were branded by their supporters. Obamacare was branded that way by those trying to defeat it, but it hasn't worked. So, not sure your examples illustrate your point.
>The receptivity of the great masses is very limited, their intelligence is small, but their power of forgetting is enormous. In consequence of these facts, all effective propaganda must be limited to a very few points and must harp on these in slogans until the last member of the public understands what you want him to understand by your slogan. As soon as you sacrifice this slogan and try to be many-sided, the effect will piddle away, for the crowd can neither digest nor retain the material offered. In this way the result is weakened and in the end entirely cancelled out.
There's an underlying issue that the report dances around, but doesn't address directly: sovereignty. Understandably enough, US authorities expect US companies to comply with US law. When foreigners are involved, as users or counterparties, things get iffy. Diplomatic relationships, treaties, agreements, etc become dispositive.
For example, the US supports Chinese dissidents, and maybe Thai dissidents, but for sure not ISIS. And so decrypted ISIS messages would be widely shared, but decrypted messages from Chinese dissidents would not be shared with China.
Old-school sovereignty just doesn't work on the Internet. If the US pushes hard enough, some firms will fold. But some may just leave. Consider the extent to which Apple has already left the US, for tax purposes.
Regulating strong encryption out of consumer products is functionally the same as banning it. Using encryption would draw special attention. Encryption needs to be routine in order to protect the public, outside of special situations.
Imagine if it's WWII and they asked for all mail to be unsealed. Shocking, but it almost happened: as far as they got was asking military personnel to 'volunteer' not to seal their mail.
Who? Where? In the UK that would have meant pretty much every radio would have had to be given up. At least every radio I have seen from the period had LW, MW, and SW.
It's not just about tech. The real issue is that government is running up against scaling limits.
Arguably, a government can only rule over how people relate within boundaries it can defend and control. Previously the boundaries were physical geographies, and then regulated channels (mail, PSTN, etc). Now, we have a kind of fractal boundary of peer-to-peer connections that provide tremendous freedom to organize and transact on a diminishingly microscopic scale.
Sovereignty is zero sum.
Crypto provides a kind of micro-sovereignty to users, and for a few privileged or outlying people this is an acceptable risk, but when you have whole constituencies of people achieving that micro-sovereignty, it cuts into the sovereignty of the state at a critical level.
Imagine the strategic consequences for U.S. national security if Rhode Island became its own country, with an impenetrable laser air shield, with its own allies, currency, tax laws, extradition treaties, defense systems, resources, etc. It would be such a constant threat that it would make more sense to just invade.
Tor and similar systems could reach that critical mass, where they become a constant threat to the sovereignty of nations. Tech is naively forcing hard questions about the conventions that provide "stability."
The feds know they might just have to outlaw crypto. The technology exists to detect and round up most people who use it, or enough of them that it will be hard to find people to use it with. If they have to, they will.
This dance they are doing is political posturing, testing the edges to see what kind of resistance they get, and how much political capital it is going to cost.
Like voting and graffiti, if crypto really changed anything, it would be illegal.
https://www.lawfareblog.com/out-box-approach-going-dark-prob...