FTA: "Instead, the Post reports, the “administration will continue trying to persuade companies that have moved to encrypt their customers’ data to create a way for the government to still peer into people’s data when needed for criminal or terrorism investigations.”
While eschewing attempts to legislatively mandate that tech companies build backdoors into their services, the president is continuing the status quo – that is, informally pressuring companies to give the government access to unencrypted data."
Status quo is that the government doesn't respect the privacy of its citizens.
Agreed, but it's still a massive improvement over mandating that all encryption used by citizens be thoroughly crippled. The status quo should be that the degree to which Government can intrude into citizens' privacy is limited by law and subject to open public debate. We're not quite there yet, either in the US or here in the UK, but dropping the idea of crippled encryption is at least a step back from the brink, even if it's not enough steps back for my liking. It's at least an implicit acknowledgement that the EFF/privacy hippies aren't completely wrong.
I think threatening to ban all secure communication, a psychotic and unrealistic policy position, and having important "serious" people like the director of the FBI push for it isn't really intended to get it codified into law, but rather to elicit responses like yours. It's designed to massively push the Overton window against privacy so that we can feel okay about how bad things currently are: at least the insane, unrealistic threats that were thrown around weren't followed through.
No, unfortunately I think our 'security' establishment really are so divorced from reality that they think crippling encryption is a good idea. It would be nice to think that they're actually technically competent and it's all a ploy, but the truth is depressingly mundane. They handed the Chinese government and other authoritarian states cover, on a platter, for demanding back doors in encrypted products, and sold out our privacy, without any clue how utterly stupid and self-defeating an idea it is.
My thoughts mirror @wfo's, but more simplistically. It is a negotiation tactic. When seller A wants to sell something for $20, it is initially offered for $30. When buyer B's $25 counter offer is accepted, B is pleased with the result because they've purchased the item under asking price, while A is pleased for having sold at a premium.
I don't understand why respect would be part of this, or loyalty, patriotism, or whatever. Do you really think the best strategy to defend against an adversary is to shame them into behaving?
This is about what's cryptographically possible and what isn't. I don't frankly care if anyone respects my privacy, what I want is the ability to shut adversaries out, period, using math.
This decision means companies can say 'no'. That's a win, because it lets the math be possible in the first place.
Well, being realistic, we have three avenues out of our totalitarian mess we find ourselves deeply in:
1. Political (use money, shame, votes, and threats, praying that these things still work to effect change)
2. Technological (use encryption, praying technological solutions are implemented evenly and properly and that there are no back doors)
3. Violence
1 and 2 are the workable strategies, and they only work when practiced together. Currently we are seeing the government use #1 to try to trample on #2, frequently using #3. If we can carve out a bit more respect for #2 using #1, it'll make the rest of the journey back to liberal democracy easier. Political changes happen through shifted mindsets.
I don't think shame is an effective tactic, because it has no real "stick" component. People will always do what they can.
The only way to stop someone from doing something is to make them unable to do it. That's not solved through shame or appeal to emotion, because the people doing the bad things aren't going to express emotions.
Keep trying to shame them, and you'll simply fail. We need to build something they can't break.
We've agreed as a country that the ability to get a warrant in the case of probable cause is a good thing. How is this not just maintaining that?
I find it hilarious how often privacy advocates manage to forget that we've had this conversation before, and while the government can and has overstepped in many ways, let's not throw out the ability to investigate at all along the way.
Because breaking encryption breaks it for everyone. If the cops can get your data, so can a hacker. There is simply no way to compromise. You either are encrypted, in which case nobody but you can decrypt your data, or you're not encrypted at all.
Remember the "TSA locks" we're all required to use on our luggage at the airport? Nobody but the government was supposed to be able to unlock them. But now anyone who wants to rifle through your luggage can get universal keys for all TSA locks. What happens when that same scenario plays out with your bank account, your company emails, or any online store you've made purchases from?
I'm not arguing to break encryption. And neither is the White House. They want ways around this stuff: ways that don't break encryption, but that give them the ability to get to the information, if it's available, when necessary.
Backdoors don't work because, yeah, they break the whole system. But not everything these companies hold is encrypted anyway; plenty of it is just plain, unencrypted data.
Perhaps I am not versed well enough in the subject, but I have a hard time envisioning any kind of system that allows government officials to get around the encryption and peer into the contents but not hackers.
There will always be some sort of secret only the government has access to, and once that secret is leaked it's game over.
You don't give the government access to their own door. The company simply retains the right to access things, you know, the same way we have it now.
All it's arguing is to say "You don't have to encrypt literally every part of your system and delete the rest," a la what Snapchat suggests they're doing (though we don't have proof).
The White House is pressuring companies to not implement end to end encryption (see, for example, iMessage). Anything short of end to end encryption is considered broken by many people for communications between two parties.
The only way the government can get the equivalent of a wire tap is if there is no end to end encryption. What police do not want is to need to go back to pre-telephone detective work where they need to determine the location at which the parties will communicate (with modern communications there are of course at least two locations) and compromise that location to spy on the supposed criminals.
What if there was a way to give only the US government access to the data? For example, somehow have a shared encryption scheme where the user holds a decryption key to decrypt her data, but the government and the company can somehow use their keys in conjunction to decrypt the data as well. (This means company must be complicit in releasing the data, so they have the opportunity to verify/fight the warrant.) If someone came up with such a secure encryption scheme, would that be a palatable approach?
I ask this because I'm curious whether you (and people in general) believe that hackability is the only obstacle to giving government access to personal data (with a warrant, of course), or if people would still be uncomfortable with such a system.
Personally, I think such a scheme would be a good compromise, but I'm not a crypto expert so I don't know if it's in any way feasible.
I think the general opinion from the 'cryptowars' in the 90s is that there is no such scheme that doesn't cripple the actual security of encryption.
Key escrow came up back then though I don't think it included needing multiple keys in order to decrypt - not sure if that's mathematically possible or if there's another reason that never came up. I suppose to the government that's not much different than still needing to compel individuals for the key.
Your point about companies is interesting, since things like iMessage could be MITMed currently anyway to get unencrypted content for government requests. I think most people don't necessarily trust company behavior here, though, and given the government's recently revealed behavior with National Security Letters and secret, massive, unwarranted data collection, I don't think they should get a pass. I'd rather err on the side of the power structure being unable to collect some of the information, even if that means they're unable to investigate.
I think fundamentally introducing multiple ways to get keys takes a secure system and makes it insecure - the access no longer rests on one individual's knowledge. In general I think that's a bad idea - it also doesn't prevent people who want to actually encrypt their information from doing so (it just harms the regular public and dumber criminals).
> needing multiple keys in order to decrypt - not sure if that's mathematically possible
As a matter of interest, Shamir's Secret Sharing algorithm (https://en.wikipedia.org/wiki/Shamir's_Secret_Sharing) is quite nice in that respect. The "secret" would be the decryption key. Using Shamir's Secret Sharing you could split the key into two shares, so you would need both parts in order to work out the decryption key. The government could have one part. The company could have the other. Only when both were combined could anyone work out the decryption key.
This would mean neither the company nor the government could decrypt messages until they work together - and therefore reduce the risk of hackers and rogue employees. It wouldn't break the encryption with dangerous backdoors but would allow the authorities to quickly gain access when needed for a specific investigation.
I doubt anyone would ever implement anything like that since it would be (slightly more) complex and you would have to trust that the decryption parts are valid. But I thought I would just mention it in case anyone else might find that algorithm as interesting as I do.
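For the curious, here's a minimal Python sketch of that 2-of-2 split. This is a toy, not production crypto: no constant-time arithmetic, and encoding the decryption key as an integer is just an assumption for illustration.

```python
# Minimal 2-of-2 Shamir's Secret Sharing sketch (illustrative only).
# The "secret" is a symmetric decryption key encoded as an integer < P.
import secrets

P = 2**521 - 1  # a Mersenne prime comfortably larger than a 256-bit key

def split(secret: int):
    """Split `secret` into two shares; both are needed to reconstruct."""
    a1 = secrets.randbelow(P)            # random coefficient
    f = lambda x: (secret + a1 * x) % P  # degree-1 polynomial, f(0) = secret
    return (1, f(1)), (2, f(2))          # e.g. one share for the company, one for the government

def combine(share_a, share_b) -> int:
    """Lagrange interpolation at x = 0 recovers f(0) = secret."""
    (x1, y1), (x2, y2) = share_a, share_b
    l1 = (-x2) * pow(x1 - x2, -1, P)     # Lagrange basis values at x = 0
    l2 = (-x1) * pow(x2 - x1, -1, P)
    return (y1 * l1 + y2 * l2) % P

if __name__ == "__main__":
    key = secrets.randbits(256)          # pretend this is the decryption key
    company_share, government_share = split(key)
    assert combine(company_share, government_share) == key
    print("neither share alone reveals anything; together they recover the key")
```

Either share on its own is statistically independent of the key, which is the property that makes the "both parties must cooperate" argument above work.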
That's a cool algorithm, and while it does reduce the risk of a hack succeeding, I think it still could be considered a 'backdoor'. It's just a backdoor that requires two people to work together to open: harder to compromise, yes, but I'd argue still not really secure.
If you have a government entity routinely breaking into companies and taking over access to things (targeting sysadmins) then the scheme doesn't work very well. I think most of the argument behind encryption falls on this idea that it either is secure (no backdoors) or it isn't. You have a spectrum of insecure things you can do that are arguably better than nothing, but they're not secure.
I've always been curious: what's the rationale for opening luggage? If there is an issue, shouldn't the person be summoned before the plane is loaded and boarded? And if they think it's explosive, shouldn't they NOT TOUCH IT?
Not just bombs but every other type of contraband. Guns and drugs are the ones you commonly think of, but animals and plants can be a bigger problem, especially at the international level.
They're investigating us, who are not suspected of any crime. Let's throw out their ability to investigate.
Traditional policework goes a very, very long way, and it's time to admit that the terrorism angle is too overblown and intentionally abused to take seriously anymore.
Ultra-narrowly targeted, ultra-limited-duration, warrant-required wiretaps which are disclosed as evidence in full during court trial are what is acceptable when exhaustion of traditional methods hasn't closed the case. That is the standard we should work from.
But wiretaps over SSL are worthless unless you want to spend a year letting the computer run, which isn't possible thanks to that provision in the law requiring a speedy trial.
I'd say: first throw out the ability for NSA to spy without letting the persons concerned speak about it. Then we can talk about 'warrants in the case of probable cause'. US as a country has lost a lot of goodwill and trust.
And don't forget:
- number of deaths from acts of domestic terrorism: 36
- number of deaths–homicide, suicide, and accidental–caused by firearms: 316,545
You think this stuff is just anti-terrorism? The whole reason I even MENTIONED the idea of a warrant is specifically for normal policework. It's the same as searching someone's house after a murder's taken place; just now, instead of reading their files, it's all in a secure place on their computer.
Having a reasonable way to read that stuff (WITHOUT USING A BACKDOOR; I don't want weaker crypto just for this, but businesses hosting this stuff always have their proprietary ways of handling things) is just a continuation of previous law.
Why is it considered different from a physical safe? The government already has the ability and laws to compel citizens to provide access, why should it be any different with encryption?
It shouldn't be any different. In fact there have been court cases in which individuals have been compelled to provide decryption keys for encrypted disks.
The difference here is that the government wants vastly more access in the digital world than they have in the analog world. It's like asking all safe makers to include a secret combination, unknown to the safe's owner, that can also be used to unlock the safe.
Crypto is all or nothing. You can't have "backdoors" into crypto algorithms that are secret only to the government. It is literally impossible. Someone will find those backdoors, and now all information that has been encrypted is wide open. So you are either advocating for the complete destruction of ecommerce and online privacy, or you are advocating for unfettered access to cryptography software. Which do you choose?
> Crypto is all or nothing. You can't have "backdoors" into crypto algorithms that are secret only to the government. It is literally impossible.
You don't backdoor the algorithm. You backdoor the use of the algorithm--except it is more of a front door than a back door. For instance, when the device encrypts data with a symmetric cipher, encrypt a copy of the symmetric key via a public key cipher using the FBI's public key.
Variant: split the symmetric cipher key into N shares using a secret sharing algorithm, and save each share encrypted with a different public key. Include public keys from major law enforcement agencies (FBI, Interpol), and major civil rights or human rights organizations (ACLU, EFF). Design the secret sharing so that to recover the symmetric key, you need key shares from two major law enforcement agencies and from two of the civil rights organizations.
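To make the non-split "front door" version concrete, here's a rough Python sketch using the pyca/cryptography library: the device encrypts with a fresh AES key and stores a copy of that key wrapped under an escrow RSA public key. The function and variable names are made up for illustration, and everything that makes this hard in practice (key management, thresholds, revocation, audit) is omitted.

```python
# Sketch of the "front door" escrow idea: the device encrypts data with a
# symmetric key, then stores a copy of that key wrapped with an escrow
# agency's RSA public key. Illustrative only.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Stand-in for the agency's key pair; in reality only the public half
# would ever be on the device.
escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_public = escrow_private.public_key()

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def encrypt_with_escrow(plaintext: bytes):
    data_key = AESGCM.generate_key(bit_length=256)       # per-device/per-file key
    nonce = os.urandom(12)
    ciphertext = AESGCM(data_key).encrypt(nonce, plaintext, None)
    wrapped_key = escrow_public.encrypt(data_key, OAEP)   # the "front door" copy
    return nonce, ciphertext, wrapped_key

def escrow_decrypt(nonce, ciphertext, wrapped_key):
    # What the escrow holder could do after obtaining the device's stored data.
    data_key = escrow_private.decrypt(wrapped_key, OAEP)
    return AESGCM(data_key).decrypt(nonce, ciphertext, None)

nonce, ct, wrapped = encrypt_with_escrow(b"device contents")
assert escrow_decrypt(nonce, ct, wrapped) == b"device contents"
```

The secret-sharing variant described above would replace the single `wrapped_key` with several shares, each wrapped under a different organization's public key, so no single key holder could decrypt alone.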
Note: That kind of leak is extremely rare, so if that turns out to be the biggest weakness in this system then I think it has achieved my goal of showing that "completely secure" and "wide open" are not the only options.
If we are using the secret sharing variant, then it means that, during the time between the leak and the time the device manufacturers push out an update to replace that key and re-encrypt the appropriate shares with the new key, someone who seizes a phone and wants to decrypt it only needs the approval of one major police agency and two civil rights organizations instead of two of each.
If we are not using that variation, then it means that until a new key goes out in an update anyone who seizes your phone (and who has the leaked key) can decrypt it. One way to protect against this would be for the device to have a command that erases the saved encrypted symmetric keys. This command would require that you show it that you have a copy of the corresponding private key before it erases the encrypted keys.
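A sketch of what that erase command could look like, assuming a simple challenge-response proof of possession. Ed25519 is chosen arbitrarily here, and the `Device` class and its names are purely illustrative:

```python
# The device only honors an "erase the escrowed key copies" request if the
# requester proves possession of the escrow private key by signing a fresh
# random challenge. Purely illustrative.
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

escrow_private = Ed25519PrivateKey.generate()      # held by whoever the key leaked to
escrow_public = escrow_private.public_key()        # baked into the device firmware

class Device:
    def __init__(self):
        self.escrowed_key_copies = [b"wrapped-key-1", b"wrapped-key-2"]
        self._challenge = None

    def request_erase(self) -> bytes:
        self._challenge = os.urandom(32)           # fresh nonce, prevents replay
        return self._challenge

    def confirm_erase(self, signature: bytes) -> bool:
        if self._challenge is None:
            return False
        try:
            escrow_public.verify(signature, self._challenge)
        except InvalidSignature:
            return False
        self.escrowed_key_copies.clear()           # proof accepted: drop the copies
        return True

device = Device()
challenge = device.request_erase()
assert device.confirm_erase(escrow_private.sign(challenge))
assert device.escrowed_key_copies == []
```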
Not all that rare (see: the OPM leak, or any of the other leaks that happen daily), especially with such an incredibly high-value target. If it's a symmetric key style system, then wouldn't all previously encrypted communications be vulnerable if they've been recorded someplace? Email, ecommerce, server logs, everything.
Leaks in general aren't rare. Leaks of high value private keys are very rare.
Offhand the only ones I remember are D-Link's leak of a code signing key this year and AMI's leak of a BIOS signing key a couple years ago. I've probably forgotten some.
Keys of this value are generally not stored online, and are often split using a secret sharing system among several people.
It's not hard to set up a system where all of the decryption of the encrypted symmetric keys of seized phones takes place on a computer that has no network connection and no interface to the outside world other than CD-RW discs [1].
Of course, just because there are ways to manage the private key and its use in such a system that are arbitrarily safe, that doesn't mean the FBI would actually handle its key properly. If they do lose control of it, it would be bad.
My point, though, was just to illustrate that it is not the case that providing warrant access requires putting in a backdoor that completely opens up the system to anyone who knows about the backdoor and then hoping to keep knowledge of the backdoor hidden.
Also note I'm only trying to design this for data stored on phones and tablets, not for communication systems or servers.
[1] Thumb drives would be more convenient, but they contain electronics and interface via a port that is very hackable.
RSA had to recall its SecurID tokens after Chinese hackers stole its keys, cloned tokens, and used them to break into defense contractors. So: it does happen.
Are you also going to make all other forms of encryption illegal? If not, I'll just pre-encrypt the message with an actually secure mechanism before sending it through the backdoored system.
I think what you're not understanding is that computers are literal extensions of brains and minds. We have outsourced our memories and our thought to these devices, and we must have the same assurances over them as we would over ourselves. This will become only more clear as the line between a biological memory storage device and a digital memory storage device blurs into nonexistence.
And that is also a conversation we've had before. There is no warrant you can get to force a suspect to testify against himself.
Though, it's cute that you think these "conversations" are meaningful. If the American government thinks you're too interesting to not know more about, they'll black-bag you and ship you off to a secret CIA dungeon where you'll be tortured for the rest of your life. Even if you end up being held without charge in a cushy place like Guantanamo, you'll never be released because you were held without charge and that makes you a terrorist.
I find it hilarious how surveillance advocates manage to forget that your government does not respect rule of law, it rules by law. It is the law.
Sure, but there are many avenues of investigation that were previously available to law enforcement and still are. The question in my mind is does having effectively unbreakable ways to store data do more harm than good? I think it does more good. There are still tons of options available to law enforcement that worked in the past to catch criminals. The sheer quantity of data that is stored about each one of us opens up a lot of potential for harm (which has been massively abused). What we're talking about is simply a return to the state where if you want to gather data about a person you have to mostly start at the point when they become a real suspect not before.
You also can't get a warrant to peer into the other unbreakable data store - a suspect's mind. If there were such a technology would it be ok for it to be warrantable or should some things just actually be private?
The ability to wiretap was a side effect of the POTS. But a side effect doesn't need to be a required feature of future technology. There's a good argument to be made that there's more rights in our constitution for protected and private speech by individuals than there are for the government to have access to it.
Many won't say no, because the advantage of having a government willing to look out for their bottom line is far too large to ignore for the sake of a few users.
I agree. If companies aren't compelled by law to collaborate with the government, this will incentivise the government to find other ways to persuade them.
Many (most) don't encrypt anything anyway and will just comply with warrants and requests. That's not really the point.
The point here is that where we have device and service providers like Apple and Google and whoever-just-bought-LastPass where we do trust them to encrypt our info, the government won't be making rules that compel them to undo that work.
If you didn't trust Google and Apple to begin with, the government compulsion would be meaningless anyway.
It's probably a bad idea to pass policy against passing future policy... That applies both to 'informal deals' and leaving the door open for future legislation.
Really the only reason to pass policy against policy in this case is to score political points with the Hacker News interest group, which is a pretty niche constituency...
* The FBI-NSA-etc. axis already tried to ban nonescrowed crypto. This was in 1997, when far fewer products relied on it, far fewer people used the Internet, and far fewer groups mobilized to oppose it. If even that effort failed, this one was likely to fail as well.
* Excerpt from that 1997 proposal, which was actually approved(!) by a House of Representatives committee: "It shall be unlawful for any person to manufacture for distribution, distribute, or import encryption products intended for sale or use in the United States, unless that product..." http://thomas.loc.gov/cgi-bin/cpquery/T?&report=hr108p4&dbna...
* I disclosed in 2012 that the FBI had drafted a proposed law to require backdoors; that legislation was never introduced, even as a placeholder. My 2012 article:
http://www.cnet.com/news/fbi-we-need-wiretap-ready-web-sites... Of course the FBI's bill could be kept in reserve to become Patriot Act 2.0, just like the FBI-NSA-etc. axis had EPPSCA in reserve, which morphed into Patriot Act 1.0 a month after the 9/11 attacks, as I wrote about here: http://www.cnet.com/news/how-bin-laden-and-911-attacks-shape...
* If the legislative approach is now off the table, as the WashPost piece indicates, look for the FBI-NSA-etc. axis to try more creative approaches. "Oh, you want that $2 billion government contract? You want your new device to be FIPS 140-2 certified? How about that merger or FTC antitrust review? Environmental reviews? Trade? Taxes? It sure would be a shame if things didn't go your way. Maybe you can help us and we'll help you..."
This is why it's worth supporting groups like EFF (I donated last year and need to again before the end of this year). They provide a moral argument that counters that of the Washington establishment, and also provide guidance for tech firms when they're faced with challenges like those above.
How can we know that NSLs aren't being used right now to "persuade" or coerce companies to cooperate? Who needs legislation when you effectively don't have to reveal anything?
I am surprised the petition has yet to reach 100k signatures.
Is it optimistic to think that most of the tech community at least should recognize the significance of this? Even if you doubt its effect on real policy.
I'm not. Who has time to fully comprehend (and act upon) this? Especially when we're already working >50h a week to maintain household status quo. It's hard to focus on the state of society's infrastructure when I'm spending every waking moment working to maintain my own. Yes, I know what is at stake, but one avenue is a long burn and the other is a short fuse. I'll tackle the more pressing issue first.
Disclosure: I work in the netsec industry and only signed this because HN brought it to my attention in the rare spare moments between my daily tasks. To use an analogy, I feel like I can't worry about putting out the forest fire if my house is already on fire in the midst of it. At the same time, I'm throwing money at someone that says they'll help me free up more time to do the forest fire fighting. We'll see if I've made a grave mistake in how I prioritize things.
This is my reaction whenever I hear someone suggest that "we should be rioting in the streets." I have daily personal struggles and work that I'm trying to pin down every day with all my energy. And my remaining time/energy go into finding a mate if I even have any at all.
So it turns out the real way to prevent riots is to make sure everyone is running in the rat race. Or in other words, jobs that only meet ~99% of their needs.
Wouldn't it be good to have hope, and expect change? Our government, our country, our people can change, for the better, or for the worse. I'm sad that you think the status quo can't be changed. It can and should be.
Because the drug legalizers first disobeyed, then circumvented, and finally ignored the law, and the LGBT groups cheerfully use personal intimidation and group shaming rather than wasting their time putting their faith in politicians?
The status quo can be improved but it's a process of beating the system and the people in it until they stop resisting, not choosing better ones or convincing them you're in the right.
A lot of these policies continue because they're considered optimal/necessary by the layer of thinktanks and technocrats that feed Washington its data. I doubt any administration could change them, no matter what was said on their campaign trail.
The president can change many things. They choose not to because there is a cost. Of course, there are checks and balances with legislators and the courts, but the executive branch has tremendous leeway in executive policy and action. Obama could order the NSA to stop tapping domestic fiber and they would. He chooses not to because he doesn't want the Washington insiders to attack him. This is our country, not the career politicians'. It's not a cliche, it's true. Those of us who are voting citizens should realize that politics need not be a Hegelian compromise, elect people that effect positive change, and punish and vote out those that don't.
The other day I heard someone say (and I'm not exaggerating or excluding key context here), "I think Obama is similar to Hitler. He's changing our Constitution and taking away our freedoms." I couldn't begin to list the things Obama said Bush had overreached on and would end, that he later renewed in one form or another.
Meta: We might have more respect -- in limited measure -- for the government's position if they actually cleaned up their own malfeasance, instead of persistently, predictably trying to bury it.
> the administration will continue trying to persuade companies that have moved to encrypt their customers’ data to create a way for the government to still peer into people’s data
If you are one of these companies that are informally cooperating with the government on this, please state so publicly, in the signup process and by message to current users, so that we can avoid using your services now, or at least when it's discovered later if you weren't honest about it up front.
Well put. We tried to take a very privacy-protective stand when building Recent News (https://recent.io) because of the amount of personalization we do. This warrant canary is part of our privacy policy:
As of [date], we have not received any legal process or demand from any federal, state, or local government that includes a gag order. We have received no National Security Letters, civil subpoenas, search warrants, Foreign Intelligence Surveillance Act orders, grand jury subpoenas, or any other form of compulsory process accompanied by a gag order.
As of [date], we have received no legal orders requiring us to monitor users' future activities or to modify our service.
If we do receive any form of compulsory process from any government entity, we will do our best to ensure that our users' legal rights and privacy rights under the Fourth Amendment to the U.S. Constitution are protected. That includes challenging overly broad orders in court.
It is still valid, I'm happy to say, for [date] values of today.
IANAL, but the judge may order you to keep the warrant canary or find you in contempt. Law is based on intent, and if your intent in removing that clause is to broadcast that you are under a gag order, when the gag order restricts you from doing exactly that, you may find that the removal will be taken as a breach.
This should not be taken as legal advice, YMMV, yadda yadda.
Why does the Apple one get special kudos? They just say they "don't allow government access to their SERVERS" - they never say anything about the data.