The government initially claimed that using Clipper would be
voluntary, that no one would be forced to use it instead of
other types of cryptography. But the public reaction against
the Clipper chip was strong, stronger than the government
anticipated. The computer industry monolithically proclaimed
its opposition to using Clipper. FBI director Louis Freeh
responded to a question in a press conference in 1994 by saying
that if Clipper failed to gain public support, and FBI wiretaps
were shut out by non-government-controlled cryptography, his
office would have no choice but to seek legislative
relief. Later, in the aftermath of the Oklahoma City tragedy,
Mr. Freeh testified before the Senate Judiciary Committee that
public availability of strong cryptography must be curtailed by
the government (although no one had suggested that cryptography
was used by the bombers).
Sounds a bit like some of the conversations going on again today. The last sentence in particular.
I'm trying to find a source. When I was first introduced to (well, investigated and made brief use of) PGP years ago, I read an article on its history and on Zimmermann. Terrorists were found to be using PGP in the 1990s. That they're using encryption is not new; the same discussion was going on back then as we're having today, with many of the same features.
I posted that link on another comment. But it addresses this one to some extent, he also mentions that the terrorist use of encryption was a significant issue in the 1990s. Still can't find the source for my original comment.
French intelligence has heavily infiltrated the regional extremist Muslim community and they had no idea about these attacks in advance. Ergo, the terrorists were using encryption.
That was the speculation I heard the other evening on NPR, by someone lobbying to put limits on private citizens' use of strong crypto.
Yes, French HUMINT has heavily infiltrated extremist organizations, and it failed to prevent this attack. That does not mean it was the use of encryption that allowed ISIS/extremist organizations to execute this attack on French soil.
The French secret services' approach is fundamentally different from the US one, as they have historically relied less on SIGINT.
I am not sure where I read it, but I presume the number of prevented attacks is in the thousands.
Does this claim really need to be verified? I mean, pretty much everybody who has Internet access uses crypto these days, whether they're aware of it or not. Even some phones come with full-disk encryption by default now.
I don't know. My impression (from the few articles I saw mentioning crypto / video games) was that it was total speculation (probably before most facts were known) from some "official" source.
Worth noting: "if X is outlawed, only outlaws will have X" usually overlooks how many otherwise law-abiding citizens will legislatively become outlaws because they won't give up their right to X. (Current example: a registration/prohibition law in New York has turned about a million otherwise exemplary residents into "outlaws" as the SAFE Act has a compliance rate of about 4%.) Rather than solving the problem, legislators alienate & criminalize much of the population.
My favorite has always been: "Outlaws do X, but X isn't against the law. If we make X against the law, it will make it harder for outlaws to do X", when, in reality, outlaws weren't doing that thing legally (read: traceably) in the first place.
I always thought that part was understood. My interpretation of it has always (or for as long as I can recall) been that the "outlaws" are both actual criminals (ill-intent) and de facto criminals (no ill-intent, but caught by the current legislation for doing or possessing something that used to be legal or ignored by the law).
The interesting thing in retrospect is that the pro-crypto faction seems to have won. These days anyone who actually needs crypto can get it and use it. The amount of useful intelligence that a government can get from passive monitoring is constantly decreasing. Entities like the NSA have to concentrate on things like meta-data, as they can be sure that most of the interesting content is unavailable to them. Signals intelligence is quickly becoming technologically obsolete. So the situation that the Clipper chip supporters feared back in the day has become real.
It would be really good if the NSAs of the world would just accept this and stop doing evil in their desperate attempt to survive.
The NSA also focused on meta-data because it was more likely to survive a legal challenge. A law enforcement observer can make note of you entering and leaving a building at certain times. Your entrance and exit is public. What they can't do, without warrants (legally, not speaking to technical ability), is observe what happens inside a private place or within (what's intended to be) private communications.
Meta-data also bypasses codes, as you point out, by revealing the network of communication (who-with-who, when, and how often). So whether the communication/interaction is recorded and understandable or not becomes less important. In the case that it is recorded and understandable, excellent, even more intel. If it's not, they still have some material to work with.
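As a toy illustration of why the traffic pattern alone is valuable, here is a short Python sketch (the call records and names are invented for this example): even with every payload encrypted, a handful of records already yields a weighted contact graph showing who talks to whom and how often.

```python
from collections import Counter

# Hypothetical call-detail records: (caller, callee, timestamp).
# The content of each call is unavailable, yet the meta-data alone
# exposes the communication network.
records = [
    ("alice", "bob",   "2015-11-01T09:00"),
    ("alice", "bob",   "2015-11-02T09:05"),
    ("bob",   "carol", "2015-11-02T21:30"),
    ("alice", "bob",   "2015-11-03T09:02"),
]

# Count contacts per undirected pair; frozenset ignores call direction.
edge_counts = Counter(frozenset((a, b)) for a, b, _ in records)

# The strongest link stands out immediately.
most_active = edge_counts.most_common(1)[0]
assert most_active == (frozenset({"alice", "bob"}), 3)
```

Real traffic-analysis systems obviously work at a vastly larger scale, but the principle is the same: frequency, timing, and the shape of the graph are intelligence in themselves.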
The focus on bulk collection of metadata is based on the assumption that it can be used to stop terrorists and other enemies. This assumption remains unproven. American drones routinely murder people abroad purely because of their metadata, and yet peace is nowhere nearer.
At home, it turns out that it is pretty much impossible to stop a handful of people who have access to weapons, want to do as much damage as possible, and don't fear death. Increasing the surveillance on regular citizens will not change that.
I think you are confusing military and civil intelligence, which is not terribly surprising, because they are increasingly being blurred, but it is important to keep in mind.
The NSA, at least on the margin, does not care about convictions. Their mandate has nothing to do with law enforcement.
You're right that metadata is often much more operationally important than the message. But the DEA-style parallel-construction corruption is a hobby business for the NSA.
I was using GroupWise in 2007 at a previous employer, and PGP was definitely easy to use with it. I worked in an MIS department for a large UK company, and the customer services department was in a small town in Wales, the kind of town where not many people had broadband in 2007. The staff were not trained in any technical specialty; it was mostly people with jobs rather than people with careers. They used to send banking information using PGP and GroupWise, and people rarely had issues.
The problem now is the increasing number of centralised services: Google doesn't want to store encrypted emails within Gmail, because the content cannot be analysed for advertising purposes. The same goes for other free email providers. It's still possible, but it is increasingly difficult.
Well, given the fact that Phil himself doesn't use PGP, I don't think I'd call it a 'conspiracy'. It just turns out that reliable, secure, identifiable crypto is hard to do...
That's why prz is doing Silent Circle now. VoIP crypto is actually easier, since you can rely on the fact that it's very difficult to convincingly forge someone's voice. Tie that to the crypto verification (via SAS) and it's easy for anyone to have a secure channel they're confident is actually secure.
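For readers unfamiliar with how an SAS works: both endpoints independently derive a short string from the key material they just negotiated, and the callers read it to each other over the voice channel. A man in the middle who substituted his own keys would produce a mismatching string, and convincingly forging the victim's voice on top of that is hard. A minimal sketch in Python follows; this is not the actual ZRTP derivation, and the function name, the label prefix, and the 4-digit format are invented for illustration.

```python
import hashlib

def short_auth_string(shared_secret: bytes, n_digits: int = 4) -> str:
    # Hash the negotiated shared secret (here just an example byte
    # string) and truncate to a short human-comparable value. Both
    # endpoints compute this independently and compare aloud.
    digest = hashlib.sha256(b"sas-label" + shared_secret).digest()
    value = int.from_bytes(digest[:4], "big") % (10 ** n_digits)
    return f"{value:0{n_digits}d}"

# Both sides derive the same SAS from the same shared secret; an
# attacker who substituted keys would yield a different string on
# each end.
alice_sas = short_auth_string(b"example-shared-secret")
bob_sas = short_auth_string(b"example-shared-secret")
assert alice_sas == bob_sas
```

The string is deliberately short: it only needs to defeat an active attacker who must commit to keys before hearing it, not to serve as a key itself.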
I wonder how the people who wrote crypto software and provide secure messaging services feel about terrorists and other bad people using their products to execute their plans? Most of these people wouldn't have the ability or access to such technology had it not been for the efforts of a few who have made it very accessible and user friendly.
I know there are plenty of legitimate uses, but especially for the services that essentially bill themselves as secure and untraceable, you have to know at a certain point you've designed and built technology that is actively being used to hurt innocent people. For me it would be difficult to quantify whether the amount of good is worth all the bad people in the world.
This is the "think of the children" argument used by many manipulative government news agencies when discussing:
- anonymous money
- anonymous communication
- anonymous residence
- anonymous weapons
- etc.
Maybe we should make hands illegal since they do illegal things and you can only use your hand with a license from the smart-over-lord-government, right?
Or less dramatically, all technology and all constructions from home improvements to particle accelerators to hairspray to encrypted internet should pass review from the government and people can only work on what the government approves and use things they have licenses for.
--
Do you want to apply your same argument to car manufacturers? Cars can be used by kidnappers, bank robbers, rapists and murderers to flee crime scenes. So car manufacturers should stop producing cars because all of the good they are used for (visiting loved ones in the hospital, going to your kid's baseball game, going to work) isn't worth the carnage caused by the "bad people"?
Exactly right. It goes for almost anything: if you make it, someone will end up finding a way to use it in a negative way. If everyone lived thinking X shouldn't be created because it could turn into Y, then we would all be living in caves, afraid of everyone else.
Give me a half hour and a $500 budget (cash) to go shopping at Home Depot, Canadian Tire or some other DIY shop and I can do some fairly bad things. Tools will always be dual use, the same encryption that powers your e-commerce can be used to fairly reliably communicate between two parties if the best-before-date of the communications is something on the order of a few weeks to months.
So you basically get to choose: either we stop all tools from being freely available and open up all our communications to criminals and governments alike, or we take the good with the bad.
The people who wrote crypto software (and the people who built TOR, who operate anonymous proxy servers and VPN services, who built bitcoin, the internet, your browser and your mail client) all realize their work is 'dual use'. There is absolutely nothing you can do about that, so just let it go and accept it.
That same hammer that can pound in a nail in the hands of a carpenter (skilled or not) can be used to bludgeon someone to death. Should we ban hammers? How should the person who invented the hammer feel?
A tool I invented is used by some pretty bad people. I don't like it much. But I recognize that the same tool is also used for good, and that those good uses are probably the majority. It used to bother me, but I got over it. Now I think longer and harder about whether there is any possible bad use of the stuff I make that I can make harder by designing it differently. It doesn't always work out; people are pretty clever about finding alternate uses.
I always find this type of reasoning absolutely hilarious. It relies on some odd hyperdualism that "good people" and "bad people" are two biologically distinct entities and that the latter engaging in activity immediately makes it zero-sum. As if good people are human and bad people are space lizards.
Bad people are people. They will do the exact same things that people in general do. They will use the exact same services that people in general use.
You know what has objectively led to many more innocent people being hurt than crypto? The Internet. Why aren't you out there guilt-tripping all the people who work on it?
It's like asking how the people who invented the knife as a way to cut fruit feel about criminals and other bad people using it to kill.
I work at Silent Circle (Phil's new company), writing secure communications software. Sure, hitmen and thieves may use the software, but I trade the slightly higher risk of crime against me for the extra privacy.
That's a dangerous assumption in any situation (it's never a good idea to assume your opponent is stupid). We do not have a monopoly on mathematics; you never know who has read [1] الكتاب المختصر في حساب الجبر والمقابلة (al-Kitāb al-mukhtaṣar fī ḥisāb al-jabr wa-l-muqābala, "The Compendious Book on Calculation by Completion and Balancing", al-Khwārizmī's treatise on algebra).
> I wonder how the people who wrote crypto software and provide secure messaging services feel about terrorists and other bad people using their products
Are you going to ask shoemakers the same question? I'm sure most terrorists prefer the advantage of durable shoes when fighting. While I don't have any numbers, I suspect durable shoes have helped more terrorists than encryption.
...
The problem is you're anthropomorphizing technology. Each new technology is neither good nor evil (nor even neutral). I'll let Feynman explain:
I think a power to do something is of value. Whether the result is a good thing or a bad thing depends on how it is used, but the power is a value.

Once in Hawaii I was taken to see a Buddhist temple. In the temple a man said, “I am going to tell you something that you will never forget.” And then he said, “To every man is given the key to the gates of heaven. The same key opens the gates of hell.”

And so it is with science. In a way it is a key to the gates of heaven, and the same key opens the gates of hell, and we do not have any instructions as to which is which gate. Shall we throw away the key and never have a way to enter the gates of heaven? Or shall we struggle with the problem of which is the best way to use the key? That is, of course, a very serious question, but I think that we cannot deny the value of the key to the gates of heaven.
Encryption is just a tool, and like every tool that has ever been made it will occasionally be used by bad people doing bad things.
We keep learning more and more science and technology, and I don't think it is truly possible to stop that trend. I suggest we start finding ways to live with each other in the presence of technologies like encryption, because the problem is only going to get worse. We, as a species, are going to have a serious problem if we haven't learned that lesson by the time someone invents a way to make nuclear weapons with common household parts... or some unknown technology that is even worse.
How can you make that argument when it's clear that bad people would have found other means to encrypt their communication anyway?
It is so easy to do basic crypto if you're a terrorist; to them, whether xMessenger.app has it built in is a marginal usability difference. Conceding terrorists that marginal convenience is, to many, an acceptable price for securing the bulk of innocent communications from overreaching governments.
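To illustrate just how low the bar is, here is a complete one-time pad in a few lines of Python standard library code. This is a sketch, not operational advice: a one-time pad is only secure if the key is truly random, as long as the message, never reused, and exchanged out of band, which is exactly the kind of discipline a small, motivated cell can manage.

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # XOR the message with a fresh random key of equal length.
    # Information-theoretically secure under the usage rules above.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"meet at dawn")
assert otp_decrypt(ct, key) == b"meet at dawn"
```

Nothing here depends on any app store, any service, or any law; the point is that banning polished consumer products removes none of the underlying capability.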
Zimmermann's response to being described as "overwhelmed with feelings of guilt" in an article following 9/11, and to the possibility that his encryption tools had been used by the plotters.
Everything can be used for bad. Does someone who makes a knife feel bad if it is used for murder or torture? Probably yes, I know I would, but do I feel guilty? Hell no. I cannot control what a bad person will do with my good thing. Encryption is no different; it can be (and is) used for good. Keeping people safe with secure communication is just one of those things. Yes, that also means bad people can use it to plot an attack. It is unfortunate, but there is no stopping its use. Terrorists have used encryption for as long as the "good guys" have.
In the same way one might wonder how the people that make door locks and window blinds feel about terrorists and other bad people using their products to execute their plans.