On the Impending Crypto Monoculture (metzdowd.com)
322 points by tonyg on March 24, 2016 | 124 comments



This was an inevitable consequence of Bernstein being one of the very few cryptographers simultaneously devoted to:

* Theoretical rigor

* Competitive performance ("new speed record for X" being a theme of his research)

* Misuse-resistant constructions and interfaces

He was doing this stuff before it was cool (he's as far as I can tell the only cryptographer to have written something like qmail) and the rest of the industry is struggling to catch up.

The list of Bernsteinisms is slightly less scary in context:

* Curve25519 and EdDSA are sort of related work, and the alternative in non-DJB cryptography would be "NIST P-curve and ECDSA over that P-curve". The consensus seems to be that Edwards curves are superior for a bunch of practical reasons, and Bernstein pioneered them, so: no surprise.

* Poly1305 and ChaCha20 are virtually never used independently; they can be seen as DJB's most current AEAD construction. There are plenty of competing AEADs; in fact, there's a competition (CAESAR) underway that DJB is involved in.

So really, that's not that much scarier than the fact that AES and SHA-3 share an author.
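
(For illustration of that second point: a minimal sketch, assuming Python's third-party `cryptography` package, of ChaCha20 and Poly1305 consumed as a single AEAD primitive rather than independently.)

    import os
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key = ChaCha20Poly1305.generate_key()
    aead = ChaCha20Poly1305(key)
    nonce = os.urandom(12)                     # must never repeat under the same key
    # Output is ciphertext followed by the 16-byte Poly1305 tag.
    ct = aead.encrypt(nonce, b"attack at dawn", b"associated data")
    assert aead.decrypt(nonce, ct, b"associated data") == b"attack at dawn"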


He's also pro-freedom. He's fought the US government in court. He goes to hacker cons and mixes with us mortals. People who've met him tend to like and trust him.


I've met him (I'm in Chicago). I like him. But he is not the only cryptographer with that reputation. Phil Rogaway has a similar reputation, and went as far as to put a no-military-use patent on OCB.


> Phil Rogaway has a similar reputation, and went as far as to put a no-military-use patent on OCB.

Had he not done that, it might be more useful. When it expires, maybe we'll see it put to good use.


As someone who has a highly ambivalent attitude toward the military for very real, legitimate reasons, I'd love it if you could elaborate. I live that way now myself: I don't work for the military, or for anyone who subcontracts to the military or homeland security, etc. And it is precisely for reasons like those that would push one to put a no-military-use restriction on the OCB patent. FWIW I'm not intimately familiar with the figures involved; I know who they are, have read some of Bernstein's code/math/writing, and know vaguely who Rogaway is (but haven't yet had a chance to read the "moral character" essay).


It is basically incompatible with both the Open Source and Free Software definitions, i.e.:

"6. No Discrimination Against Fields of Endeavor" https://en.wikipedia.org/wiki/The_Open_Source_Definition

and

"The freedom to run the program as you wish, for any purpose (freedom 0)." https://en.wikipedia.org/wiki/The_Free_Software_Definition


It happens that Rogaway added an explicit license for open-source software, regardless of military use, relatively recently (2013).

http://web.cs.ucdavis.edu/~rogaway/ocb/license.htm

Part of the history here is that OCB used to have a patent grant for all GPL software, again regardless of military use, and then in 2012 authorized Mosh to use it under GPL + two exceptions.

https://github.com/mobile-shell/mosh/blob/master/ocb-license...

(In the general case, yes, such a restriction on use makes it non-free / non-open-source, which makes it annoying for people who want to use the software through an intermediary like Debian.)


However, Rogaway also issued a separate patent license for free and open source use (which does not exclude military uses).

http://web.cs.ucdavis.edu/~rogaway/ocb/license1.pdf

So while it's true that free and open source software would not be able to impose a no-military-use restriction, Rogaway in this case is exempting free and open source projects from that restriction.


>> However, Rogaway also issued a separate patent license for free and open source use (which does not exclude military uses).

That's true, and if you were making a one-off library like NaCl, OCB might be a good choice. But the restriction means it couldn't be adopted into any standard for interoperable software. Though ISTR that recently there was another license grant that allows it to be incorporated into TLS.

In addition, because it was never adopted into any standard for interoperable software, it didn't get as much scrutiny as it might have otherwise, and so even for a one-off you might be skeptical.

The long and short of it is that if OCB hadn't been patented at the time it was invented, it probably would have "won" and been widely used. GCM was invented several years later, has some worse performance characteristics, and AFAICT is considerably harder to implement well in software. Instead OCB is something that is taught but rarely used in practice. Whether Rogaway prefers this world to the counterfactual one, only he knows.


Free and open source software don't allow that sort of restriction. So while Rogaway's intentions are honorable, the results aren't coherent.


I'm not 100% sure, but I think this is a choice of one of a few licenses for the software rather than a dual license. The licenses aren't coherent if you apply the licenses simultaneously, but I think only one of the licenses need apply to the software for a given user.

So you can release free and open source software which might be used for military purposes, OR you can release closed-source software, but it can't be used for military purposes. The effect is that if your software will be used for military purposes you have to release it as free and open source software.

In this case it's not a dual license where you're bound by the terms of both licenses, it's a situation where you choose one of the licenses based on which set of terms you can comply with. Multi-option licenses like this are frequently used for proprietary software to enforce tiered pricing (such as nonprofit/personal/corporate licenses all applying to the same software but different licensees).

This is my interpretation, but I'm not a lawyer. I wouldn't stake my business on my interpretation without consulting a lawyer and neither should you.


The fact that the current formulation of the details of most open source licences don't allow restrictions against military use doesn't make them incoherent. Just like you can argue that open source software makes the world a better place, you can argue that making the military more powerful makes the world a worse place.


> Free and open source software don't allow that sort of restriction. So while Rogaway's intentions are honorable, the results aren't coherent.

I'm not quite sure what you're referring to, but I don't think any interpretation is quite right.

* "No existing FOSS license limits military applications of its covered software." That's true, but Rogaway's license doesn't even limit military applications of covered software either.

* "If a FOSS license limited military applications of its covered software, it wouldn't qualify as a FOSS license." Also true, but Rogaway's license doesn't do that.

* "Some FOSS licenses try to make it hard to grant a patent to some of their users but not to all." Also true, but Rogaway's license probably doesn't do that (depending on how we interpret the part about "Software Implementation"), and not all licenses have this property.

* "If a patent grant is specific to a particular program, FOSS projects can't accept or rely on it as a matter of policy." Maybe true for some projects, but this concern comes up more regularly for copyright licensing rather than patent licensing, and not part of the definition of FOSS licensing, and Rogaway's license only refers to the licensing style rather than the identity of the program.

* "If a patent grant is specific to FOSS, then FOSS projects can't accept it as a matter of policy." I don't think this is true at all; I think many FOSS projects have been quite happy to rely on such licenses in the past and I think there are a number of precedents for them. (That doesn't mean that the developers or projects necessarily think that software should be patentable or that they should require a license in the first place.)

I agree with the observation that Rogaway's licensing terms have discouraged standards adoption of the technology, but I don't see how they would forbid individual free and open source projects from adopting it.

I think people may have been confused by the presence of alternative patent licenses; recall that license #1 requires only FOSS (not non-military), while license #2 requires only non-military (not FOSS). If license #1 would be sufficient for FOSS projects to benefit from in the absence of license #2, it should still be sufficient in the presence of license #2. (Edit: geofft also points out that the terms of the licenses have changed over time and that #1 used to be GPL-specific -- but I think my observations still apply in the case of GPL-covered projects.)


One big reason is that with this restriction his algorithms almost certainly won't be included in any of the common security standards (FIPS 140 and NIST 800-53 are the primary ones in the US), which means that anyone who needs to follow those standards won't be able to use it.


That was the point though, no?


No, those standards dictate how many organizations beyond the military adopt security protocols.


That's a side effect. The intended purpose of those standards is government procurement.


ah, yes. that should have been obvious I suppose. thanks.


Just a guess, but maybe wtbob is getting at the fact that odd licenses are generally GPL-incompatible or otherwise awkward or ambiguous. See for example, the JSLint license https://en.wikipedia.org/wiki/JSLint#License


thanks for replying.


Did you read the OP? It is addressed there.


How is a political opinion of an individual related to his scientific work? I mean, it's not like he's releasing anything closed source and asking you to believe him on his word; so, if he had opposite political views, would you see his work differently? Why?


I figured someone who knows more about this (than me) would write a good reply to this, and they did:

Ron Garret replies:

    Saying "How on earth did it come to this?” strongly implies
    that you think that the trend towards DJB’s crypto suite a
    problem, but you don’t offer much in terms of proposals for
    how to solve it, or even what a solution would look like.
    You seem to agree that a solution would *not* look like the
    status quo.  So what exactly are you advocating here?
    
    I submit that the impending monoculture in crypto is not
    necessarily a problem, any more than the monoculture in
    physics (what?  No alternatives to GR and QM?) or climate
    science is necessarily a problem.  It’s possible that crypto
    has a Right Answer, and that Dan Bernstein has discovered/
    invented it.  If you believe that simplicity and minimalism
    ought to be part of the quality metric then there may be very
    few local maxima in the design space, and DJB may simply have
    found one of them.
    
    rg


I read the "how on earth did it come to this?" question differently than Garret did, more about the rest of the field than about DJB. How did it come to a point where it seems literally only DJB is producing high-quality crypto systems that people want to move to, and nobody else is?

tptacek's answer to that question is pretty reasonable: https://news.ycombinator.com/item?id=11357587


To expand on this, part of what makes DJB's algorithms more likely to be the Right Answer in the way being discussed is that he chose curves for them that satisfy a number of security constraints, but that are also the simplest possible that satisfy those properties. In other words, if you accept the constraints put on the algorithms[1], and were to go on to determine your own implementation based on them, it's very likely you would come up with something like the DJB curves. (This is assuming you are up on the latest crypto techniques etc).

[1]SafeCurves: https://safecurves.cr.yp.to/


Unfortunately, this argument doesn't really make sense. Just like decades of software development (should) have taught us, and just like fighting climate change, doing physics research, giving antibiotics, fighting terrorists, criminals, spammers, enemy armies, eating healthy, raising kids, whatever--multiple approaches and techniques and tools are always a good thing. Why would crypto be the one single area of life where there was a Right Answer? The fact that crypto is part of an arms race with criminals and authoritarian governments is strong evidence that a monoculture is not the right approach.


It is demonstrably not the case that diversity in cryptographic constructions is always a good thing. What it usually results in is a diversity of single points of failure.


Great point. When selecting a crypto approach, it makes no sense to not choose what is apparently "the best" at this time.

However, is there a legitimate concern that we're collectively hoping DJB hasn't messed anything up? I'm no expert in the area, but are there ways to prove that his work is correct?


Many people have tried to find flaws in his math. DJB and friends now work on formal verification of implementations.


Cryptographers are mathematicians working in a limited problem space. It should therefore be no surprise that they converge on similar solutions (e.g. both RSA and D-H were discovered at GCHQ years before they were independently rediscovered by the people in the initialisms)


All those examples are extremely high level abstractions, but cryptography is not - it is a direct application of math to a single purpose.

> ...multiple approaches and techniques and tools are always a good thing.

Only at a very high level. If you switched out "cryptography" with "privacy protection" then you'd be right.


I'm reading this on an android phone; is there a way of forcing that text to not require me to scroll back and forth on every single line?


On Firefox for Android you can activate Reader View by tapping the book icon in the address bar.


If we are talking about crypto monoculture, don't AES and SHA-3 come from Joan Daemen? Also before this, 90's crypto was basically a Ron Rivest monoculture with RC4 and RSA. This is nothing new and I believe today's monoculture is more secure than previous ones. Also, just like DES, RSA and RC4 got displaced, so will DJB's monoculture if something more secure comes along.

Basically this monoculture is a consequence of the fact that crypto is very subtle, and it is often better to have one algorithm that everybody uses, implements, studies and tries to break rather than ten that nobody really studies.


You couldn't possibly mean, with that argument, that enough people are working on it and that the monoculture is fine.

You just state the status quo and make it sound like a good thing, but the implicit reason is that there aren't enough capable people, or at least not enough interested ones.


Knew before clicking that this was going to be about DJB having won.

Peter Gutmann definitely has the credibility to make this critique. But saying that DJB having won is more a vote against other crypto than a vote for Dan is like saying that Git having won is more a vote against other SCMs than a vote for Linus.

Well sure, you could say that. But that would rather understate Linus' substantial contribution to thinking about version control differently and better.

Similarly DJB has won because he led the way in thinking about the crypto problem correctly. Peter basically acknowledges the underlying facts here, but seems to not want to give Dan his due.


Distributed version control wasn't Linus's idea. BitKeeper, Arch, Monotone and darcs did it first, and in some ways better. But they all had problems: BitKeeper was proprietary, Arch was baroque and difficult to use, and Monotone and darcs had poor performance (darcs egregiously so, with an unfortunately-easy-to-hit exponential-time edge case in their merging algorithm). Using Git was a vote against these systems. But there's also not as much of a DVCS monoculture: Bazaar and Mercurial are also used quite often, and could be adopted quickly if git were to implode for some reason.


Arch also had terrible performance. The primary objects that lived in repositories were changesets, which every client downloaded as .tgz files IIRC, and then replayed locally to reach the current state.

The primary innovation of both git and Mercurial was performance. Git is a highly optimized content-addressable database with version control built on top.


Arch was baroque and difficult to use

More than git? I shudder at the thought.


Well, just to give you a taste, here's an excerpt from "Revision Control with Arch: Introduction to Arch" in Linux Journal (http://www.linuxjournal.com/article/7671):

In Arch, a changeset is represented as a directory tree full of bookkeeping files, patches, new files and removed files. The best contribution technique is to create a changeset directory and then tar it up for delivery:

    $ tla changes -o ,,new-robot-comment
    $ tar czvf my-changes.tar.gz ,,new-robot-comment/
Arch ignores files beginning with two commas, an equal sign and a few other special characters. By using a ,, at the start of our changeset directory name, we avoid the annoyance of Arch complaining that our new directory doesn't exist in the archive. It is probably good practice to use your e-mail address or some other identifier in the tarball filename and changeset directory name.


I thought it was just me. Anything more than the basic commands and I get confused and have to look it up. Git is certainly popular - perhaps for many it's the first version control system they've used - and is very fast, but would you say it's well designed from a user perspective?


No, git is a UI disaster. Some of the complexity is necessary due to moving to a distributed model of version control, but most of it is lack of clarity in designing the CLI.

Is anyone who's used Mercurial able to comment on the UI differences?


Bitkeeper was first among the DVCS's, but Linus had a hand in its design. McVoy intentionally designed something to fit the (decentralised) Linux development process. The adoption of Bitkeeper for Linux and subsequent discussions gave rise to the cambrian explosion of Arch(/TLA/Bzr), Monotone, Darcs etc in the early 00s.

The famous two weeks time off to write git was many years later, when tridge finally forced his hand to do it. I don't think it's quite right to say git was a vote against anything else (except maybe against C++, which would bring some maintenance problems in the kernel community).



I don't think it's so much about not wanting to give djb his due as wanting to point out that literally the rest of the industry is shamefully bad.


That may be true, but I still think there's a negative undertone against DJB's crypto here, even if it's unintentional (if you're going to talk about a crypto monoculture, it may become inevitable to talk about DJB's work and its adoption).

However, I worry many (developers) will get the wrong message from this post: that they shouldn't use the new standards from DJB (even if they are clearly superior in every way), because that would "help create the monoculture".

Also look at how this post ends:

> So the (pending) Bernstein monoculture isn't necessarily a vote for Dan, it's more a vote against everything else.

So it's not a vote for Bernstein that he got so much so right? If it's not a vote for DJB's work, then maybe he can point out what's wrong with it (other than everyone wanting to adopt it).

The post could've very well ended like this, and it would've been better for it:

> The impending monoculture (based around DJB's crypto) is showing us that we need simpler, more boring crypto everywhere, but it needs to come from other authors as well.

Something like that. He is after all arguing against non-boring crypto throughout the post, no? So that should've been his conclusion?

Maybe even add a little bit of "DJB has showed us the way - now it's time for others to pick up the torch and take it from here." But I imagine he wouldn't have gone that far.


The author of this post is a cryptographer. The people who select ciphers for products are almost invariably not cryptographers. The idea that DJB crypto is selected simply because it is "clearly superior in every way" seems inaccurate.

In fact, I think it's comments like these that set people like Gutmann off. If you pay enough attention to the people who select but don't design ciphers --- ie, non-cryptographer engineers --- you're starting to hear more and more of a drumbeat of "do whatever DJB says"; if you push those people to explain those decisions, you don't usually get good answers.


> If you pay enough attention to the people who select but don't design ciphers --- ie, non-cryptographer engineers --- you're starting to hear more and more of a drumbeat of "do whatever DJB says"; if you push those people to explain those decisions, you don't usually get good answers.

Sure. I like to call that condition "secure by default". Said engineers most often don't have a lot of knowledge and experience with cryptography, so they opt for the best option available. In a lot of cases, it's DJB's work. In other cases, it's someone who followed his example (e.g. the BLAKE(2) team).

Also, this might have had something to do with this shift towards the current situation:

https://gist.github.com/tqbf/be58d2d39690c3b366ad

> Use, in order of preference: (1) The Nacl/libsodium default

> If you can just use Nacl, use Nacl. You don't even have to care what Nacl does.
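
(For a concrete sense of that advice: a minimal sketch assuming the PyNaCl binding to libsodium, where SecretBox picks the primitives and handles the nonce for you.)

    import nacl.secret, nacl.utils

    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
    box = nacl.secret.SecretBox(key)
    ct = box.encrypt(b"attack at dawn")        # random nonce generated and prepended automatically
    assert box.decrypt(ct) == b"attack at dawn"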


I take your point, and the point of the sibling commenter. I just think Gutmann is talking about a different kind of "DJB by default" phenomenon.


I think you're right.


And yet, that drumbeat is probably a good thing. And perpetuated by people like you chanting "just use NaCl" every chance you get. So it's not super fair to pick on non-crypto geeks for accepting that easy truth.


We're not talking about what library to use; we're talking about what algorithms go into standards.


But that would rather understate Linus' substantial contribution to thinking about version control differently and better.

That paradigm of version control existed previously. (Monticello in Smalltalk, for example.) However, Linus had the programming chops and name recognition to start it off down the road to what it is today.


Most important, I think, was a great big urgent real-life use case (Linux) to optimise git toward. That's why it's really good at managing a Linux-sized codebase.

(Most of what is annoying and difficult about git makes much more sense when you realise it was literally invented for Linus to handle his email, and anyone else having use for it is a serendipitous happenstance.)


One of the solutions is to start using algorithm cascades instead of single algorithms where performance doesn't matter.

If you are using 10 ciphers or 10 hash functions or 10 signature schemes, then you need 10 different breakthroughs before it all falls down.

There is really no reason to not do this unless performance is important, and a lot of times performance does not really matter.

NOTE: obviously you need to do this properly and use a different key for each cipher, concatenate hashes, concatenate signatures and so on. Also, you should start encrypting with the best implemented ciphers, so that plaintext is not leaked if the worst ciphers happen to have timing/cache vulnerabilities.
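
(As a rough illustration only, not a recommendation: a two-layer cascade with independent keys, assuming Python's third-party `cryptography` package. Both layers would have to fail before the plaintext is exposed.)

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

    k_inner, k_outer = ChaCha20Poly1305.generate_key(), AESGCM.generate_key(bit_length=256)
    n_inner, n_outer = os.urandom(12), os.urandom(12)

    inner = ChaCha20Poly1305(k_inner).encrypt(n_inner, b"secret", None)   # innermost layer first
    outer = AESGCM(k_outer).encrypt(n_outer, inner, None)                 # second, independent key

    # Decryption peels the layers in reverse order.
    assert ChaCha20Poly1305(k_inner).decrypt(
        n_inner, AESGCM(k_outer).decrypt(n_outer, outer, None), None) == b"secret"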


Since cryptosystems designed by experts virtually never use cipher cascades, having one is a pretty good indicator that your system hasn't been designed by an expert.

More importantly: Gutmann's argument isn't that it's unsafe that we have this "monoculture" (how would that work? DJB turns evil and reveals the secret trapdoor in Curve25519?) --- it's that it's sad that things ended up this way.


> how would that work?

I can imagine a hypothetical world where DJB creates three crypto building-blocks, gets hailed as the next coming of Alan Turing, and then we don't do nearly the fact-checking we should when he creates a fourth, and it ends up having a horrible flaw.

That seems to be half of what this article is warning against: a blind optimistic trust that anything that DJB creates in the future will be perfect. It's not that he's an infallible god of cryptography; it's that everyone else is making obvious mistakes, and he at least has eyes.


I've heard that from some other people too. But look at the CFRG curve selection debate. There were very few appeals to Bernsteininess in that debate; it was pretty rigorous. To the extent DJB's brand played a part, it seemed to be a liability.


If you look at backdoored algorithms, there are red flags all over the place. A hallmark of DJB algorithms is that they use minimal constants that satisfy publicly declared constraints. If you look at the spec for DUAL_EC_DRBG, the backdoored rng standard, the first thing that you notice is that the constants have no explanation for how they were derived, and that they are large enough to hide a cryptographic key in.

If DJB releases a crypto algorithm with huge glaringly unexplained constants, you'd better believe people will ask questions. I don't think it's possible to both use "nothing up your sleeve numbers"[1] and to backdoor an algorithm. You'd likely need the computing power needed to brute force the EC in the first place in order to do it.

[1] https://en.wikipedia.org/wiki/Nothing_up_my_sleeve_number


It's funny that you would say this.

* During the recent CFRG curve selection debacle, Microsoft pushed for an incompatible variant of Curve25519 (now called "PinkBikeShed") on the basis of it having minimal parameters.

* One of DJB's more notorious research papers, BADA55, is about the extent to which reliance on "easily explained math constants" is not actually safe.

Also: the problem with Dual_EC is that it's a public key RNG. It's a construction that has no reason to exist. The reason you don't have to worry about DJB's (or Rogaway's or Shay Gueron's or whatever) documents having the Dual_EC problem is that none of them are going to propose a public key RNG.


Thanks for the pointer to BADA55, I had not seen that.

Do you think it's possible for constants (selected in the way the constants for ed25519 were selected) to be backdoored ala BADA55? It seems the vulnerabilities come from using hashing algorithms on some small constant, not from the constant itself being something simply constructed.

Also, it looks like DJB did alight on PinkBikeShed originally[1] but rejected it for technical reasons. (Reasons I can't evaluate myself). Is my understanding correct that the PinkBikeShed curve has more minimal parameters than ed25519, but that it doesn't satisfy all of the security criteria that ed25519 does?

[1] https://www.ietf.org/mail-archive/web/cfrg/current/msg05745....


There is a very fiddly technical reason why PinkBikeShed is more user-friendly than Curve25519. Bernstein would say that it's a powerful security argument, but it's more of a nice-to-have than a must-have. Curve25519 won because it had an installed base.


For anyone following this part of the comment thread, the mailing list discussion about these topics is worth a read too:

https://www.ietf.org/mail-archive/web/cfrg/current/msg05619....


The text sketches a proof of how PinkBikeShed from Microsoft is worse.

He also ends with "I didn't bother writing down a comprehensive critique of PinkBikeShed, and didn't imagine that anyone would ever try to revive that curve."


It's a whole lot of sentences for a very simple difference. I think "proof" is pushing it a bit far.


He wrote a lot, but it's very clear what his principles in this case are. To be constructive, other experts should either point to the principles with which they disagree or if they do agree point to the errors in his use of them.


In that argument, Microsoft would say that performance being equal, it's more important for parameters to be minimal so that the curve is maximally rigid and trustworthy. Bernstein would say that it's more important that there are no implementation pitfalls, even very subtle ones that will be hard to exploit.

I agree with Bernstein but I won't pretend it's a totally cut and dried argument.

What won the argument wasn't so much the merits of Bernstein's case, but that Microsoft was in effect arguing for something that would break compatibility with the 25519 installed base.


> BADA55, I had not seen that.

If you want a good laugh (and some discussion of BADA55), you might enjoy this[1] talk where DJB praises his "new job" at Verizon.

[1] https://projectbullrun.org/surveillance/2015/video-2015.html...


Of all the fighting that occurred last year about new curves, the MS Curve25519 initiative was the ugliest and most dishonest.


I think it's just the opposite - because djb is so well known, and his designs are frequently so tight, people will be jumping at the chance to prove a flaw in anything he does from now on.


TrueCrypt offered cascading ciphers. I never liked it, however, they were never really criticized for doing that and no one ever cited that as a reason to not use it. There were lots of other reasons not to use it.


I submitted them to expert after expert. Nobody ever showed a break, with many citing meet-in-the-middle with 2DES or whatever. Opposite of diverse, multi-encryption. Just lots of people telling me it was stupid while their single modes and protocols got smashed.

My formula varied but amounted to this: at least three passes; 1-3 of strongest ciphers (eg AES finalists); optionally 1 old one (eg IDEA, Blowfish) for obfuscation; randomized choice of ciphers, keys, IV's, and counter values; counter mode or random good ones; those sent during key exchange. Each thing in isolation had necessary pre-conditions and configuration for secure operation. In combination, they should be as strong as weakest link plus difficulty of breaking obfuscations.

Seemed to work in practice. Many years later, I see a very watered-down version called TripleSec here with similar claims. Also, Markus Ottela implemented a subset of my scheme in Tinfoil Chat. Past that, the advice has been "use OpenSSL" or something followed by "patch OpenSSL before the latest flaw in one component affects you."

They know better, though, as easy compromises are OK when it's something everyone was doing. Maybe they're disguising CYA legal strategy as good crypto recommendations. (shrugs)


Cipher cascades have been proven to be at least as strong as the strongest cipher, but only if you use commutative encryption - multiple stream ciphers (and/or block ciphers in counter mode) xor'd together. Truecrypt did not use this construction.
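
(A minimal sketch of that commutative construction, assuming Python's third-party `cryptography` package: two independently keyed keystreams, ChaCha20 and AES-CTR, are XORed together before being applied to the plaintext, so recovering the pad requires predicting both.)

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def keystream(cipher, n):
        return cipher.encryptor().update(b"\x00" * n)   # encrypting zeros yields the raw keystream

    msg = b"attack at dawn"
    chacha = Cipher(algorithms.ChaCha20(os.urandom(32), os.urandom(16)), mode=None)
    aesctr = Cipher(algorithms.AES(os.urandom(32)), modes.CTR(os.urandom(16)))

    pad = bytes(a ^ b for a, b in zip(keystream(chacha, len(msg)), keystream(aesctr, len(msg))))
    ct = bytes(m ^ p for m, p in zip(msg, pad))          # no authentication here; illustration only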


So it should be AT LEAST as strong as the first cipher used in the chain. This is still a good reason to use a cascade.


The paper that proves this is actually called "Cascade Ciphers: The Importance of Being First".

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.36....


There we go! Been a while since the last positive comment on this. :)


My opinion on cascade ciphers is currently neutral. If done properly, they are a nice hedge against currently unknown weaknesses, but a more complicated construction means more places to make mistakes.


That's why I used them. There could be obscure, mathematical interactions that they introduce that are problematic. We might eventually figure those out. The enemy might be able to exploit those without knowing which ciphers, which order, and which initial data.

Meanwhile, there's a steady stream of problems in regular ciphers, modes, and protocols that justify mitigating any one in a design. So, yeah, my polyciphers are a hedge. They also exploit combinatorial explosion by randomizing extra, large numbers. It used to be that the resources required to simulate the combos of algorithms themselves were a barrier, but FPGAs made that obsolete. :(


> Since cryptosystems designed by experts virtually never use cipher cascades, having one is a pretty good indicator that your system hasn't been designed by an expert.

I don't like this argument at all. Nobody is doing it, so that means it is "bad"? As the grandparent said, there is no reason not to do cascading if you don't care about performance.


Also, it is overwhelmingly the case that the people that tend to favor cipher cascades are new to cryptography, and generally have a very activist mindset behind their choices and actions.

I've reviewed this sort of activist code (Minds.com, Tutanota, etc.) and found it to be dangerous.


> ...concatenate hashes...

Bad idea, using 10 128-bit hash functions instead of a single 1280-bit hash function. Instead of a single post hash view of the hidden data - you've given an attacker 10 different perspectives. Instead of studying and securing one function - you now have to secure 10 functions and study their interaction in 10 dimensions... If 1 in 10 of your functions is broken, the damage isn't limited to a 10% reduction in keyspace.

> ...start encrypting with the best implemented ciphers...

If that is known up front then the whole exercise is pointless. You can't know that because you can't see into the future.


Yes, that's true.

Encrypting the data before hashing with a secret key solves this issue if applicable.

> You can't know that because you can't see into the future.

The "best cipher" is the single one you would have used if you had not decided to use a cipher cascade.

The idea is to make sure that your cascade is never worse than that single cipher.


> There is really no reason to not do this unless performance is important...

There are very few situations where performance is not important in cryptography, and those situations do not, generally speaking, have sufficiently unusual security requirements that it would make sense to use a completely different set of crypto primitives for them.


> There are very few situations where performance is not important in cryptography

source?


Then you're creating an unwieldy situation where analysis is very difficult. That's just an advanced form of security by obscurity. What you're saying is, that it's better to go in blind, wearing lots of armor and padding. That strategy works if your enemies don't have anything more potent than a knife. It absolutely doesn't work if they have guns and explosives. Better to clearly know what you're up against.


I doubt devit was suggesting we stop analyzing the individual methods. If we (and to use a really stupid simple example on largely insecure functions because I'm a simpleton) hash something with...

sha1(md5(things))

We can both study the relative security of sha1() and md5() separately, but also know that using both means that an attacker must defeat both methods. This isn't security by obscurity so much as security insurance. If either sha1() or md5() is compromised, the data is still protected by the other.

Similarly, if we use ten different encryption methods on top of each other... we know a few things:

- If we're studying all of those methods for flaws, the initial challenge of beating it is defeating all ten methods.

- If a method is found to be compromised, our data is still protected by nine other methods.

- At the very least, we've made brute forcing this a pain in the rear, because it's probably really hard to tell whether the output of nine of those ten layers is correct.

If you have a crypto 'stack' of ten methods, and you find one is now easy to compromise, you can napkin math realize you've lost one layer of protection, so when a zero day happens on one method, you don't have to panic and assume all of your data is compromised.

But your goal remains to ensure as many of those individual layers work against 'guns and explosives'.


Your post is a good example for why cascades are so dangerous.

hash1(hash2(...)) is not as secure as the stronger of the hash functions, it may only be as secure as the weaker of the hash functions. Your example is sha1(md5(...)). Well, md5 is no longer considered collision-resistant, which means that sha1(md5(...)) also isn't collision-resistant. If you're able to obtain two different inputs that are mapped to the same hash by md5, computing the sha1 of that hash is still going to lead to the same value, i.e. a collision overall.

So your assumption that an attacker must defeat both is clearly wrong and dangerous. In the case of hash functions, I believe it would be safe to concatenate the hashes (instead of composing the functions), but now you have a bigger hash size in addition to a slower running time.
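
(A small sketch of the concatenation approach using Python's standard hashlib; both digests are computed over the original input, at the cost of a larger digest and two passes over the data.)

    import hashlib

    def concat_hash(data: bytes) -> bytes:
        # A collision here requires colliding SHA-256 and BLAKE2b on the same pair of inputs.
        return hashlib.sha256(data).digest() + hashlib.blake2b(data).digest()

    print(len(concat_hash(b"hello")))   # 32 + 64 = 96 bytes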


I'm not convinced that concatenating hashes is safer than a single hash function either.

Let's say you have an entirely broken hash function for which hash(A) = A.

Clearly if you concatenate hash(A) + sha256(A), you still end up with something that is broken. I think if you concatenate the hash functions, you may well end up with something which is strictly worse than using either hash function alone (as breaking one will break both).

Also, if you have the result of both hash functions, then you may have provided data which warps the probability chart, as you now have much more information on the output state of the data, which isn't combined in a clearly useful way.

There might be a way to combine two hashes in a way which does preserve the properties of a hash function, but I think it would be sufficiently tricky that you would have essentially created a new hash function using other hash functions as building blocks.

(That might be good grounds for generating a hash function resistant to attack - combine multiple different styles of hashing - but it doesn't seem to be common when hash functions are designed, which suggests to me there are fundamental problems. (e.g. most hash functions repeat the same functional blocks, rather than having a large selection of different blocks).)


> Clearly if you cocatenate hash(A) + sha256(A), you still end up with something is broken.

Now it could depend on your threat model. You want to hash passwords (A is secret)? If yes, then this is broken. You want an integrity check / tamper resistance on A, but A is public (see git)? Then this could be fine. Correct me if I'm wrong.


f(X)=sha1(concat(md5(X), X)), then.


What you have in that case is a 128-bit prefix with, last I heard, less than 60 bits of security - due to md5 being so broken. Now the attacker can start the first round of sha1 with a massively reduced number of potential permutations. That isn't a problem for sha1 today, but if it is tomorrow - then you are much worse off then you would have been just feeding sha1 your plaintext. The benefit of the additional computational complexity is pretty small, and not worth the additional potential point of failure.


> sha1(md5(things))

Consider what would happen if the "md5" function was found to always return the number 3 for "things" greater than 999 - the "sha1" function would be no help. The entire chain is only as strong as the weakest link. It doesn't matter how you mix the inputs, any biased input weakens the system.


It is far better to use a single function that is as strong as the additive bit strengths of the sub functions. There is not even a question about this.

For example, sha1(md5) has several flaws that jump out to crypto people. A flaw in the inner function will not be protected by the outer sha1 function. In fact, the outer function will be rendered useless by an error or weakness in the inner md5 function. If I can create an md5 collision (which I just may be able to do...) then the outer sha1 is worthless.

On the other hand, if there is an error in the outer sha1, the inner md5's value as a construct is reduced - since its usefulness is wholly subject to the output of the sha1.

In addition, your construct f(g(data)), for many choices of f and g, clearly has a meet-in-the-middle attack.

What you are creating with your stack of ten layers is a stack that _you_ cannot break, not a secure crypto stack.


Performance is always important. That's really been the main point of DJB's work -- crypto isn't widely used because it is too slow, so let's build fast crypto.


As a young person new to computer science who was originally very curious about crypto, I can say that all the admonitions against "rolling your own" are intimidating people from entering the sector. When I'm shown a gigantic tome on the topic and then told that even if I know it inside and out I'd better not build and use anything from it for real, I figure I may as well go in another direction, where if I do anything worse than the best I'm not committing a heinous mistake.

So, compared to many sectors of computer science, much more patience and willingness to toil for years before creating anything of use is required.


I think the proper elaboration to "Don't roll your own" is "Don't roll your own when it matters."

Attack the rolls of others and roll your own demo and ask for attacks.


The problem is crypto is far more important than the performance of a sort algorithm, or any other standard comp-sci algorithm.

And it's hard. I've been in this industry over 20 years, and been involved in a bunch of high-ish profile stuff, but I know that I don't have the knowledge to touch this stuff.

But in fairness it's a combination of understanding maths and understanding some low-level CPU stuff. If you are into that, just start learning it. Practice it. Someone has to follow on all this work. It could be you. But if it isn't, there's a ton of interesting stuff out there that isn't crypto. (Although I'd prefer everyone understood the basics of crypto, but that won't happen in my lifetime).


Yeah, it's really hard, and so I'm not smart enough to see what the author might be comparing crypto with. What's something equally hard, but has far less of a monoculture?


Crypto is less CS than Applied Math. When I graduated, there was no such thing as a CS degree - it was called Applied Math. But even then, most Applied Math didn't require very sophisticated math. Crypto does.


Agreed. People get the completely wrong idea simply because a computer is involved. It's the same as doing a black hole simulation - sure you have to code it up on a computer, but a CS student is not going to be able to do a full black hole simulation. You're going to need a physics phd, not even a physics student.

Same for crypto - you need a crypto phd for a fully secure novel crypto implementation, not a cs student.


Minor nitpick with the first paragraph:

"A major feature of these changes includes the dropping of traditional encryption algorithms and mechanisms like RSA, DH, ECDH/ECDSA, SHA-2, and AES, for a completely different set of mechanisms, including Curve25519 (designed by Dan Bernstein et al), EdDSA (Bernstein and colleagues), Poly1305 (Bernstein again) and ChaCha20 (by, you guessed it, Bernstein)."

Curve25519 is an implementation of a cryptographic group. (EC)DH is an algorithm that you can run over such a group. In fact, when you do key exchange the recommended way in libsodium, you're doing DH key exchange over Curve25519. (ECDH is simply "DH performed over an elliptic curve".)

Isn't EdDSA just ECDSA (which is again an algorithm) performed over a group that happens to be a curve in Edwards form?

Put another way, the basic maths of DH key exchange is that you have a vector space of "points" (vectors), you can add one point to another to get a new point and you can multiply a point with an integer to get another point.

DH key exchange starts with an agreed-on point P. Person A picks a secret x and sends x * P to B. Person B picks a secret y and sends y * P to A. Now A has x and y * P so can compute x * (y * P) = (xy) * P while B computes y * (x * P) = (xy) * P.

If the vector space is a certain kind of subgroup of Z*_p then this is called DH. If the vector space is an elliptic curve group we call it ECDH. Presumably if the elliptic curve is in Edwards form we should call it EdDH. But the DH algorithm is by and large the same idea in each case - you could happily define an interface for the vector space and then implement the key exchange part once over this interface, which would let you link the same KEX code against different implementations for ECDH, EdDH etc.
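
(For illustration, a minimal sketch of that "implement the KEX once over a group interface" idea. The finite-field group below is a toy with an illustrative prime and generator and offers no real security; an elliptic-curve or Edwards-curve implementation of the same interface would plug into dh_exchange unchanged.)

    import secrets

    class FiniteFieldGroup:
        """A multiplicative subgroup of Z*_p; 'points' are integers mod p."""
        def __init__(self, p, g):
            self.p, self.base = p, g
        def mul(self, k, P):
            return pow(P, k, self.p)   # "k * P" in additive notation is modular exponentiation here

    def dh_exchange(group):
        x = secrets.randbelow(group.p - 2) + 1      # A's secret
        y = secrets.randbelow(group.p - 2) + 1      # B's secret
        A_pub = group.mul(x, group.base)            # A sends x * P to B
        B_pub = group.mul(y, group.base)            # B sends y * P to A
        assert group.mul(x, B_pub) == group.mul(y, A_pub)   # both arrive at (xy) * P
        return group.mul(x, B_pub)

    print(dh_exchange(FiniteFieldGroup(p=0xFFFFFFFB, g=5)))   # toy parameters only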

The same applies as far as I know for DSA/ECDSA/EdDSA.



In the 90s we had one as well: MD5, RC4 and RSA all share Ron Rivest as an author.


Anyone (not saying Gutmann) who thinks djb's algorithms have been adopted due to "rampant fanboyism" missed the years of debates about elliptic curves that happened on the IRTF CFRG mailing list (primarily involving Microsoft).


They also missed the hilarious saga of the Crystalline cipher.


Never heard about this before so I read it up on the cfrg list and oh dear it was phenomenal.

https://www.ietf.org/mail-archive/web/cfrg/current/msg06826....



This is no surprise, as I railed against the monocultures and terrible implementations before. I certainly did benefit from the concept of misuse-resistant crypto. I usually thought of this as a requirement or implementation issue (a la Design-by-Contract specs). Never thought to design the crypto system itself for this, or maybe I did, but only fleetingly. I'll have to think on that more.


I can think of worse outcomes than having to use djb's software - whether that's dummy-proof libraries, well-designed small programs that work together, or algorithms chosen as defaults - such as whatever gets picked by some self-authorized standards group, or by a company selling bloated, proprietary software that only works with other programs written by that company.


If you look at DJB's code from qmail, djbdns and his other projects, you know why this is not a problem but a good thing (when compared to other projects like OpenSSL, or even MTAs like Postfix, where many contributors need to find a compromise and align themselves).

The truth is that we are all brainwashed into pair programming and working in an agile pressure cooker without code ownership. But if you look at the quality of the work and code one single strong software architect can deliver, precisely because they didn't have to compromise with other stakeholders, it is clear why the result may be better.

I'm not saying that this is always true. But I have seen it time and time again where a single individual built a whole platform that simply worked, mainly because he had the freedom to do so uninterrupted.


DJB delivered a really good 32C3 talk last December on post-quantum crypto which touches the subject of design quite well: https://media.ccc.de/v/32c3-7210-pqchacks


If the situation were that cipher suites based on NIST algorithms were being forced out of TLS and replaced with the new suites, I might have agreed. But that is not the case. The new cipher suites with new algorithms (yes, developed by DJB) just add to the suites available. They present an alternative to the NIST monoculture. Complement, not replace.

Also, there is now a cipher suite with AES in OCB mode. And the licensing terms as stated in the IETF IPR disclosures make the mode easier to use. There is also a draft for Argon2 as KDF (an algorithm not by DJB).

I was more worried when everything seemed to end up being based on AES. Not because it's a bad algorithm, but because we didn't have a fallback. That is actually why the ChaCha20-Poly1305 suite came about. RC4 was no longer a usable fallback for AES.


Sort of related .. I had to upgrade our Kerberos crypto to something strong enough yet something that both Windows 2012 and Java supported.

There is only one option: AES128. It was quite surprising to me, though I'm very far from a crypto expert.

AES256 is also possible if you get the extra strength Java crypto libs.


AES128 is not going to be your weakest link. Ever.

And there is no reason to believe AES256 is stronger than AES128. There have been attacks against the former which are not applicable to the latter, and the key length is enough.


In case the source goes down, there's a public mirror online here: https://marc.ttias.be/cryptography-general/2016-03/msg00391....


Doesn't X25519 suffer from the same nonce-reuse issue?


X25519 is an elliptic curve Diffie-Hellman function and doesn't take a nonce. If you're asking about ChaCha20... yes, which is why nonce reuse misuse resistance is an important theme of CAESAR.


Although, in fairness, misuse-resistance is a theme of EdDSA.


Sorry, don't know why I wrote that, I was thinking ChaCha20/Poly1305. OP picks on all the other AEAD modes for failing with nonce reuse, but doesn't mention it's actually the same failure mode with ChaCha20/Poly1305.
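
(To make that failure mode concrete, a small sketch assuming Python's third-party `cryptography` package: reusing a nonce under the same key leaks the XOR of the two plaintexts.)

    import os
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

    key, nonce = ChaCha20Poly1305.generate_key(), os.urandom(12)
    aead = ChaCha20Poly1305(key)
    c1 = aead.encrypt(nonce, b"attack at dawn", None)
    c2 = aead.encrypt(nonce, b"retreat at ten", None)   # same nonce reused: the mistake
    # The keystream repeats, so XORing the ciphertext portions reveals plaintext XOR plaintext.
    leaked = bytes(a ^ b for a, b in zip(c1, c2))[:14]
    assert leaked == bytes(a ^ b for a, b in zip(b"attack at dawn", b"retreat at ten"))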


All of his problems with GCM are fixed in the recent modification, GCM-SIV. Can't standards bodies just add that?


GCM and GCM-SIV are different constructions. GCM-SIV isn't the latest version of GCM.

GCM-SIV is good, but it doesn't address all the problems of GCM. You still need hardware support to get a safe, efficient GCM-SIV.

GCM-SIV is also way more complicated than Chacha20+Poly1305.


It's clear that they're different constructions. The two papers don't even share an author.

Why do you need hardware support for GCM-SIV?


I think DJB has been a visionary in cryptography, as much as someone can be a visionary in this field. He saw software and encryption as free speech and fought the U.S. government for it (he was only 24 at the time - how many of you are/were willing to take on the US government at 24?)

https://en.wikipedia.org/wiki/Daniel_J._Bernstein#Bernstein_...

He created all of this boring crypto that everyone wants to use now ahead of most. He even launched a site around the sole idea of post-quantum cryptography 8 years ago, because he thought it's that important and we needed to start worrying about it then, if not earlier.

https://pqcrypto.org/

Most cryptographers started paying attention to PQ crypto only after the NSA said we should worry about it, last year (because obviously we should all wait until the NSA bestows its knowledge and guidance upon us before doing anything). But even then it's more of a mild worry, as I'm not seeing the crypto community act too panicked about it, despite the fact that we probably have only about 5 years to figure out some fast and resilient algorithms and protocols, another 5 years to test them, and another 5 to deploy them worldwide.

Because in about 15 years quantum computers will probably be strong enough to break conventional encryption. I think Google expects to have a 100-qubit universal QC around 2018, and from there it should scale up easily (possibly at a rate of 2-4x every 2 years, if it's similar to D-WAVE's progress).

https://www.technologyreview.com/s/544421/googles-quantum-dr...

According to this, a 4,000-qubit computer will be able to break 2048-bit RSA:

https://security.stackexchange.com/questions/87345/how-many-...

If we get a 100-qubit in 2018 and then double-up the qubits every 2 years, then we'll have a 6400 qubits quantum computer in 2032. Maybe it will happen sooner, maybe it will happen 10 years later than predicted (although many seem to be predicting a large enough to be useful universal quantum computer within 10 years), but either way we don't have much time left to figure this out.

So I guess my point is - don't give DJB the opportunity to create yet another "monoculture" by allowing him to stand alone in paving the road for PQ crypto. Because if he does, and 15 years from now we end up adopting his PQ crypto, too, then you can't come complaining again about using "too much DJB crypto".


Post quantum crypto has been around for a while. Lattice based crypto was introduced in the late 90s.



