
Almost no cryptography being deployed today uses prime numbers.

(Not because of any weakness, but because there are better, more efficient algorithms instead.)




RSA is still in wide use in both old and new deployments (e.g. it is still the majority of TLS handshakes).

Elliptic curves are faster and use smaller keys, but they're also a more fragmented ecosystem; plus, in particular for the older NIST curves, there is the unshakable fear that they're backdoored by the NSA (since they use unexplained magic numbers, which RSA does not).

Thankfully ed25519 gave us an alternative free of magic numbers, and it's seeing a lot of adoption (in particular in open source software), but it's nowhere near taking over RSA.


No cryptography engineer seriously believes the NIST P-curves are backdoored, and they are in widespread use. Ed25519 is a signing scheme; it isn't a replacement for the RSA in classic TLS --- you'd be thinking of Curve25519, its sibling. The benefit of the 25519s isn't "no magic numbers", it's a structure that makes it easy to implement relatively safely. And all these curves work over prime subfields.

This is all 100% pedantry. But the belief that RSA is risky because "prime numbers" is false, and worth pushing back on. There are reasons not to use RSA, but they're not as simple as "we don't trust prime field cryptography".


No magic numbers is certainly one of the advantages of curve25519 and its siblings. The NSA already gave us one backdoored elliptic curve algorithm (Dual EC DRBG); there is no reason to trust them with magic numbers. They may be backdoored or they may not be, but every serious cryptography engineer knows there's no good reason for algorithm constants not to be generated according to public criteria if you aren't hiding anything. Sometimes they're hiding that they picked numbers that made the algorithm stronger against secret attacks they discovered (DES). Sometimes they're hiding a backdoor (Dual EC DRBG). We'd all rather they not hide anything.

More info: https://safecurves.cr.yp.to/rigid.html

Ed25519 is a replacement for RSA in the X.509 WebPKI, which is what I was trying to refer to when I said TLS. Classic TLS (as in non-PFS, the one that also used RSA to encrypt the session secret) is dead and nobody cares about replacing it with anything. There is no public key encryption involved in modern TLS; instead all you need is a signature scheme (for the certificate and for the final server to authenticate itself) and a key exchange scheme. The former can be ed25519. The latter can be curve25519 (specifically, the retroactively named X25519 ECDH key exchange).
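
In code, that split looks roughly like this. A minimal sketch with Python's cryptography package (not actual TLS; the transcript bytes are a made-up placeholder):

    from cryptography.hazmat.primitives.asymmetric import ed25519, x25519

    # Key exchange half: both sides derive the same shared secret via X25519 ECDH.
    client_kex = x25519.X25519PrivateKey.generate()
    server_kex = x25519.X25519PrivateKey.generate()
    client_secret = client_kex.exchange(server_kex.public_key())
    server_secret = server_kex.exchange(client_kex.public_key())
    assert client_secret == server_secret  # both ends now share key material

    # Authentication half: the server signs the handshake transcript with Ed25519.
    server_sig_key = ed25519.Ed25519PrivateKey.generate()
    transcript = b"placeholder handshake transcript"
    signature = server_sig_key.sign(transcript)
    server_sig_key.public_key().verify(signature, transcript)  # raises if invalid

Note there is no "encrypt a secret to the server's RSA key" step anywhere in that flow.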

My point is precisely that there's no inherent distrust in RSA (and some concerns with NIST EC, both the magic numbers and secure implementation difficulty), which is why we haven't abandoned it yet. There is certainly no inherent issue with prime field cryptography.


Curves in the Web PKI are overwhelmingly NIST P-curves, which, again, are only deeply mistrusted on message boards, and when needed to get the BADA55 paper accepted.

New designs shouldn't use the P-curves, because it's too easy to implement them vulnerably (for all intents and purposes any random string is a workable Curve25519 point, and that's not the case for the P-curves --- you have to do fussy input validation). But that has nothing to do with conspiracy theories about how the curve was generated.
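
To make the fussy input validation concrete: here is a rough sketch of the on-curve check a P-256 implementation has to perform on every incoming point before using it (the constants are the standard NIST P-256 domain parameters; a real implementation additionally has to reject the point at infinity and deal with point encoding):

    # NIST P-256: y^2 = x^3 - 3x + b over GF(p)
    P = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
    A = P - 3
    B = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b

    def is_on_p256(x: int, y: int) -> bool:
        # Reject out-of-range coordinates, then check the curve equation.
        if not (0 <= x < P and 0 <= y < P):
            return False
        return (y * y - (x * x * x + A * x + B)) % P == 0

Skip that check and an attacker can feed you a point on a different, weaker curve (an invalid curve attack); with Curve25519, any 32-byte string is a workable input, so that class of bug disappears.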

You don't have to take my word for this; you can just read the Koblitz and Menezes paper, which takes this question on in detail.

So, no, really, none of this is true.


That the NSA picked the magic DES S-boxes in order to defend against differential cryptanalysis (which they'd discovered first), and that the NSA picked the Dual EC DRBG constants to backdoor it in a NOBUS-type operation, are both established facts.

Yes, curves in the Web PKI are largely P curves right now, but ed25519 is also standardized.

You're certainly free to trust the NIST curves, but my point still stands: there is no good reason not to pick such constants using a nothing-up-my-sleeve algorithm, and the fact that they didn't means they either know something we don't, they backdoored it, they wanted people to suspect they backdoored it, or they're idiots who decided to ignore established best practice for no reason. It's entirely possible the answer is the last one, of course :)


No, the logic in your last paragraph doesn't hold at all. You should read the Menezes paper rather than trying to derive this stuff from faulty axioms.


I wouldn't contest that no crypto engineer "seriously believes the NIST P-curves are backdoored", but I know some high-profile crypto engineers who seriously think about, and demonstrate, how they might be flawed and could have been backdoored since day one. [1] [2]

It's almost impossible to prove they were backdoored, but considering the sensitivity of the subject, I understand why many consider this unknown a reason to distrust NIST P-curves.

[1] https://cr.yp.to/talks/2013.05.31/slides-dan%2Btanja-2013053...

[2] https://cr.yp.to/newelliptic/nistecc-20160106.pdf


Dan Bernstein wrote a paper everyone refers to as BADA55 which essentially suggests that virtually every curve in common use other than his own is potentially backdoored, even if it's derived from fundamental mathematical constants (in fact, demonstrating that possibility is the point of the paper). So I'd be careful about using Bernstein as a load-bearing citation for this argument.

Again: Koblitz and Menezes take up this topic in detail.


Backdoored or not, the P-curves (or more specifically the standard algorithm we use with them) are hard to use and easy to misuse. djb dedicated an entire page to listing all the theoretical issues with the P-curves and other elliptic curves [1], but their main weakness in practice is that they are just too prone to bad implementation and misuse.

The most well-known failure has to be the PS3 jailbreak [2]. Sony reused the same "random" nonce for every signature (or alternatively copied their RNG code from xkcd #221), which rendered their ECDSA-based crypto completely worthless.
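
The math behind that break is simple enough to sketch. Given two ECDSA signatures (r, s1) and (r, s2) over hashes h1 and h2 that reuse a nonce k (visible because they share the same r), anyone can recover the private key d. The variable names below are the textbook ones, not from any real implementation:

    def recover_ecdsa_key(r, s1, s2, h1, h2, n):
        # Each signature satisfies s_i = (h_i + r*d) / k (mod n).
        # The same k and r in both equations lets us solve for k, then d.
        k = (h1 - h2) * pow(s1 - s2, -1, n) % n
        return (s1 * k - h1) * pow(r, -1, n) % n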

Another famous case is the long list of JWT/JWE libraries that were vulnerable to invalid curve attacks, again completely destroying the security of their NIST P-curves (when used for encryption) [3].

Really, I don't think anybody should be using NIST P-curves if they have any choice, unless you've verified your implementation yourself. And I don't even want to claim to be able to do that.

(I don't think tptacek ever said you should use the NIST curves[4], so there's no controversy there)

[1] https://safecurves.cr.yp.to/

[2] https://www.youtube.com/watch?v=LP1t_pzxKyE

[3] https://auth0.com/blog/critical-vulnerability-in-json-web-en...

[4] https://latacora.micro.blog/2018/04/03/cryptographic-right-a...


By cryptography being deployed today I meant new protocols. Among the people who are actually in a position to pick cryptographic primitives, virtually no one reaches for RSA. Sorry if that wasn’t clear.


To be clear, there is no rational reason not to reach for RSA unless you need the smaller keys of elliptic curves, or you are afraid of quantum cryptanalysis and have to avoid elliptic curves as well.


There is a massive performance difference—not just size, but computation as well, especially if you want a constant-time implementation. There are also an order of magnitude more ways to screw up input validation and introduce vulnerabilities. For all these reasons there aren't really good RSA libraries outside of TLS implementations, and that's a lot of baggage to inherit if you are not doing TLS.


RSA is faster for signature generation. RSA is faster by more than an order of magnitude for signature verification[1] and encryption[2].

All these libraries listed here are bad in some way for RSA?

* https://en.wikipedia.org/wiki/Comparison_of_cryptography_lib...

[1] https://www.ijser.org/researchpaper/Performance-Based-Compar...

[2] https://hal.archives-ouvertes.fr/hal-02926106/document


Your own link [2] shows RSA being 10-50x slower than the EC algo they used, which itself is many factors slower than state of the art. Are you reading the table right?


Table 3 (decryption) from link [2] shows that it took 1.265 seconds for 233-bit ECC and 0.031 seconds for 2240-bit RSA to decrypt something. The associated comment was:

>We noted that the encryption by the RSA algorithm is faster than the one with ECC algorithm.

While we are here, I will point out that the performance was almost the same for the same key lengths in the encryption case.

So ECC seems to be faster for key generation but overall slower for everything else.


Those numbers are clearly nonsense; notice how the smaller ones don't even scale with key size, as they obviously should to some extent. That paper is flawed; they made a mistake somewhere.

Look at the ed25519 benchmarks:

https://ed25519.cr.yp.to/

71000 signature verifications per second and 109000 signature generations per second on a quad-core machine. That is much, much faster than RSA.

It's well established that ECC is much faster than RSA in general. I suspect I know what happened. DSA signature schemes require randomness to generate signatures, unlike RSA. They probably were using an entropy-constrained random number generator. That bottlenecked ECDSA through no fault of the algorithm. Ed25519 does not suffer from this issue, since it is constructed in a way that requires no external source of randomness (the random part is substituted with a deterministic hash of the key and input).
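
That determinism is easy to see directly. A small sketch with Python's cryptography package: signing the same message twice yields byte-identical signatures, so no per-signature entropy is consumed at all.

    from cryptography.hazmat.primitives.asymmetric import ed25519

    key = ed25519.Ed25519PrivateKey.generate()
    # The nonce is derived by hashing the key and the message,
    # so repeated signing needs no fresh randomness.
    assert key.sign(b"same message") == key.sign(b"same message")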


If you want to prove that ed25519 is generally faster than RSA for signatures I think you would have to find a benchmark that supports your contention.


https://blog.cloudflare.com/ecdsa-the-digital-signature-algo...

  RSA 2048 bits: 1001 signatures per second
  ECDSA P-256: 9516 signatures per second

Note that 256-bit ECC provides a higher security level than 2048-bit RSA, so this comparison is biased in RSA's favor.

https://connect2id.com/blog/nimbus-jose-jwt-6

That one has ed25519 signing 60x faster than RSA-2048. Verification is a bit slower with ed25519 (RSA is very asymmetric there due to its low-Hamming-weight public exponents), but considering the difference in security level, it basically works out to about the same.

So signature verification is about the same for EC vs RSA, while EC is much faster than RSA for signature generation, and ridiculously faster for key generation. EC certainly isn't significantly slower than RSA for any equivalent operation, and definitely not by 40x like the paper you linked claims. That is just wrong.
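
For anyone who would rather measure than argue about papers, a crude harness like this (sketched with Python's cryptography package; the absolute numbers depend entirely on your machine, so treat it as an illustration, not a rigorous benchmark) shows the signing gap immediately:

    import time

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ed25519, padding, rsa

    def ops_per_sec(op, seconds=1.0):
        # Run op in a loop for roughly `seconds` and report the rate.
        count, start = 0, time.perf_counter()
        while time.perf_counter() - start < seconds:
            op()
            count += 1
        return count / (time.perf_counter() - start)

    msg = b"benchmark message"
    rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    ed_key = ed25519.Ed25519PrivateKey.generate()

    print("RSA-2048 signs/sec:", ops_per_sec(
        lambda: rsa_key.sign(msg, padding.PKCS1v15(), hashes.SHA256())))
    print("Ed25519 signs/sec:", ops_per_sec(lambda: ed_key.sign(msg)))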


It is fairly well accepted that RSA is faster than most curve-based schemes for signature verification, and your sources support that?

This is all in response to another user who said:

> There is a massive performance difference—not just size, but computation as well.

... and we don't know exactly what they meant by that, but I was pointing out that it was not that simple...

Asymmetric-crypto performance is normally not very important in most systems, since the symmetric primitives do all the heavy lifting. The exception would be a forward secrecy scheme that involves forgetting the private key for each and every message. There the speed of ECDHE is helpful versus the slow key generation of RSA. However, the Signal Protocol shows us we can use a hash ratchet for such short-cycle forward secrecy schemes, so key generation speed does not have to be a significant issue.
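
A toy sketch of the hash ratchet idea (the domain-separation labels here are made up for illustration, not Signal's actual KDF inputs):

    import hashlib

    def ratchet_step(chain_key):
        # Derive this message's key and the next chain key from the
        # current chain key. Once the old chain key is erased, earlier
        # message keys cannot be recomputed: forward secrecy without
        # generating a fresh asymmetric keypair per message.
        message_key = hashlib.sha256(chain_key + b"message").digest()
        next_chain_key = hashlib.sha256(chain_key + b"chain").digest()
        return message_key, next_chain_key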


I wouldn’t waste your time. He appears to be trolling.


This is comically false. But for the weird appeal to quantum computers, it's exactly what a Slashdot commenter would have said about elliptic curves in 1998.



