I've often thought that the use of prime numbers is a weakness in cryptography.
Whilst theory differs from practice due to machine limitations, there are only so many prime numbers a machine can generate in a limited timespan, which restricts the range of primes available for use.
With this in mind: you can often work out which encryption algorithms a webserver will use simply by browsing the site with different algorithms disabled in the browser. And if the underlying webserver software and version can't be identified that way, there are a variety of other methods, like a telltale 404 message, decoding the URL, or using DPI. That further narrows down which encryption algorithms are worth spending time reversing when using the replay-attack method, making it somewhat more targeted. It's still a sort of brute force, but a more targeted brute force.
So should I see primes as a weakness in cryptography?
First of all, primes are only used to arrive at a session key, and once you have a session key you're in the land of symmetric algorithms, which provide security by permutations rather than vectoring into prime spaces. The content of a web page does not matter at all in terms of the security being provided. A 404 is just as secure as a valid home page, in terms of cryptography. (Not in terms of application security, but that's a whole different thing.)
Second, the supply of prime numbers is countable but also infinite, and the relation between a number space and the number of primes within it is well established within the workable sizes. We have upper and lower bounds on the number of primes within certain ranges. This is partly why we end up with certain key sizes being secure and other key sizes being insufficient. Secure key sizes (in asymmetric algorithms) are secure partly because there are so many primes available as fodder for key generation.
Not very easy to guess just because the space in question is limited to "1024-bit primes"!
And indeed, that's part of why we use primes that are this big when we still use RSA. They're big enough that the number of possibilities makes them not easy to guess (even given the extra hint of "the secret number you're looking for is one of the factors of this 2048-bit semiprime").
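The role those primes play is easiest to see in miniature. A toy sketch with tiny, purely illustrative primes (real keys use primes of roughly 1024 bits each; nothing here is a real parameter choice):

```python
# Toy RSA with tiny, purely illustrative primes; real keys use ~1024-bit primes.
p, q = 61, 53
n = p * q                 # public modulus, a semiprime: easy to build, hard to factor at scale
phi = (p - 1) * (q - 1)   # Euler's totient of n
e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)           # encrypt with the public key
assert pow(cipher, d, n) == msg   # decrypt with the private key recovers the message
```

At this size, factoring `n = 3233` back into `61 * 53` is trivial; at 2048 bits, that factoring step is the whole security argument.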
Exponentials are always hard for human intuition. We start with some kind of pattern and it feels like there just aren't that many numbers that would satisfy it. But when you get out to large numbers (even numbers as large as the one above), there just are that many numbers that would satisfy the pattern!
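To put a rough number on "there just are that many": by the prime number theorem, the count of primes below x is approximately x / ln(x). A back-of-the-envelope stdlib estimate of how many 1024-bit primes exist (an approximation, not an exact count):

```python
import math

# Rough count of 1024-bit primes via the prime number theorem: pi(x) ~ x / ln(x).
bits = 1024
# Integer division avoids float overflow on numbers this large.
below_top = (1 << bits) // round(bits * math.log(2))                  # ~ pi(2^1024)
below_bottom = (1 << (bits - 1)) // round((bits - 1) * math.log(2))   # ~ pi(2^1023)
in_range = below_top - below_bottom
# The estimate itself is over a thousand bits long: roughly 2^1013 primes to choose from.
print(in_range.bit_length())
```

Guessing one particular prime out of roughly 2^1013 candidates is not a meaningfully "limited" search space.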
So in a replay attack all the communication is recorded, including the session keys. And whilst primes are infinite (I don't dispute this), where the theory (human) and the practice (CPUs) differ is in the limitations of the machine hardware, in much the same way Spectre and Meltdown exploited the hardware to obtain secrets.
The point about the webserver was using methods to work out what the underlying webserver and version were, and potentially the OS. After all, there's not much point trying to decrypt an algorithm that is not used on the webserver, and many webservers give out metadata that suggests what type of webserver they are.
And then there is education: people believe many things because that is what was taught to them. We see this cognitive dissonance in religious people, but also in "educated" people like some GPs or academics, because it is their life's work. I'm sure cryptography has consumed many hours of people thinking things through, without much time spent on other areas of weakness, like the limitations of a CPU and memory registers.
After all how many crypto experts are fully conversant with the inner workings of a CPU design? If they were conversant with CPU designs would Spectre and Meltdown have been found years earlier?
Admittedly those are side-channel attacks, but I did say in my original post "Whilst theory is different in practice due to machine limitations", and that is the thing: is the current implementation of cryptography on devices suitable, considering the design of CPUs and devices?
Cryptanalysis is a pretty robust area of research. There are in fact a lot of cryptanalysts that know CPUs quite deeply. Certain organizations spend a great deal of money to make it so. If vulnerabilities are found we learn from them and advance the understanding of cryptography design. It has been that way for 76 years now, at least.
So using the basic definition of quantum computing as "a device which can compute all permutations at once", in theory yes, it is a 0x73 0x68 0x69 0x74 0x20 0x73 0x74 0x6f 0x72 0x6d, but that's because these algorithms do not encompass a time and a place.
Build time into an algo and then you need a quantum computer which can also roll back or forward time.
You see the problem is, how do you build time into an algorithm like a particle decay? And if you could, could you then make it like a time delayed bank vault which only opens for certain windows during the course of the particle decay?
I also wonder if a place, some co-ordinates, could also be built into an algorithm, if that is even possible. There are medieval examples, namely trusted couriers, but then you are not looking so much at an algorithm as at a self-contained device to defeat quantum computing.
Anyway, for now you've got the NSA data centre at Bluffdale, Utah, which can sort of replicate quantum computing by running multiple instances, each trying one password from all the possible password combinations and looking for the magic bytes in a file (https://en.wikipedia.org/wiki/List_of_file_signatures) in order to decrypt files. A variation of that could be used for replay attacks on network communication.
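The "magic bytes" check described above is simple to sketch: a candidate decryption is declared plausible if it starts with a known file signature. A minimal sketch using a tiny subset of the signatures from the linked Wikipedia list:

```python
# Sketch of the "magic bytes" check: does a candidate decryption start with
# a known file signature? Only a tiny subset of signatures is shown here.
SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "PNG",
    b"%PDF": "PDF",
    b"PK\x03\x04": "ZIP",
    b"\x1f\x8b": "GZIP",
}

def identify(candidate):
    """Return the matching file type name, or None if no signature matches."""
    for magic, name in SIGNATURES.items():
        if candidate.startswith(magic):
            return name
    return None
```

A brute-force loop would call `identify()` on each trial decryption and keep only the candidates that match some signature.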
At least thats how I would do it quickly and easily.
That’s not how quantum computers work, but you’re on the right track. And yes, time is almost always an essential part of crypto algorithms (usually proving the order of things, e.g. when using nonces, or temporary keys to block re-generation as with ephemeral keys.)
Not an expert, but I don't think it's necessarily a weakness in the current applications of cryptography. In something like RSA a typical key is >= 1024 bits, which is about 1.7e308 possible numbers, so even though primes are more "limited", the primes that are left are still plentiful (even compared with the hypothetical where non-primes could be used by magic, the actual reduction in security is small). One reason RSA is not really used anymore is that other algorithms, such as elliptic curve cryptography, provide much better efficiency (partially because they don't require prime keys). So it's less efficient than more modern technologies, but not necessarily weaker.
RSA is still in wide use in both old and new deployments (e.g. it is still the majority of TLS handshakes).
Elliptic curves are faster and use smaller keys, but they're also a more fragmented ecosystem; plus, in particular for the older NIST curves, there is the unshakable fear that they're backdoored by the NSA (since they use magic unexplainable numbers, which RSA does not).
Thankfully ed25519 gave us an alternative free of magic numbers, and it's seeing a lot of adoption (in particular in open source software), but it's nowhere near taking over RSA.
No cryptography engineer seriously believes the NIST P-curves are backdoored, and they are in widespread use. Ed25519 is a signing scheme; it isn't a replacement for the RSA in classic TLS --- you'd be thinking of Curve25519, its sibling. The benefit of the 25519s isn't "no magic numbers", it's a structure that makes it easy to implement relatively safely. And all these curves work over prime subfields.
This is all 100% pedantry. But the belief that RSA is risky because "prime numbers" is false, and worth pushing back on. There are reasons not to use RSA, but they're not as simple as "we don't trust prime field cryptography".
No magic numbers is certainly one of the advantages of curve25519 and its siblings. The NSA already gave us one backdoored elliptic curve algorithm (Dual EC DRBG); there is no reason to trust them with magic numbers. They may be backdoored or they may not be, but every serious cryptography engineer knows there's no good reason for algorithm constants not to be generated according to public criteria if you aren't hiding anything. Sometimes they're hiding that they picked numbers that made the algorithm stronger against secret attacks they discovered (DES). Sometimes they're hiding a backdoor (Dual EC DRBG). We'd all rather they not hide anything.
Ed25519 is a replacement for RSA in the x.509 WebPKI, which is what I was trying to refer to when I said TLS. Classic TLS (as in non-PFS, the one that also used RSA to encrypt the session secret) is dead and nobody cares about replacing it with anything. There is no public key encryption involved in modern TLS; instead all you need is a signature scheme (for the certificate and for the final server to authenticate itself) and a key exchange scheme. The former can be ed25519. The latter can be curve25519 (specifically, the retroactively named X25519 ECDH key exchange).
My point is precisely that there's no inherent distrust in RSA (and some concerns with NIST EC, both the magic numbers and secure implementation difficulty), which is why we haven't abandoned it yet. There is certainly no inherent issue with prime field cryptography.
Curves in the Web PKI are overwhelmingly NIST P-curves, which, again, are only deeply mistrusted on message boards, and when needed to get the BADA55 paper accepted.
New designs shouldn't use the P-curves, because it's too easy to implement them vulnerably (for all intents and purposes any random string is a workable Curve25519 point, and that's not the case for the P-curves --- you have to do fussy input validation). But that has nothing to do with conspiracy theories about how the curve was generated.
You don't have to take my word for this; you can just read the Koblitz and Menezes paper, which takes this question on in detail.
That the NSA picked the magic DES S-boxes in order to defend against differential cryptanalysis (which they'd discovered first), and that the NSA picked the Dual EC DRBG constants to backdoor it in a NOBUS type operation are both established facts.
Yes, curves in the Web PKI are largely P curves right now, but ed25519 is also standardized.
You're certainly free to trust the NIST curves, but my point still stands: there is no good reason not to pick such constants using a nothing up my sleeve algorithm, and the fact they didn't do that means they either know something we don't, they backdoored it, they wanted people to suspect they backdoored it, or they're idiots and decided to ignore established best practice for no reason. It's entirely possible the answer is the latter, of course :)
No, the logic in your last paragraph doesn't hold at all. You should read the Menezes paper rather than trying to derive this stuff from faulty axioms.
I wouldn't contest that no crypto engineer "seriously believes NIST P-curves are backdoored", but I know some high profile crypto engineers who seriously think and demonstrate how they might be flawed and could have been backdoored since day one. [1] [2]
It's almost impossible to prove they were backdoored, but considering the sensitivity of the subject, I understand why many consider this unknown a reason to distrust NIST P-curves.
Dan Bernstein wrote a paper everyone refers to as BADA55 which essentially suggests that virtually every curve in common use other than his are potentially backdoored, even if they're derived from fundamental mathematical constants (in fact, demonstrating that possibility is the point of the paper). So I'd be careful about using Bernstein as a load-bearing citation for this argument.
Again: Koblitz and Menezes take up this topic in detail.
Backdoored or not, the P-curves (or more specifically the standard algorithm we use for them) are hard to use and easy to misuse.
djb dedicated an entire page listing all the theoretical issues with the P-curves and other elliptic curves[1], but their main weakness in practice is that they are just too prone to bad implementation and misuse.
The most well-known failure has to be the PS3 jailbreak [2]. Sony just failed to implement their RNG (or alternatively copied their RNG code from xkcd #221), which rendered their ECDSA-based crypto completely worthless.
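That failure mode is easy to demonstrate with toy numbers. A sketch using plain DSA arithmetic (the ECDSA equations are the same, just over a curve group): reusing the per-signature secret k across two signatures lets anyone recover the private key from the signatures alone. All parameters here are tiny and purely illustrative.

```python
# Toy DSA demonstrating the PS3-style failure: reusing the per-signature
# secret k across two signatures leaks the private key.
# Tiny illustrative parameters, not a real group.
p, q = 607, 101               # primes with q dividing p - 1
g = pow(2, (p - 1) // q, p)   # element of order q

x = 57                        # private key
y = pow(g, x, p)              # public key

def sign(h, k):
    r = pow(g, k, p) % q
    s = pow(k, -1, q) * (h + x * r) % q
    return r, s

k = 13                        # the SAME k used twice -- the bug
h1, h2 = 20, 34               # two message hashes
r, s1 = sign(h1, k)
_, s2 = sign(h2, k)

# Attacker's recovery, using only public values (h1, h2, r, s1, s2):
# s1 - s2 = k^-1 * (h1 - h2)  =>  k = (h1 - h2) / (s1 - s2)
k_rec = (h1 - h2) * pow(s1 - s2, -1, q) % q
# s = k^-1 * (h + x*r)  =>  x = (s*k - h) / r
x_rec = (s1 * k_rec - h1) * pow(r, -1, q) % q
assert (k_rec, x_rec) == (k, x)
```

Sony's implementation effectively used a fixed k for every signature, so this exact algebra handed out their signing key.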
Another famous case is the long list of JWT/JWE libraries which were vulnerable to invalid curve attacks, again completely destroying the security of their NIST p-curves (when used for encryption) [3].
Really, I don't think anybody should be using NIST P-curves if they have any choice, unless you have verified your implementation yourself. And I don't even want to claim to be able to do that.
(I don't think tptacek ever said you should use the NIST curves[4], so there's no controversy there)
By cryptography being deployed today I meant new protocols. Like if the people who are actually in the position of picking cryptographic primitives, virtually no one reaches for RSA. Sorry if that wasn’t clear.
To be clear, there is no rational reason not to reach for RSA unless you need the smaller keys of elliptic curves, or you are afraid of quantum computers, in which case you have to avoid elliptic curves as well.
There is a massive performance difference—not just size, but computation as well. Especially if you want a constant-time implementation. There is also an order of magnitude more ways to screw up and not do input validation correctly, thereby introducing vulnerabilities. For all these reasons there aren’t really good libraries for RSA outside of TLS implementations, and that’s a lot of baggage to inherit if you are not doing TLS.
Your own link [2] shows RSA being 10-50x slower than the EC algo they used, which itself is many factors slower than state of the art. Are you reading the table right?
Table 3 (decryption) from link [2] shows that it took 1.265 seconds for 233 bit ECC and 0.031 seconds for 2240 bit RSA to encrypt something. The associated comment was:
>We noted that the encryption by the RSA algorithm is faster than the one with ECC algorithm.
While we are here, I will point out that the performance was almost the same for the same key lengths in the encryption case.
So ECC seems to be faster for key generation but overall slower for everything else.
Those numbers are clearly nonsense; notice how the smaller ones don't even scale with key size, while they obviously should to some extent. That paper is flawed. They obviously made a mistake.
71000 signature verifications per second and 109000 signature generations per second on a quad-core machine. That is much, much faster than RSA.
It's well established that ECC is much faster than RSA in general. I suspect I know what happened. DSA signature schemes require randomness to generate signatures, unlike RSA. They probably were using an entropy-constrained random number generator. That bottlenecked ECDSA through no fault of the algorithm. Ed25519 does not suffer from this issue, since it is constructed in a way that requires no external source of randomness (the random part is substituted with a deterministic hash of the key and input).
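The derandomization idea can be sketched in a few lines. This mirrors the spirit of Ed25519's nonce derivation (and RFC 6979 for ECDSA), not the exact construction:

```python
import hashlib

# Sketch of deterministic nonce derivation, in the spirit of Ed25519 / RFC 6979
# (not the exact construction): the per-signature value is a hash of the secret
# key and the message, so no external RNG is needed, and the same (key, message)
# pair can never produce two different nonces that would leak the key.
def deterministic_nonce(secret_key, message, group_order):
    digest = hashlib.sha512(secret_key + message).digest()
    return int.from_bytes(digest, "big") % group_order
```

Since the nonce depends on the message, signing two different messages yields two different nonces, while an entropy-starved system can never cause a repeat.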
If you want to prove that ed25519 is generally faster than RSA for signatures I think you would have to find a benchmark that supports your contention.
That one has ed25519 signing 60x faster than RSA-2048. Verification is a bit slower with ed25519 (RSA is very asymmetric there due to the low hamming weight public exponents), but considering the difference in security level, it basically works out to about the same.
So signature verification is about the same for EC vs RSA, while EC is much faster than RSA for signature generation, and ridiculously faster for key generation. EC certainly isn't significantly slower than RSA for any equivalent operation, and definitely not by 40x like the paper you linked claims. That is just wrong.
It is fairly well accepted that RSA is faster than most curve based things for signature verification and your sources support that?
This is all in response to another user who said:
> There is a massive performance difference—not just size, but computation as well.
... and we don't know exactly what they meant by that, but I was pointing out that it was not that simple...
Asymmetrical performance is normally not very important in most systems due to the use of symmetric stuff to do all the heavy lifting. The exception would be if you were doing some sort of forward secrecy scheme that involved forgetting the private key for each and every message. There the speed of ECDHE is helpful vs the slow key generation of RSA. However, the Signal Protocol shows us we can use a hash ratchet for such short cycle forward secrecy schemes so that key generation speed does not have to be a significant issue.
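The hash ratchet mentioned above needs nothing but a hash function. A minimal sketch (the domain-separation bytes and 32-byte keys are arbitrary illustrative choices, not Signal's actual KDF):

```python
import hashlib

# Sketch of a hash ratchet in the spirit of the Signal Protocol: each step
# derives the next chain key and a one-time message key from the current
# chain key, which is then discarded. Compromising today's chain key does
# not reveal earlier message keys. Domain-separation bytes are arbitrary.
def ratchet_step(chain_key):
    next_chain_key = hashlib.sha256(chain_key + b"\x01").digest()
    message_key = hashlib.sha256(chain_key + b"\x02").digest()
    return next_chain_key, message_key
```

Each step is one hash call, so advancing the ratchet per message costs essentially nothing compared with generating a fresh RSA key pair.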
This is comically false. But for the weird appeal to quantum computers, it's exactly what a Slashdot commenter would have said about elliptic curves in 1998.