2048 Bit RSA and the Year 2030 (59.ca)
239 points by upofadown on July 10, 2023 | 149 comments



We can expect a quantum computer with 20 million noisy qubits to break RSA 2048 [1]

I can't speak to coherence time or circuit depth concerns, but qubit counts are doubling roughly every year. Current chips have thousands of qubits, so the exponential scaling implies we'd have 20 million qubits by 2035-2040.

edit: And from the paper, the required quantum volume ("megaqubitdays") scales between O(n^3) and O(n^4) with RSA key length. So a few years after breaking RSA 2048, you'd have a computer five times larger that could break RSA 3072.

[1] https://quantum-journal.org/papers/q-2021-04-15-433/
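
A rough back-of-envelope sketch of that extrapolation; the ~1,000 qubits today and the clean yearly doubling are assumptions, not measurements:

    import math

    # Assumptions, not measurements: ~1,000 physical qubits today,
    # a clean doubling every year, and the 20 million noisy qubits from [1].
    current_qubits = 1_000
    target_qubits = 20_000_000
    doubling_time_years = 1.0

    years_needed = doubling_time_years * math.log2(target_qubits / current_qubits)
    print(f"~{years_needed:.1f} years of doubling, i.e. around {2023 + round(years_needed)}")
    # ~14.3 years, i.e. around 2037 -- inside the 2035-2040 window above.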


Google's "quantum supremacy" paper was about a 53-qubit chip in 2019 [1]. This year, they reported having 70 qubits [2]. Looking at the papers (and supplements), the gate fidelities and qubit lifetimes have stayed roughly the same. This really doesn't look anything like Moore's law to me.

[1] https://doi.org/10.1038/s41586-019-1666-5

[2] https://doi.org/10.48550/arXiv.2304.11119
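
For scale, a quick sketch of the growth rate implied by just those two data points (purely illustrative):

    import math

    # Two public data points: 53 qubits (2019), 70 qubits (2023).
    growth_per_year = (70 / 53) ** (1 / (2023 - 2019))
    doubling_time_years = math.log(2) / math.log(growth_per_year)
    print(f"{(growth_per_year - 1) * 100:.0f}% per year, doubling every ~{doubling_time_years:.0f} years")
    # ~7% per year, doubling roughly every 10 years -- nothing like Moore's law.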


Last time I took around an hour to do a review of the literature, I discovered that the size of those computers was growing at a linear rate of around 4 qubits/year.

It got slightly faster in the last few years; I don't know if that's inherent or just random fluctuation. Yet people keep repeating the claim that the growth is exponential, based on no evidence at all.


For example, is this developer roadmap far from the truth? https://newsroom.ibm.com/2022-11-09-IBM-Unveils-400-Qubit-Pl...

While it is too early to call it exponential, it is wildly faster than 4 qubits/year. Same can be said about other hardware systems too (trapped ions, neutral atoms, and with some caveats photonics and color centers too).


This is the kind of article I threw away for being unsubstantiated.

Besides, it has 2 "real" datapoints, and both are completely different from the sizes everybody else was achieving in public, well-reviewed computers.

If you restrict your data to devices other people were allowed to touch, a lot of those huge numbers just disappear.


People in academia have access to the 127-qubit computers; I'm not sure about the 400+ option. It's more that there isn't much you can do with that many qubits this noisy (they aren't particularly bad, just not good enough) and with such sparse connectivity.


I would not believe anything I heard from IBM unless I also heard it from someone else


If we hadn't invented photolithography, we'd still be using computers the size of buildings, limited to universities and military installations.

Right now no one knows how to build a scalable quantum computer. But as soon as we figure out the equivalent of lithography for building quantum chips, the progress will come, and it will come quickly and suddenly.


How do you know there is an equivalent process for quantum chips?


There are some equivalents already in existence (spin qubits in CMOS processes), that use the exact same lithography techniques. There are a few other potential alternatives too. All of them suffer from harder engineering challenges than the much "easier" to produce (in small volumes) transmons and trapped ions/atoms/molecules. Thus a plausible future is one in which the first somewhat-useful quantum computers are tens of thousands of qubits in transmon/ion hybrid systems. Then (in that plausible future) the spin qubits in CMOS do catch up, and thanks to their superior manufacturing scalability, they blow past the capabilities of transmons/ions.

Not too different from how we had to use vacuum tubes while we figured out how solid-state systems could work... or spinning hard drives before we figured out SSDs.

Or maybe none of this would work out and the whole field would be a bust... but you know Clarke's laws https://en.wikipedia.org/wiki/Clarke%27s_three_laws


I don't. All I'm saying is that progress can be linear/non-existing until a breakthrough is made that allows scaling. For microchips this was lithography. After such a breakthrough the scaling can (temporarily) be exponential.


The largest number quantum computers have factored is 21 -- and that was by cheating, already knowing the factors are 3 and 7. There are other results that use special-form composites, which don't count.

So a QC can factor a 5 bit number with Shor's algorithm in 2023 (with some cheating). That record has not changed for 10+ years.

I publicly bet 8 years ago that nobody would factor the number 35 by 2030. I hope I'm proved wrong.


More detail on progress towards quantum factoring of RSA (among other things):

https://sam-jaques.appspot.com/quantum_landscape

I'm not super concerned. Cynically: one big driver of research into PQ cryptography is that it's a full employment program for academic cryptographers. Not that there's anything wrong with that! I like a Richelot isogeny as much as the next guy, or I would if I understood what they were.


At least as of 2020, it looked like both qubit counts and two-qubit gate quality were improving exponentially, and our naive extrapolation said RSA 2048 wouldn't get cracked by 2039 at 95% confidence.

https://arxiv.org/abs/2009.05045

As far as I can tell, this website suggests that two-qubit gate infidelity has continued to improve exponentially, although it's hard to tell if these are reliable datapoints.

https://metriq.info/Task/38

Because there's typically an engineering tension between qubit count and gate quality, what you want to track is something like quantum volume, which looks to be on trend

https://metriq.info/Task/34

but it's notable that Google achieved quantum supremacy without having amazing quantum volume numbers, so it's not a perfect metric

https://spectrum.ieee.org/quantum-computing-google-sycamore

Worth noting that once you cross the fault tolerant threshold you will probably see a big shift in how engineering effort is distributed: many researchers think it will be harder to improve gate quality than to just increase qubit counts to make up for it, so you may see gate quality stall and extrapolation become even less reliable.


> once you cross the fault tolerant threshold

If we cross the threshold. Until then, adding a qubit requires halving the noise floor, so the Moore's law equivalent to exponential scaling is basically adding a fixed number of qubits per year.


Not sure what you're saying.

Are you suggesting that gates surpassing the fault tolerant threshold will never be achieved? The vast majority of experts disagree with this.

Without fault tolerance, you have a hard wall on circuit depth because errors grow exponentially with depth. You can't make up for this by adding any number of qubits. "Halving the noise floor" means...what, improving gate fidelity?


That's what I'm saying. I don't really mean never, but I don't see why we have good reason to believe we'll reach that threshold this millennium.

Until then, the best we can do is: if you halve the noise floor (or double the fidelity) you can add roughly one ideal qubit.


You can't double the fidelity because it's already much larger than 1/2, and if you mean halving the infidelity then this doubles the depth of the deepest circuit you can perform (not just increasing it by one layer). And none of that leads to an ideal qubit.

The most obvious reason to suppose the fault tolerance threshold will be crossed is that simple extrapolation of progress predicts it will.
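
A toy model of that depth claim, assuming each circuit layer fails independently with probability p (the per-layer infidelity) and demanding at least 50% overall success; both numbers are placeholder assumptions, not hardware figures:

    import math

    def max_depth(per_layer_infidelity, target_success=0.5):
        """Deepest circuit (in layers) that still succeeds with probability
        >= target_success if each layer fails independently."""
        return math.floor(math.log(target_success) / math.log(1 - per_layer_infidelity))

    for p in (0.01, 0.005, 0.0025):
        print(p, max_depth(p))
    # 0.01 -> 68 layers, 0.005 -> 138, 0.0025 -> 276:
    # halving the infidelity roughly doubles the reachable depth, as claimed above.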


The vast majority of experts are being employed to say this. There is no demonstrable evidence of physical computation to the contrary.


Tenured professors who have been working in the field for decades before you ever heard about it in the news, and who have been consistently deeply critical of quantum computing hype despite negative professional and financial penalties from doing so, nonetheless still agree overwhelmingly that it will happen. You can't just believe the opposite of what swindlers are arguing and expect that to be right.


On one hand I can make a joke that you are defending the only job on Earth that not only gets paid but can't get fired past a certain point due to their 'tenure.'

On the other hand I can make a nod in the direction of the decades of expertise any one can build in a chamber of vapor ware with the hope that it will distill into something useful.

As critical as I am, I of course congratulate anyone taking on such difficult innovation. Nonetheless, there is no physical evidence it will amount to anything thus far. It is purely hypothetical, and the only theoretical justification is a fictitious proof of its usefulness via mathematical modeling, which is not physical reality. That's closer to building a virtualized computer inside an MMO that mimics Earth, claiming you get to play as a Martian using some strange technology, while it all runs on classical computing.


> On one hand I can make a joke that you are defending the only job on Earth that not only gets paid but can't get fired past a certain point due to their 'tenure.'

You could, but the premise is wildly false; tenured professors can get fired for cause (and tend to have some process protections alongside that), and tenured professors aren’t the only workers with contracts or legal protection that restrict firing to “for cause” (pretty much all high level employees with individual contracts like—but not limited to—executives have that, maybe with a high-priced buyout option, though without or with limited process protections), and most unionized workers and most (even if not unionized) public workers have both limitations to firing (or other adverse actions) for-cause and strong due process protections.


So people who have financial interests can't be trusted, but also people who don't have financial interest can't be trusted.

All new tech starts as purely hypothetical. (Not to mention concrete milestones like quantum supremacy, gate infidelity trends, etc.)


Yeah, that's why most startups fail. Rarely do startups exist for over 40 years with no physical proof of the idea working, just a simulation of it that fundamentally undermines its reality, and even more rarely are there people who don't work at those startups defending them as if compelled by some strange force.


Quantum computing is not a startup, it's a technology. There are many technologies that took much longer than 40 years from initial conception to first useful application.


I bet they had a physical example along their way in that 40 years.


Nope. The analogy with the development of quantum sensing is actually very close: the idea of using squeezed or superposed measuring probes to increase measurement sensitivity goes all the way back to Braginsky in 1967. It took two or three decades for "demonstrations" of this effect to be realized in the lab. However, people correctly pointed out that in every case where these early demonstrations were done, it would have been much easier to carry out the same measurement using non-quantum techniques by simply scaling up the resources (usually, by increasing laser power). So these demonstrations were never actually useful for anything practical.

(Likewise, the earliest "demonstrations" of quantum computing where they factored two-digit numbers were ridiculous for several reasons, not least of which was that you could do it in a microsecond using a classical computer. Then later, when quantum supremacy calculations were done by Google, the mathematical problem chosen was completely useless for any practical application.)

It was not until 2013, 46 years after Braginsky theorized the basic principles, that a squeezed probe was used for something other than a proof-of-concept: measuring gravitational waves at LIGO. This enabled LIGO to detect ultra weak gravitational waves that would otherwise be invisible to it: https://www.nature.com/articles/nphoton.2013.177

Needless to say, it will take even longer before these techniques are used for economically relevant applications.


I understand your point, jessriedel, but my contention remains that there's a marked distinction between quantum sensing and quantum computing. As we're discussing, quantum sensing has yielded tangible, applicable results, which is the physical manifestation I was alluding to earlier.

In terms of computing, we have seen many new paradigms materialize within decades, even centuries, of their conception. Babbage conceived of the Analytical Engine in the 1800s, but we saw programmable computers by the mid-1900s. The transition from electromechanical to electronic computing occurred within a few decades. Every instance of these examples are real physical examples that were able to be used and demonstrated as a physical device providing utility. We can even go back to the early computation devices of the loom or even analog computing calendar/clock systems.

Quantum computing is an ambitious concept, and while I respect the academic rigor that goes into its development, the lack of concrete, physical outcomes, even in a rudimentary sense, after four decades is notable. Theoretical advancements are important, but the inability to materialize them in some substantial form, especially when juxtaposed against the timeline of prior computing paradigms, can warrant skepticism.

Don't get me wrong, I value innovation and the pursuit of new technological frontiers, but I also believe in questioning, and when the physical evidence is wanting, I'll voice my concerns. This is not to discredit the work being done but rather to keep the discussion grounded and accountable.


Real physical quantum computers exist, they just can't do anything useful yet, and in particular they haven't achieved fault tolerance. Quantum sensing took 46 years before there was an application. The idea of quantum computers has only been around for 38 years (Deutsch's paper was 1985). They seem highly analogous to me and I don't see what distinction you're trying to point to.


I see where you're coming from, jessriedel, but I must respectfully disagree with the assertion that "real physical quantum computers exist." In the common parlance, a "computer" is a device that can perform a set of operations or computations independently following programmed instructions.

For a "real physical quantum computer" to exist by this definition, it should be able to carry out such operations using quantum phenomena, without any reliance on classical computing architecture for function, error correction, or result verification.

What we currently have in the field of quantum computing doesn't fit this bill. The quantum systems we have today are not independent "computers" but rather more like "classical computers conducting quantum experiments." They're akin to people playing an MMO and running an imaginary computational architecture that only exists within the confines of the game rules. In this sense, they're creating and operating within an entirely simulated environment.

This isn't to diminish the value of the research being conducted or the strides that are being made. But I would argue that the statement "real physical quantum computers exist" is, at this stage, a significant overreach. We may have precursors or tools that can manipulate quantum phenomena in interesting ways, but we're still a considerable distance from having an operational, standalone quantum computer in the full sense of the term.


> For a "real physical quantum computer" to exist by this definition, it should be able to carry out such operations using quantum phenomena, without any reliance on classical computing architecture for function, error correction, or result verification.

Nope, wrong. Plans for quantum computers have always included assistance from classical computers. This is like saying for a nuclear weapon to exist, it must have no conventional explosives, but in fact all nuclear weapons contain conventional explosives to initiate the nuclear reactions.

You've made a number of incorrect claims here without admitting error, so I won't continue the conversation.


> Current chips have thousands of qubits,

really? I thought we have 70 max?


I suspect d-wave nonsense was involved in the confusion.


IBM released a processor with 433 qubits last year, and they say they are on track to deliver a 1121 qubit processor this year.

https://www.ibm.com/quantum/roadmap


Do you have links to some results for this 433 processor?

The only results I've seen are this [0] from April, which is 70 qubits, and it's all about fighting with noise.

It looks like overcoming noise is exponentially harder with more qubits and this whole quantum thing may never work for practical problems after all?

[0] https://arxiv.org/pdf/2304.11119.pdf


If qubit counts are doubling every year, then why is 21 still the largest number ever factored with a quantum algorithm?


Unless you're worried about storing and/or transmitting a huge number of keys (on the order of "at least 100/second") and/or using one key "at least 100 times/second", why not just go for 4096 by default?


Because 4096 bit RSA is a lot slower and bigger and those things matter. And there isn't any upside? If you're actually worried about 2048-bit RSA (and you should not be), you should switch to one of the elliptic curve schemes.
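
Here's a rough timing sketch of the "slower" part, using the pyca/cryptography library (numbers vary a lot by machine, so treat this as a sketch, not a benchmark):

    import time
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    message = b"benchmark me"
    for bits in (2048, 4096):
        key = rsa.generate_private_key(public_exponent=65537, key_size=bits)
        start = time.perf_counter()
        for _ in range(100):
            key.sign(message, padding.PKCS1v15(), hashes.SHA256())
        elapsed = time.perf_counter() - start
        print(f"RSA-{bits}: {elapsed / 100 * 1000:.2f} ms per signature")

Private-key operations scale roughly with the cube of the modulus size, so a 4096-bit signature typically comes out somewhere around 6-8x slower than a 2048-bit one, on top of doubled key and signature sizes.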


All other things equal, 256-bit elliptic curve cryptography is going to break before RSA 2048 does with quantum computing advances.

Do NOT switch to ECC if your threat model includes a quantum computer arriving.

Either use larger RSA keys or more appropriately a hybrid signature scheme combining one of NIST's PQC signatures and a traditional algorithm.

https://csrc.nist.gov/Projects/post-quantum-cryptography/sel...


I'm not actually sure about this. The elliptic curve schemes are just as broken with quantum computers, and the larger key size of RSA seems like it might add a few years of overhead in terms of qubits needed. Not an expert though.


Because that will soon become "why not 128k". Or perhaps we should use 2 megabytes for each prime? We don't know, but it's better to be safe than sorry.


If you accept the current assumptions, then you would have to accept that 4096 bit RSA will be obsolete by 2060.


That's a significant amount of time if we're talking about long-term file storage.

I've had my dropbox account for over 10 years now. Being concerned about a timescale of 20 to 30 years seems reasonable for things like long term file storage IMO.

Backblaze, dropbox, google drive, onedrive, AWS are all over a decade old.


> I've had my dropbox account for over 10 years now. Being concerned about a timescale of 20 to 30 years seems reasonable for things like long term file storage IMO.

But you're relying on your chosen cloud-provider staying around for 30 years. The number of tech companies that have died in the last 30 years easily exceeds the number still standing [citation needed].


> But you're relying on your chosen cloud-provider staying around for 30 years.

Yes, and I recognize that the company existing, or at least that product existing for that long, isn't incredibly likely. But I think the fact that there are 3 products from massive companies like Amazon, Google, Microsoft, and 2 from smaller ones, dropbox/backblaze, that lasted 10 years means that at the very minimum ~20 years should be considered as realistically possible.

And honestly, if we're willing to assume whatever we're storing isn't worth them storing for longer (let's say against your will) - then you should just rekey it anyway yourself.

But I'm lazy, and again we're getting to near 15 years for some of those services now.

> The number of tech companies that have died in the last 30 years easily exceeds the number still standing [citation needed].

I don't disagree with your premise that the company/product you pick isn't likely to last for 30 years - however I don't think this specific statistic is the correct one to evaluate this with, given the wide range of tech companies with differing products, markets, financial situations, regulations, the many startups that are effectively designed to be acquired, etc.

At the very least, I don't think it's fair to compare Google/Microsoft/AWS to "insert latest crypto based file storage startup" in terms of long-term viability.


OK, so you're up for a bet. That's fine by me - once a year I bet on the gee-gees in The Grand National.


> That's a significant amount of time

Depends on the threat model. I mean, WireGuard and Signal rotate derived keys every 2mins!


So only 37 years? I think I can live with that.


It depends on what you are transmitting, right?

Hypothetically if you are a journalist working with communications from a source in an authoritarian country (or a country that could become authoritarian in the next 4 decades; and name one that couldn’t, right?) it would be no good if you got some elderly person killed in the future.

Or just like bank account details I guess?


No cryptographic scheme remains unbreakable forever. If what you want is permanent protection, cryptography is not the solution. If people are encrypting things expecting that encryption will never be broken, they're misusing encryption.

The point of cryptography isn't to keep secrets forever, it's to keep secrets for long enough that by the time those secrets are revealed, they are worthless.


> No cryptographic scheme remains unbreakable forever. If what you want is permanent protection, cryptography is not the solution.

Whilst this has historically been true, it's very plausible that AES-256 means that (for this limited problem, symmetric encryption) we're done.

The "obvious" attack (some type of brute force) on AES-256, even assuming you have a quantum computer (which we don't) and it's actually more affordable than our current computers (which it won't be) is not practical in our universe.


I think this is a topic that much cleverer people than me have thought long and hard on. Of course nothing practical remains unbreakable forever, but it seems weird that for example the default key size for ssh-keygen isn’t in the “probably two lifetimes” range.


I can tell you that journalists aren't worrying about that most of the time. It's very much outside their threat model in the majority of cases, as it should be - there's no way to feasibly predict and protect against cryptographic risks 30+ years from now.


In that case the volume of traffic such a communication medium would need to handle is likely small enough that you can bump the key size higher to ensure greater longevity, currently past the lifetimes of those involved, and accept that transmitting data will take a small fraction of time longer.


If extreme security is needed it’s time to turn off your computer, leave your cellphone at home, don’t tell details to colleagues who do not have a need to know, and maybe resort to dead drop methods so even you don’t know who the source is yourself.


Some people are willing to take on some extra risk to talk to journalists, and the world is a better place for their bravery.

And we’re talking about thousands of bits, we spend way more than that on stupid things like making UI slightly prettier. I’m streaming music at, I guess, ~200kbps, why not spend a couple seconds worth of music on keys? (Who knows, maybe it will protect some famous journalist somehow and we’ll end up ahead when it spares us a whole moment of silence).

Edit: TBH I’m not really sure this is a useful way to look at things, but the music bandwidth/moment of silence parallel was too convenient to pass up.


I would hope that someone invents a more robust cryptography protocol in the next 37 years.


A bit less, since you have to worry about an attacker storing anything you do now for future attacks.

But if you only want any given secret to stay safe for 20 years, you can still use 4096 bit RSA for another 17 years. Which sounds like a good tradeoff: enough time for better algorithms to get established, but little risk of a breach you will care about.


Isn't RSA from like the 80s? That would already make it ~40 years old.


Being pedantic: RSA is a cryptosystem, not a protocol, and the parameters that were used in 1980s RSA encryption look nothing like the parameters used in today's RSA (in part because they're much larger now, but also in part because we now know about all the weird things that happen when you choose non-standard exponents, primes that are too close to each other, etc.).

RSA is also not typically described as robust, for those reasons.


Server appears to be hugged, web archive link:

https://web.archive.org/web/20230710195916/https://articles....



I think the point is that, as computers get faster, there is less trade-off in having longer bit lengths, rather than focusing on the potential to crack said keys.

That is, if it costs very little to have larger keys, why not have larger keys?

It is essentially hedging your bets as even if quantum computing key factorisation works, key lengths will still have an impact on the difficulty of factorisation, and it may make a difference in terms of practicality.


They cover that in the article:

> Quantum computing of the sort intended to break RSA involves a breakthrough in both computing and algorithms. Normally some sort of new computing technology is invented and then algorithms are designed to enable that technology to do something useful. The quantum computing threat to RSA is different. We now have the algorithm (Shor's) but the computer to run it on only exists in our imagination.

> If someone were to invent such a computer then RSA 2048 would be immediately and trivially breakable. RSA 3072 would also be trivially breakable. The same applies to RSA 4096 and 8192.


They can't know that, however; they may say it, but they can't know it. Hence: why not choose key lengths by looking at encoding and decoding resource usage instead of crackability?

By costs nothing I mean that as CPUs get faster there is less performance impact from lengthening keys.


They already are, hence everyone moving to ED25519 (which uses less CPU for the equivalent key strength) instead of RSA.


> even if quantum computing key factorisation works, key lengths will still have an impact on the difficulty of factorisation, and it may make a difference in terms of practicality.

I mean, the whole thing with quantum computer factoring is it scales well. Getting to 2048 rsa seems really really difficult. But if we ever get there, getting to 4096 is probably just a tiny extra step.


The current state of quantum computing suggests that there is an exponential effort to increase the available qbits. Going from RSA 2048 to RSA 4096 would not just double the effort required on the quantum side.


Wouldn't that effectively prevent a threat to 2048 bit RSA in the first place?


Almost like there's no free lunch. I don't see quantum computing ever really taking off without a fundamental breakthrough in our understanding of physics.

Would love to be proven wrong though if my understanding is incorrect and there's actually a feasible path towards quantum computing at scale.


I don't think there is any fundamental physics reason preventing quantum computers. My understanding is that it is an engineering problem. A hard one no doubt, but not a fundamental physics one.

Anyways, my point was that getting a quantum computer at a decent scale is really difficult. If we manage to overcome that burden somehow, the difference between 2048 bit RSA and 4096 bit is peanuts.


Well, we can't know. Maybe it would be trivial, maybe it could keep your data safe for 5 more years. Maybe it could double the cost to crack.

No one can know at the moment, hence the trade off, if it costs very little to do a longer key, why not do a longer key?


`That is, if it costs very little to have larger keys, why not have larger keys?` It is often very expensive/difficult to change the length of an RSA key that is part of an existing platform/infrastructure, like a key in a TPM/HSM/CA infrastructure, regardless of how fast the computer's CPU is.


But RSA has been on the way out for a long time, and short-keyed RSA doubly so. I would estimate that since maybe 2015ish, deploying stuff that is coupled to 2048-bit RSA would have been a mistake. That gives a generous 15ish-year transition period. Anyone who cares even the slightest should manage the transition in that sort of timeframe.


Why would deploying 2048 bit RSA be a mistake? If you believe 2048 is threatened in a meaningful time frame, when 1024 hasn't even been broken (thus sort of implying that the collapse of 2048 will occur in a much shorter time frame than the one separating 512 and 1024), is there any realistic RSA key size that should make you comfortable?


3 reasons

1. it's reasonable to assume the NSA is a decade ahead and has more computers than academia.

2. you want your secrets to last a decade (or longer)

3. the total amount of data you're encrypting per client is only 256 bits anyway (the size of a symmetric key) so the absolute performance impact is relatively minimal


I’m not a cryptographer, but I can see many more pressing reasons for migrating off RSA before 2030. Is there any reason to pick RSA for greenfield today?

RSA, to my knowledge, is vulnerable to side channels and poor parameter choices. Implementation simplicity is an underrated security parameter. The fewer feet you have, the fewer of them you can shoot.

The NSA data centers don’t want to waste time on your RSA key anyway, much less your run-of-the-mill Russian black hat groups. What bites us in practice are 0-days of something stupid like heartbleed or rowhammer that can be automated, and takes a long time to patch.


This is my (applied) cryptographic understanding as well: RSA 2048 probably won't be broken by improvements to prime factorization by 2030, but will continue to be a pointlessly dangerous (and slow) cryptosystem when compared to ECC.


I'm not convinced on the first part. That it won't be broken by improvements to prime factorization.

https://www.ams.org/notices/199612/pomerance.pdf has a great writeup on the history of the work around this. Essentially when you see improvements in complexity of the form

Old best: Quadratic Sieve: exp(c (log n)^(1/2) (log log n)^(1/2))

New best: General Number Field Sieve: exp(c (log n)^(1/3) (log log n)^(2/3))

I can't help but feel that's an exponent there that we've moved to 1/3 and that could be moved further. Sure, we don't know how, and we've been stuck on the current best for just over 25 years, but I just feel that if you give me two methods and one moves a term like that, there's a good chance there's a way to reduce that term further. It'd be weird for that complexity statement to stay as is. That's telling me "the universe doesn't allow factorization any faster than a term that's raised to a power of 1/3" and I'm asking "why is 1/3 so special?". So I'm not convinced that there's not more here. I don't have a clue how, of course. But the history of RSA going from "256 bits is secure" to 512 bits to 1024 bits to 2048 bits being needed has me worried about the safety of prime factorization.
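
To make those two exponents concrete, here is a sketch plugging a 2048-bit modulus into both L-notation formulas, ignoring the o(1) terms and using the usual constants (c = 1 for QS, c = (64/9)^(1/3) for GNFS), so treat the outputs as orders of magnitude only:

    import math

    def l_exponent(bits, a, c):
        """Exponent in the L-notation cost exp(c * (ln n)^a * (ln ln n)^(1 - a))
        for an n that is `bits` bits long (o(1) terms ignored)."""
        ln_n = bits * math.log(2)
        return c * ln_n**a * math.log(ln_n)**(1 - a)

    qs = l_exponent(2048, 1/2, 1.0)              # quadratic sieve
    gnfs = l_exponent(2048, 1/3, (64/9)**(1/3))  # general number field sieve
    print(f"QS ~1e{qs / math.log(10):.0f} ops, GNFS ~1e{gnfs / math.log(10):.0f} ops")
    # Roughly 1e44 vs 1e35: moving that exponent from 1/2 to 1/3 buys ~9 orders of magnitude.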


Top reason to use RSA for me, as with most tech I use, is its longevity.

Lindy Effect has been the best predictor of what will still work in five years.


McEliece encryption algorithm was published in 1978 (one year later than RSA) and seems to be considered safe to classical and quantum computers, the only downside is the huge public key size.


The Lindy effect applies to non-perishables. There are alternatives which are much more likely to conform to a reasonable equivalency of non-perishable for cryptography. Per this article, RSA strongly does not fit reasonable equivalent definitions.


The fact that the sun rose today, has no effect on whether the sun will rise tomorrow.


And yet I doubt you would be willing to bet against the sun rising tomorrow.

Our understanding is based on imperfect models, sure. That doesn't matter most of the time. It wouldn't matter in this bet.

So much of what any lifeform does is based on past experience, even though that experience isn't the direct driver of future effects. Turns out that bets based on experience work really well.


Of course I’d bet the sun would rise tomorrow, because if it doesn’t, I’m either dead or money will be worthless…

The same applies here, would you bet on a horse that is flagging (RSA won’t work forever)? We have the ability to take in new information, and throw away past information because it is no longer relevant. If you choose to ignore the new information, just because “it’s always been that way”, that doesn’t seem rational.


Indeed. The sun has successfully risen every morning, hundreds of billions of times straight! That's a pretty good record.


Do you have a proof of that?

I've been to places where the sun doesn't rise for months on end...


It does though. If it failed to rise today due to some cosmic anomaly, it would very likely fail to rise tomorrow.


When you die, from your perspective, the sun doesn't rise. It doesn't have to be a cosmic anomaly. For every single human, there are more days the sun doesn't rise than does.


If you accept the current assumption of double exponential growth for both computing performance and algorithmic efficiency then you would have to accept that 256 bit elliptic curve keys will become obsolete in the 2040s. So it might be just RSA and discrete logs today but a requirement for pointlessly long EC keys will be along soon.


Yeah but does it matter? Either that's true or it isn't. If it is true, we'll (as usual) have ample time to migrate. With RSA though, it's already more complex and slower today, with about 8x larger key sizes. And the cryptanalysis track record against RSA is afaik (please correct me) much more successful than against ECC, so it's "higher risk" that the timeline gets pushed forward or that new patches are needed to avoid bad parameter choices.

> So it might be just RSA and discrete logs today but a requirement for pointlessly long EC keys will be along soon.

It wouldn’t be pointless if computers can crack those sizes. It’d only be pointless if cryptanalysis can exploit structure to reduce the effective entropy, no?


>Implementation simplicity is an underrated security parameter.

Sure, if we were implementing cryptographic algorithms from scratch that would be a proper strong consideration. However, 99% of programmers should just link to an established library/framework and use its cryptographic implementation. These established libraries already paid the price of implementation, and are very battle-tested. There's therefore very good reason to believe their RSA implementation is secure.

Choosing an algorithm should be done on other considerations then. A lower keysize would point to ECC. But maybe we don't want a single algorithm for all encryption - a mixed global ecosystem with RSA and ECC projects would be more robust.


> These established libraries already paid the price of implementation, and are very battle-tested. There's therefore very good reason to believe their RSA implementation is secure.

Many of these established libraries have fallen in battle, some several times. There's always a new up and coming library that works on platform X, in language Y, or has a better license Z, or is integrated into an init system, and while some of them learn from the experience of others, many learn by making the same mistakes others did.

Pushing towards simpler constructions gives hope that those new implementations make fewer mistakes.


To be frank, "RSA is so impossibly complicated that nobody ever could implement it" is just spreading FUD. It's not so complicated an expert* couldn't do it, the code is small to review, it's been done, and safe implementations are well-known. If you decide to use a new up and coming library the risk is on you whatever algorithm you choose. Sure, go to ECC if you want, but there's no good reason to especially doubt the security of existing RSA implementation.

* The only type of person who should be writing a cryptographic implementation in the first place.


> These established libraries already paid the price of implementation, and are very battle-tested.

Yes absolutely. I’m not saying users should pick the one that’s easier to implement.

Simplicity is good for implementers. It allows for more participants, eg std libs to provide their own. Also, even the security geeks are humans and make mistakes. Heartbleed is a perfect example of where even simple things can go catastrophically wrong.

As a second order effect, users benefit from simplicity in the long run, because less complex systems have fewer bugs, and thus fewer security bugs.


No concern about "store now, decrypt later"?

https://en.wikipedia.org/wiki/Harvest_now,_decrypt_later


I always thought discussions about encryption forget this topic. Data encrypted today is only hidden today, not in the future. So any darknet criminal should assume that the files he/she sends are being intercepted and will be readable in 30 years.


I find it amusing that the table published a conservative cutoff year and an optimistic cutoff year. Based on the trends I've seen, most non-critical software would probably have made the switch in time for the conservative year, whereas anything security-critical like a bank would probably use the optimistic year.


The author of the paper had a problem. His elliptic curve method seemed like it might overtake the best known algorithm at the time for factoring. So the conservative estimate takes that into account. The elliptic curve method never managed supremacy, so the optimistic estimate is actually more relevant. That means the actual prediction is 2040, but it seems the various national cybersecurity entities might have missed that point.


For Symmetric encryption, it says: 'Current key size: 112 bits'

However the 3 linked examples, AES, ChaCha20 and Camellia all use a key size of at least 128 bits, with 192 or 256 bits also listed as options.

What does this current NIST key size recommendation (effective as of 2019) of 112 mean then? Does anyone use this size?


That's pretty much for 3DES, which is disliked for other reasons. The point still stands though...


112-bit effective key size almost certainly refers to 3DES (nominally a 168-bit key, but meet-in-the-middle attacks reduce it to roughly 112 bits of security).


One of the other problems about RSA cracking progress is that not a lot of people care anymore.

RSA is so slow that a lot of people have switched to Elliptic Curve.

That's going to dent progress as the smart people are all working on ECC instead of RSA.


It's not totally implausible that a factoring breakthrough in ECC could have implications for RSA.


Could you say more? ECC's trapdoor function isn't prime factoring; it's a variant of the discrete log problem.



Yes, I know that DLP and prime factorization are similar problems. I was trying to understand if the GP's comment was about improvements to prime factorization specifically having implications for DLP (since "factoring breakthrough in ECC" is an unusual phrasing to me.)


I use gpg plus kyber (quantum resistant). RSA may break, and kyber might suck. But I'm hoping not both.

https://github.com/ThomasHabets/kybertest


Recommendations from various organizations can be found at:

* https://www.keylength.com

Anything recent (≥2016) seems to say 3072 for RSA.


I have no specialist knowledge in this subfield, but after reading the article's arguments that basically if you could sic the entire bitcoin network on 2048 RSA it would take 700+ years, I have to wonder about perverse incentives.

Another thing that's missing is the lifetime expectancy, e.g. "for how many years does something encrypted in 2030 need to be unbreakable?"

The author doesn't seem to be a big authority, so they have little to lose by staking their reputation on "you don't need it to be that good," whereas by the very nature of their authority, anyone in the resource you link is going to be motivated to never be wrong under any circumstances. So if someone with some reputation/authority/power to lose thinks there's a 0.001% chance that some new incremental improvements will allow for fast-enough breaking of 2048 bit encryption created in 2030 within a window where that would be unacceptable, then they're motivated to guess high. The authority in this case doesn't directly bear the costs of too high a guess, whereas it could be very bad for, I dunno, some country's government, and by extension the org or people that made that country's standards recommendations, if some classified information became public 15 or 50 years earlier than intended just because it could be decrypted.


By "perverse incentives", do you mean something like: "it appears the cryptographic research department has hit a brick wall in terms of useful advancements, so we're reducing the budget and the department head will be taking a 75% pay cut"?


I mean that the incentives aren't aligned. So maybe you're giving an example, but I'm honestly not sure. :)

In the space of CVE or malware detection, the user wants a safe/secure computing experience with minimal overhead, but the antivirus / CVE-scan vendor wants to claim that they're _keeping_ you safe. So they're motivated to tell you all about the things they scanned and possible attacks / vectors they found. You probably would've been safe responding to only a subset of those alerts, but they have no incentive to minimize the things they show you, because if they ever missed one you would change vendors.

In the space of cryptography, the user wants secure communications that are unbreakable but with minimum hassle and overhead, but the advisory boards etc. are incentivized to act like they have important advice to give. So from the user perspective maybe it makes sense to use 2048 bit encryption for a few more decades, but from the "talking head" authority-figure perspective, they can't afford to ever be wrong and it's good if they have something new to recommend every so often, so the easiest thing for them to do is to keep upping the number of bits used to encrypt, even if there's a 99.99% chance that a smaller/shorter/simpler encryption would've been equally secure.


I assume you’re aware, but for clarity: it’s not possible to sic the bitcoin network on anything, even cracking sha256, which it uses internally, due to hard-coded ASICs that incorporate specific quirks of the proof-of-work.


It seemed like the reason to use the Bitcoin network in the discussion was to form what a theoretical nation-state actor might possibly be hiding based on the traits of some thing in the physically-existent universe (instead of discussing things completely in imaginary-land).


One thing this paper ignores is side channel attacks. Those may involve hardware or software vulnerabilities, or they may be inherent to the process itself.

So the real bogeyman is not whether we have figured out how to factor large numbers yet (aside from Shor and the as of now mythical quantum computer), but how much information you might leak by using your key.

One (generally) overlooked idea might be some sort of vulnerability between the key and the data being used. E.g., by multiplying many, many smaller numbers with the private key, is it possible to increase the efficiency of the sieve?

Then it might be the case that commonly used keys are more vulnerable and keys used less are less vulnerable.

Another idea would be a rainbow table of keys. It might not matter so much that you can arbitrarily factor a large number, if generating keys is fast. Especially when you mount attacks on the random number generators involved, you can reduce the search spaces.

Forcing the key itself is not so much the concern, this doesn't make me think "oh we are fine".

Historically we only have to look back to e.g. Heartbleed to be reminded that we broke ssl not by factoring primes, but by exploiting the many flaws in the protocol itself.


> One thing this paper ignores is side channel attacks.

I mean, that's fair right? The article doesn't talk about encryption in general but tries to answer "How secure is RSA?" or rather "When is/will N bit RSA be considered insecure?", so scoping it to only talk about that seems fair.

Of course, one could only focus their attention on so many things. Misuse, misconfiguration, side channel attacks and etc become unrelated to the topic at hand in this case.


Possibly?

My point was essentially that key-length could have unintended side-effects.

E.g. if you were to have some rainbow table approach: in theory a larger key size would mean more possible key combinations, meaning more expensive space complexity and harder-to-crack keys. Factoring a key is not necessary if you already know the factors; a lookup table has O(log N) complexity (O(1) for a hash table). If you implement some custom FPGA hardware and a nice database, it's not too difficult to imagine some specialized operation storing off generated public keys with their corresponding private key pairs, and the power costs are rather low.

Of course due to combinatorics the size of this output space is rather large, despite the fact that the distribution of primes shrinks as they grow larger, but the argument is about factoring a single number, not about efficiently computing primes and their common multiples. Counter to the article, it completely obviates the need to hide your power bill, as you can cache every single computation from the past 30+ years...

To bring it back to what I was saying, the difficulty of brute-forcing RSA (or other schemes) is potentially irrelevant to the cost of obtaining a solution, and higher bit-length key-pairs offer some hedge against that possibility. It seems pretty relevant to the question "how secure is RSA" to me.


This ignores potential current unpublished advances by entities like NSA and potential unforeseen future algorithmic and computing power advances. Logically speaking, larger keys could help and are unlikely to hurt with both. Even if larger key still ends up breakable, adversaries may still go for low hanging fruit.

Other than that, it depends on secrecy timeline and cost/performance sensitivity. An average credit card transaction is unlikely to be targeted by NSA or archived in hopes of cracking it 30 years later, and on the other hand volume is very high and latency is important. So use whatever is thought to not be breakable now and upgrade keys if and when technology progresses. On the other hand, list of American spies in Russia would not take more than a few minutes to decrypt even with enormous key sizes and on the other hand disclosure could cause real damage even decades later. Might as well overshoot even if there is no known reason as of yet.


> The assumptions that the 2030 date for increasing RSA key length were based on turned out to be invalid. A check of current capability confirms this. There seems to be no rational reason to increase RSA key sizes past 2048 starting in the year 2030. We don't have any reason to increase RSA key sizes at any time based on our current understanding.

Great to know my porn collection will be safe with 2048 bit RSA. :)


My MITM proxy that sits on the LAN and acts as a filtering gateway to the Internet still uses a 1kbit RSA key, only because it's the smallest size my devices will accept. It's somewhat amusing that, despite widespread "knowledge" that 1024-bit RSA is "insecure", this is still roughly 2^100 times more difficult to factor than the current latest record of 829 bits.


I realize I'm rather far out of my depth here, but when discussing asymmetric encryption, doesn't this typically imply the discussion is about authentication?

Is there a way to derive the ephemeral keys? My understanding is that these are not directly shared, but it's exactly where I am weakest on the basic concepts of the handshake and related stuffs.


That AI will accelerate algorithmic improvements is a new factor that has not previously been taken into account. This may be too optimistic.


There's practically no reason to think that AI will greatly accelerate the development of new algorithms.


Why? Because we're nearing the theoretical limits or because AI will be too dumb to help or for other reasons?


For the same reason that there's no reason to think AI will continue to advance exponentially.

The idea of the singularity is fun, but it's unrealistic. Nothing lasts exponentially forever.


Even if the exponential improvements, which have been happening in both hardware and AI algorithms, were to slow down to sub-exponential, this still assumes that exponential advancements are necessary for AI to reach a level where it can assist in algorithm design. And only now we're seeing big jumps in AI investments, which could serve as another source of improvements. You could even argue that we're seeing the first signs of recursive improvements, with Copilot making programmers ~20% more productive.


I have yet to see an AI bot produce anything that's truly intelligent and/or original (all I see is ever more powerful hardware and ever bigger quantities of training data being thrown at essentially the same statistical probability algorithms), and I don't predict that changing in the foreseeable future - at least, not without the same kind of fundamental breakthrough that would be required for quantum computing to become a practical reality.


Is there a quantum algorithm for cracking ECDSA like Shor's for RSA? I was hoping they'd mention it in the article.


Shor's algorithm can also break elliptic curve encryption: https://en.wikipedia.org/wiki/Post-quantum_cryptography


Part of the issue as a prospective cryptographic user/consumer is that not only do I not know which algorithm(s) should be used, the most likely library https://github.com/open-quantum-safe/liboqs also explicitly states that it shouldn't be used in production.

Hybrid deployment (E.G. with ECC using a curve like 25519) is a great recommendation and probably obvious, far more so than picking a winner among the available post quantum possibly safe algorithms.


It is more or less universally assumed in practice that PQ key agreement algorithms will be paired with classical (RSA/ECC) cryptography. There's not much need to discuss or standardize it; it's a given.


Doesn't the NSA object to hybrid schemes on the weird grounds that current implementations suck and are full of implementation errors, while all-new PQ-only ones won't be?

Edit: reference https://mailarchive.ietf.org/arch/msg/spasm/McksDhejGgJJ6xG6...


That doesn't make much sense as an objection, since the classical cryptography code is better ironed out than the PQ stuff, and the logic to combine the two is fairly simple.

Unless there is an unexpected leap in the viability of quantum cryptanalysis, you should expect that all commercial/standard cryptography with PQ capabilities will run in a hybrid configuration.

I'm only commenting here because there's a pervasive belief that this is controversial in cryptography engineering circles, or that NIST is trying somehow to prevent hybrid schemes from happening, which is simply not the case --- though they may not bother to standardize any particular mechanism of combining ECC/RSA with PQ exchanges (but: they don't standardize stuff like TLS ciphersuites, either).


So ECDHE becomes EC(Post Quantum Exchange)? Am I understanding that correctly?


I suspect you do not understand given the phrasing.

Think of Tunneling or layers or nesting dolls. The order doesn't particularly matter from a security perspective. Though today I'd wrap with the conventional algorithm on the OUTSIDE layer, since it takes less computational time to check / validate. The inner layer would then be one or more of the post-quantum algorithms; a particularly paranoid application might use more than one if something absolutely must remain a secret.


Thanks for answering.

So you’re encrypting with an asymmetric post quantum algorithm then using that as a payload with regular ED25519 or similar?

What value does the pre quantum wrapper add?


It's like using different technologies to protect against nuclear waste leaking from storage. Each plausibly suitable layer of a different sort makes a leak less likely.

Post-quantum algorithms are, as yet, young and very possibly poorly understood. They may even offer no security at all (due to presently unseen flaws); therefore as a hedge against that include at least another currently in use and current best practice algorithm so that at least _that_ level of security is retained.

Plus, as I pointed out several replies ago, if the current (and fast since reasonable key size Elliptic Curve based) algorithm is the outer layer it can be validated quickly which is a better guard against denial of service attacks and other poor fakes.


A good Google search term is "CECPQ2". Roughly: run a PQ KX to reach a shared secret, run an independent X25519 to reach a second shared secret, and then HKDF them both (conceptually: just cat them together and use them as a single secret).
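
A minimal sketch of that combine step, using pyca/cryptography's X25519 and HKDF; pq_shared_secret is a placeholder standing in for whatever the PQ KEM (e.g. Kyber) produced, not an implementation of CECPQ2 itself:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Classical half: an ordinary X25519 key agreement.
    alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
    classical_secret = alice.exchange(bob.public_key())

    # PQ half: placeholder bytes standing in for a KEM shared secret (e.g. Kyber).
    pq_shared_secret = os.urandom(32)

    # Combine: concatenate both secrets and run them through one KDF, so the
    # session key stays secure as long as either input remains unbroken.
    session_key = HKDF(
        algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid-kx-demo",
    ).derive(classical_secret + pq_shared_secret)
    print(session_key.hex())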


Why not just XOR them? Is the reason we use HKDF so that if one or both is only partially broken (only some bits, or some correlation) instead of fully broken, we still benefit from the information that is still there?


With an XOR, if you can control bits on one of the inputs, you can deterministically change one of the bits on the outputs. That is not the case with an HKDF.


Honestly, for all I know, they do XOR them. I didn't look carefully at how CECPQ2 works! Just: there's very straightforward joinery for running two key agreements and combining them so they both have to succeed to arrive at the session secret.

Later

(Leaving this comment here in perpetuity as evidence that I didn't think about your question as hard as David Adrian did. The message still holds, and that message is: "big ol' shrug".)


Every mainstream asymmetric cipher is broken by quantum computing. Post-quantum ciphers are only this year slowly rolling out on canary & beta platforms, and they demand more memory and CPU cycles.

A conservative but reasonable risk assumption is to act as if all internet traffic prior to the year 2023 was transmitted in the clear. This includes Signal, PGP, and Tor.



Practically a quantum computer breaking ECC would happen much sooner than RSA, since ECC uses much smaller keys and the main issue with quantum computers is scaling.



I was under the impression that Shor's algorithm works for ECDSA just as well as RSA.


Quote: "There seems to be no rational reason to increase RSA key sizes past 2048 starting in the year 2030. We don't have any reason to increase RSA key sizes at any time based on our current understanding."

Yeahhhh, nice try NSA. If they say this, I'd say go to 8192 right now.


Is factoring something that GPU /CUDA parallelism helps with?


Not for the sort of technology we have today using the best known algorithm. The approach relies on the idea of sieving, and sieving takes a lot of memory. As an example, the most recent 829-bit factoring record used multiple GB per processor during the parallel phase and 1.5 TB for the final matrix reduction phase. Neither phase would really get much out of a bunch of processors attached to just a few GB.



