
"/dev/random assumes that reading 64 bits from it will decrease the entropy in its pool by 64 bits, which is nonsense."

To amplify Hendrikto's point, /dev/random is implemented to "believe" that if it has 128 bits of randomness, and you read 128 bits from it, it now has 0 bits of randomness left. 0 bits of randomness means that you ought to now be able to tell me exactly what the internal state of /dev/random is. I don't mean it vaguely implies that in the English sense, I mean, that's what it mathematically means. To have zero bits of randomness is to be fully determined. Yet this is clearly false. There is no known, and likely no feasible, process to read all the "bits" out of /dev/random and tell me the resulting internal state. Even if such a process were demonstrated, it would still not necessarily result in a crack of any particular key, and it would be on the order of a high-priority security bug, but nothing more. It's not an "end of the world" scenario.
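To make the bookkeeping concrete, here's a toy model (my own sketch, not the kernel's actual code) of the old accounting rule: the estimate is decremented bit-for-bit as output is read, so after one full read it claims zero, i.e. that the pool state is fully determined by the output, which is exactly the claim being disputed.

```python
# Toy model of the old /dev/random entropy accounting.
# Hypothetical illustration only; not the real kernel implementation.
class ToyPool:
    def __init__(self, entropy_bits):
        self.entropy_bits = entropy_bits

    def read(self, nbits):
        # Old accounting: reading n bits "spends" n bits of entropy,
        # regardless of whether the pool state is actually revealed.
        self.entropy_bits = max(0, self.entropy_bits - nbits)

pool = ToyPool(128)
pool.read(128)
# The counter now claims zero entropy -- yet nobody can actually
# reconstruct the pool state from the bits they read.
assert pool.entropy_bits == 0
```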




Yes, this depleting-entropy argument is like arguing that a 128-bit AES key is no longer secure after it has encrypted 128 bits of data, and that encrypting any more will give up the AES key, so the ONLY thing to do is block.

It's completely nuts.


The entropy value was designed to be a known underestimate, not an accurate estimate of the entropy available.

With that in mind, zero is OK as a value. You may not be able to calculate the state of /dev/random given the tools and techniques available to you, but that doesn't make zero an incorrect lower bound on what you could mathematically calculate from the extracted data.


In reality, the entropy estimate is of no value. See Ferguson and Schneier, who have a chapter on this.

The meaningful states of a CSPRNG are "initialized" or "not". Once initialized, there is never a diminishment of "entropy".


I agree with that, subject to the assumption that your CSPRNG is built on CS-enough primitives.

(What I disagreed with is the argument made by the GP, not you, that the Linux entropy value was incompatible with their in-principle mathematical description of "true" entropy. Pretty irrelevant to real cryptography.)


> There is no known and likely no feasible process to read all the "bits" out of /dev/random and tell me the resulting internal state

That's fine if you trust the PRNG. Linux used to at least attempt to provide a source of true randomness. You and Hendrikto are essentially asserting that everyone ought to accept the PRNG output in lieu of true randomness. Given various compromises in RNG primitives over the years, I'm not so sure it's a good idea to completely close off the true entropy estimation to userspace. I prefer punting that choice to applications, which can use urandom or random today at their choice.

Maybe everyone should be happy with the PRNG output. Ts'o goes further and argues, however, that if you provide any mechanism to block on entropy (even to root only), applications will block on it (due to a perception of superiority) and so the interface must be removed from the kernel. I see this change as an imposition of policy on userspace.


> That's fine if you trust the PRNG. Linux used to at least attempt to provide a source of true randomness. You and Hendrikto are essentially asserting that everyone ought to accept the PRNG output in lieu of true randomness. Given various compromises in RNG primitives over the years, I'm not so sure it's a good idea to completely close off the true entropy estimation to userspace. I prefer punting that choice to applications, which can use urandom or random today at their choice.

Linux never provided a source of true randomness through /dev/random. The output of both /dev/random and /dev/urandom is from the same PRNG. The difference is that /dev/random would provide an estimate of the entropy that was input to the PRNG, and if the estimate was larger than the number of bits output, it would block.
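You can see this from userspace: both devices hand back output from the same CSPRNG, and the modern getrandom(2) interface only blocks until that CSPRNG is initialized. A small sketch (assuming Linux; os.getrandom is only available there):

```python
import os

# Both /dev/random and /dev/urandom are fed by the same kernel CSPRNG;
# historically only the blocking policy differed. os.urandom() reads
# the non-blocking interface.
key = os.urandom(16)
assert len(key) == 16

# getrandom(2) (Linux 3.17+, exposed as os.getrandom in Python 3.6+)
# blocks only until the CSPRNG is initialized -- the one state that,
# per the comments above, actually matters.
if hasattr(os, "getrandom"):
    data = os.getrandom(16)
    assert len(data) == 16
```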


"You and Hendrikto are essentially asserting that everyone ought to accept the PRNG output in lieu of true randomness."

No, what I am asserting is simply that the idea that you drain one bit of randomness out of a pool per bit you take is not practically true unless you can actually fully determine the state of the randomness generator when you've "drained" it. No less, no more. You can't have "zero bits of entropy" and also "I still can't tell you the internal contents of the random number generator" at the same time, because the latter is "non-zero bits of entropy". Either you've got zero or you don't.

As of right now, nobody can determine the state of the random number generator from any amount of output, we have no reason to believe anybody ever will [1], and the real kicker is that even if they someday do, it's a bug, not the retroactive destruction of all encryption ever. A SHA-1 pre-image attack is a much more serious practical attack on cryptography than someone finding a way to drain /dev/random today and recover the internal state.

It's only true in theory that you've "drained" the entropy when you have access to amounts of computation that do not fit into the universe. Yes, it is still true in theory, but not in a useful way. We do not need to write our kernels as if our adversaries are turning galaxy clusters into computronium to attack our random number generator.

[1]: Please carefully note the distinction between "we have no reason to believe anyone ever will" and "it is absolutely 100% certain nobody ever will". We have no choice but to operate on our best understanding of the world now. "But maybe somebody may break it" doesn't apply to just the current random generator... it applies to everything, including all possible proposed replacements, so it does not provide a way to make a choice.



