Presumably because 512 bits (a) seemed "random enough" and (b) was the nicest output size that fit comfortably in 521 bits of modulus. This is a common mistake.
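Roughly, the mistaken pattern looks like this (a sketch, not the actual code; the key/message inputs are made up, and the P-521 group order n is hardcoded for illustration):

    # Deriving a deterministic ECDSA nonce from SHA-512 on the P-521 curve.
    # SHA-512 yields at most a 512-bit integer, but n is 521 bits, so the
    # reduction below never wraps and the top 9 bits of k are always zero.
    import hashlib

    n = 0x01fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffa51868783bf2f966b7fcc0148f709a5d03bb5c9b8899c47aebb6fb71e91386409

    def biased_nonce(private_key: bytes, message: bytes) -> int:
        h = hashlib.sha512(private_key + message).digest()
        return int.from_bytes(h, "big") % n  # always < 2^512 < n

A consistent 9-bit bias like that is exactly what lattice attacks on ECDSA nonces exploit, given enough signatures.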



From TFA it seems more like they already had the SHA512-based implementation for DSA (where it was fine), and reused it when implementing ECDSA without realizing that it wasn't suitable in situations with moduli larger than 512 bits.


I'd have to look at the code, but a DSA modulus is much larger than a P-521 modulus. Maybe it just happened to line up nicely?


The nonce is taken modulo the order of the prime-order subgroup. For DSA that's generally a 160- to 256-bit prime (e.g.: choose a 2048-bit prime p such that a 256-bit prime q divides p-1; then the multiplicative group mod p has an order-q subgroup).
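The reason that's safe is the size gap: a toy version of the same reduction shows how quickly the bias vanishes when the hash is much wider than q.

    # Toy model of "512-bit hash mod ~256-bit q": reduce a uniform w-bit
    # value mod a much smaller q. Each residue is hit floor(2^w/q) or
    # ceil(2^w/q) times, so the relative bias is about 2^-(w - lg q);
    # for w=512 and a 256-bit q that's ~2^-256, i.e. negligible.
    from collections import Counter

    w, q = 16, 251  # stand-ins for 512 and a 256-bit prime
    counts = Counter(h % q for h in range(2 ** w))
    print(min(counts.values()), max(counts.values()))  # 261 262 -- near-uniform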

For P-521, the base field prime is 2^521 - 1, but the modulus used when computing the nonce is not that value; it's the order of the P-521 curve. By Hasse's theorem, that's roughly p +- 2*sqrt(p), which is essentially p for numbers this large (the cofactor of P-521 is 1, so the group order is prime).

So: both are 521-bit numbers, but the group order is less than 2^521-1. Its hex representation is 0x01fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffa51868783bf2f966b7fcc0148f709a5d03bb5c9b8899c47aebb6fb71e91386409.
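Both claims are easy to check from the constants (a quick sanity check in Python, using the order quoted above):

    p = 2 ** 521 - 1   # the P-521 base field prime
    n = 0x01fffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffa51868783bf2f966b7fcc0148f709a5d03bb5c9b8899c47aebb6fb71e91386409
    assert n.bit_length() == 521 and n < p
    # Any 512-bit hash output is already smaller than n, so "mod n" is a
    # no-op and every such nonce has 521 - 512 = 9 leading zero bits.
    assert (2 ** 512 - 1) % n == 2 ** 512 - 1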


Ahh. Thanks! (I don't think about FFDLP much, as you can see).



