
What ECDSA algorithms/parameters are relevant? #111

Open
bleichenbacher-daniel opened this issue Mar 31, 2024 · 6 comments

Comments

@bleichenbacher-daniel

Since I don't have access to my old code, I've rewritten a large fraction of the test vector generation code.
One thing I'm wondering, however, is which parameters/curves etc. for ECDSA are relevant. Concretely:

  • I'm not aware of any apps using ECDSA over binary fields. Am I overlooking something or could these curves be ignored?
  • There is a large number of message digest / curve / signature encoding combinations. I'm trying to figure out which ones are relevant, i.e., explicit choices in proposed protocols.
  • Signature encodings are either DER, P1363, OpenSSH, or the v,r,s format used in the "ether" world (see the sketch after this list). For the last one I haven't found any satisfactory documentation (i.e., docs sometimes contradict each other).
  • Signature verification through public key recovery allows some new edge cases. These edge cases depend on how the public key recovery is done. I'm trying to find typical implementations.
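A minimal sketch of how the first two of those encodings relate, assuming a 256-bit curve such as P-256 or secp256k1; the helper names are illustrative, not from any particular library:

```python
# Sketch: convert between the fixed-width P1363 r||s form and ASN.1 DER
# (a SEQUENCE of two positive INTEGERs). Assumes a 32-byte field element.

FIELD_LEN = 32  # assumption: 256-bit curve (P-256, secp256k1)

def p1363_to_ints(sig: bytes):
    """Split a fixed-width r||s signature into the integers (r, s)."""
    assert len(sig) == 2 * FIELD_LEN
    return (int.from_bytes(sig[:FIELD_LEN], "big"),
            int.from_bytes(sig[FIELD_LEN:], "big"))

def ints_to_der(r: int, s: int) -> bytes:
    """Encode (r, s) as DER: SEQUENCE { INTEGER r, INTEGER s }."""
    def der_int(x: int) -> bytes:
        b = x.to_bytes((x.bit_length() + 7) // 8 or 1, "big")
        if b[0] & 0x80:               # prepend 0x00 to keep the INTEGER positive
            b = b"\x00" + b
        return bytes([0x02, len(b)]) + b
    body = der_int(r) + der_int(s)
    return bytes([0x30, len(body)]) + body  # short-form length; fine below 128 bytes
```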
@davidben

davidben commented Apr 7, 2024

On the TLS side, TLS 1.2 doesn't correlate curves and hashes, and people kept confusing parameterized signature schemes with parameters to the sign and verify function, so the full product of {P-256, P-384, P-521} x {SHA-256, SHA-384, SHA-512} ends up being accessible, usually for the same key. 😞

TLS 1.3 fixes this and only has (P-256, SHA-256), (P-384, SHA-384), and (P-521, SHA-512).
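For reference, those fixed TLS 1.3 pairings correspond to SignatureScheme code points in RFC 8446; a small sketch:

```python
# TLS 1.3 ECDSA SignatureScheme code points (RFC 8446) and their fixed pairings.
TLS13_ECDSA_SCHEMES = {
    0x0403: ("P-256", "SHA-256"),  # ecdsa_secp256r1_sha256
    0x0503: ("P-384", "SHA-384"),  # ecdsa_secp384r1_sha384
    0x0603: ("P-521", "SHA-512"),  # ecdsa_secp521r1_sha512
}
```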

Alas, one of the edge cases of ECDSA (needing to truncate the hash when your group order is not a multiple of 8 bits) doesn't come up in any of the common combinations. P-521 is large enough that no reasonable-sized hash needs truncation. We have a test that passes a hypothetical larger hash into P-521 to exercise that path. Picking one of the more obscure curves might also do it, but only if the implementation has a custom curves API, which isn't a great idea.
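A hedged sketch of the truncation step being discussed (the FIPS 186-4 "use the leftmost n bits of the hash" rule); the function name is illustrative:

```python
def bits2int(digest: bytes, order_bits: int) -> int:
    """Interpret the leftmost order_bits of a digest as an integer (FIPS 186-4 style)."""
    h = int.from_bytes(digest, "big")
    excess = 8 * len(digest) - order_bits
    return h >> excess if excess > 0 else h

# P-521 has a 521-bit order, so even SHA-512 output (512 bits) needs no shift;
# only a hypothetical digest longer than 521 bits would exercise the truncation.
```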

(Of course, if this edge case is truly unreachable for some implementation, it arguably doesn't matter for them. The joys of over-parameterized primitives.)

@bleichenbacher-daniel
Author

Thanks for the reply. I certainly agree that protocols should generally select small subsets of well-specified algorithms. Using keys with multiple algorithms/parameters is of course a problem, but I don't know how to catch such cases with test vectors. One thing I'm doing is splitting the test vectors by algorithm, so that implementations that use selected sets of parameters don't have to search test vector files for supported and unsupported test cases. But generally, a major way to improve the situation is to have standards for key formats that fully specify the algorithms and parameters of a key.

The TLS parameters are of course no-brainers. I'm more wondering about including other curves such as the BLS curves, and at the same time dropping curves over binary fields. The main issues that should be covered by test vectors are signature malleability, curves with large cofactors, and verification through public key recovery. The last one might allow new attacks where r is not the x-coordinate of a point on the curve.
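A minimal sketch of the malleability point, using the secp256k1 group order as an example value; strict verifiers typically reject the "high-s" form:

```python
# For any valid ECDSA signature (r, s), the pair (r, n - s) also verifies,
# so test vectors usually include both forms. N below is the secp256k1 order.
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def is_low_s(s: int) -> bool:
    """True if s is in the canonical low-s half, as strict verifiers require."""
    return 1 <= s <= N // 2

def normalize(r: int, s: int):
    """Map a signature to its low-s form; both forms satisfy the verification equation."""
    return (r, s if is_low_s(s) else N - s)
```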

@davidben

davidben commented Apr 8, 2024

Oh yeah, the multiple parameter thing was just me bemoaning the state of the world. Definitely agreed that test vectors aren't really the place to address that. (TBH we're pretty stuck with multiple algorithms per key for RSA and ECDSA in X.509-adjacent systems. It's frustrating. Hopefully we can avoid it for new algorithms and new systems.)

@bleichenbacher-daniel
Author

Good point. Maybe for RSA it is time to start encoding keys with id-RSAES-OAEP and id-RSASSA-PSS object identifiers, so that libraries can only pass the tests if they at least support these key formats. A few years ago support was still quite weak, but maybe this can be pushed a bit.
For ECDSA I don't even know if there is a way to encode all parameters in a key format.

@davidben

davidben commented Apr 9, 2024

Definitely not! Those OIDs are absurdly complicated. We explicitly rejected implementing them in BoringSSL because of how badly they were defined. :-) It takes more bytes to say "RSA-PSS-SHA256" than to encode an entire ECDSA signature. In principle I'd agree with fixing the way the algorithm is specified, but the cure is worse than the disease here.

Also, practically speaking, switching a system to those OIDs is as much work as switching it to ECDSA, and if you're doing that, you may as well go to ECDSA.

@bleichenbacher-daniel
Author

OK, I guess I can also treat them as distinct algorithms and add files based on the popularity of the variants.
