Negative RSA moduli from X509 parsing #56
When parsing some certificates with X509 I get an RSA public key with a negative modulus, but parsing the same certificate with openssl gives a proper (positive) modulus.
I get different negative moduli, and the same behaviour shows up across certificates with totally different keys.
For instance, when parsing the following certificate:
I get the following modulus using openssl:
and the following one using X509:
Do you have any idea where this might come from?
Strange (I can reproduce this with ocaml-x509). The ASN.1 encoding of the modulus of the public key is (in hex):
According to the Layman's Guide to ASN.1, section 5.7,
Did you create the certificate? If so, with which tool?
(I'm not the expert here, @pqwy certainly knows more)
NOTE: updated to clear up my length confusion
Hmm, correct (even though the integer content starts at AC, not at 80, which is still part of the length encoding).
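To make the sign issue concrete, here is a small illustration (the byte values are hypothetical, not taken from the certificate above): DER reads INTEGER content as two's complement, so any content starting with a byte ≥ 0x80, like AC here, comes out negative.

```python
# Hypothetical INTEGER content bytes whose first byte (0xAC) has the
# high bit set.
content = bytes.fromhex("ac1f00")

# How a strict DER decoder reads it (two's complement):
signed = int.from_bytes(content, "big", signed=True)
# What the certificate author presumably meant (unsigned):
unsigned = int.from_bytes(content, "big", signed=False)

print(signed < 0)    # True: the set sign bit makes the value negative
print(unsigned > 0)  # True
```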
I did not generate the certificate, I just got it from the zmap data set.
As Nathan hinted at, https://en.wikipedia.org/wiki/X.690#Length_Octets tends to indicate that
sorry for my length confusion, I updated my comment to reflect reality.
What to do: there are tools out there that produce wrongly encoded certificates, and these certificates are in the wild. I believe we have to provide a non-negative integer decoder in asn1-combinators and use it for public keys.
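A minimal sketch of what such a lenient decoder could look like (in Python for illustration only; the function name and scope are mine, and it is not the asn1-combinators API): it parses a DER INTEGER TLV but forces an unsigned reading of the content, so a missing leading zero byte no longer flips the sign.

```python
def decode_uint(der: bytes) -> int:
    """Decode a DER INTEGER TLV, forcing a non-negative reading.
    Hypothetical helper; handles only what this sketch needs."""
    if not der or der[0] != 0x02:          # 0x02 = INTEGER tag
        raise ValueError("not an INTEGER")
    if der[1] < 0x80:                      # short-form length
        n, off = der[1], 2
    else:                                  # long form: length of the length
        k = der[1] & 0x7F
        n = int.from_bytes(der[2:2 + k], "big")
        off = 2 + k
    content = der[off:off + n]
    if len(content) != n:
        raise ValueError("truncated")
    # Read unsigned regardless of the sign bit -- the lenient behaviour
    # proposed here for public-key moduli.
    return int.from_bytes(content, "big", signed=False)

# A hypothetical mis-encoded modulus: leading 0xAC, no 0x00 pad byte.
print(decode_uint(bytes.fromhex("0203ac1f00")) > 0)  # True
```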
That would be great, ty!
That's what openssl seems to do.
It adds a leading 0 byte
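This is the standard DER rule on the *encoding* side, sketched below (illustration in Python, function name is mine): when the most significant bit of the leading byte would be set, a 0x00 pad byte is prepended so the INTEGER still reads as positive.

```python
def encode_uint_content(n: int) -> bytes:
    """Produce DER INTEGER content bytes for a non-negative integer.
    Hypothetical helper for illustration."""
    body = n.to_bytes((n.bit_length() + 7) // 8 or 1, "big")
    if body[0] & 0x80:          # would look negative in two's complement
        body = b"\x00" + body   # leading zero byte keeps it non-negative
    return body

content = encode_uint_content(0xAC1F00)
print(content.hex())  # 00ac1f00 -- note the leading zero byte
# Even a strict signed reading now yields the intended positive value:
assert int.from_bytes(content, "big", signed=True) == 0xAC1F00
```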
The central question in my mind is how widespread is this.
In general, no. I am very reluctant to add warts to support other people's old bugs. These certificates are badly encoded, making them invalid. They should be replaced, not used, and definitely not worked around in newly written software.
Well, unless this is really common practice. We do have a precedent of bending the standard here. But that is not a work-around; in that case, everyone except the standard seems to agree on the encoding.
As a side-note, accepting this would violate the core DER property of uniqueness of representation. Thus we immediately lose the invariant that you can encode a certificate and arrive at the same byte sequence you parsed it from. This is not something to lose lightly. The fact that OpenSSL is always willing to add warts is one of the reasons OpenSSL is what it is; on top of that, they have a very relaxed interpretation of certificate usage flags and are generally eager to accept any byte stream that someone claims to contain a certificate.
To resolve this dilemma I downloaded the zmap sample of HTTPS certificates from two days ago. Of those, we parse 508723 and reject 4382 (which is most probably a bug I will now look into). I was wondering how many certificates have negative moduli under the current decoding.
IMHO this change is not justified.
Which file is it in? It's an old self-signed certificate.
Some more results from digging around the Sonar set:
Unknown algorithms are:
That last one,
-- Just wanted to share.