
encoding/asn1: ObjectIdentifier + crypto/x509.ParseCertificate does not support int > 28 bits #19933

enj opened this issue Apr 11, 2017 · 3 comments

@enj enj commented Apr 11, 2017

What version of Go are you using (go version)?

go version go1.7.5 linux/amd64 (should be the same in go 1.8+)

What operating system and processor architecture are you using (go env)?

GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build829310214=/tmp/go-build -gno-record-gcc-switches"

What did you do?

Minimum reproducer:
The following test cert can also reproduce the problem:
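The original reproducer and test certificate are not included here. A minimal sketch of the same failure can be built by hand-encoding a DER OBJECT IDENTIFIER whose last arc needs 29 bits; the OID value below is illustrative, not the one from the real-world certificate:

```go
package main

import (
	"encoding/asn1"
	"fmt"
	"log"
)

// parseWideOID unmarshals a hand-built DER OBJECT IDENTIFIER whose
// last arc is 2^28, which needs 29 bits and therefore five base-128
// octets (81 80 80 80 00). Illustrative, not the original reproducer.
func parseWideOID() (asn1.ObjectIdentifier, error) {
	der := []byte{
		0x06, 0x0A, // OBJECT IDENTIFIER, length 10
		0x2B,                   // 1.3 (encoded as 40*1 + 3)
		0x06, 0x01, 0x04, 0x01, // 6.1.4.1
		0x81, 0x80, 0x80, 0x80, 0x00, // 268435456 (2^28)
	}
	var oid asn1.ObjectIdentifier
	_, err := asn1.Unmarshal(der, &oid)
	return oid, err
}

func main() {
	oid, err := parseWideOID()
	if err != nil {
		// On Go 1.7/1.8 this fails with:
		// asn1: structure error: base 128 integer too large
		log.Fatal(err)
	}
	fmt.Println(oid) // succeeds on Go 1.9+ after the fix below
}
```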

What did you expect to see?

No cert parsing errors.

What did you see instead?

2009/11/10 23:00:00 failed to parse cert: asn1: structure error: base 128 integer too large

The fundamental issue is that asn1.ObjectIdentifier is defined as []int instead of []big.Int. Based on the relevant specification text:

INTEGER, an arbitrary integer.

OBJECT IDENTIFIER, an object identifier, which is a sequence of integer components that identify an object such as an algorithm or attribute type.


The contents octets shall be an (ordered) list of encodings of subidentifiers concatenated together. Each subidentifier is represented as a series of (one or more) octets. Bit 8 of each octet indicates whether it is the last in the series: bit 8 of the last octet is zero; bit 8 of each preceding octet is one. Bits 7 to 1 of the octets in the series collectively encode the subidentifier. Conceptually, these groups of bits are concatenated to form an unsigned binary number whose most significant bit is bit 7 of the first octet and whose least significant bit is bit 1 of the last octet. The subidentifier shall be encoded in the fewest possible octets, that is, the leading octet of the subidentifier shall not have the value 80 (base 16).

it would seem that []int is not the correct type.
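The base-128 subidentifier rule quoted above can be sketched in a few lines. encodeBase128 is a hypothetical helper written for illustration, not an encoding/asn1 internal:

```go
package main

import "fmt"

// encodeBase128 encodes n per the quoted rule: bit 8 is set on every
// octet except the last, bits 7..1 carry the value, and the encoding
// is minimal (the leading octet is never 0x80).
func encodeBase128(n uint64) []byte {
	if n == 0 {
		return []byte{0x00}
	}
	var out []byte
	for ; n > 0; n >>= 7 {
		// Prepend 7 value bits with the continuation bit set.
		out = append([]byte{byte(n&0x7F) | 0x80}, out...)
	}
	out[len(out)-1] &^= 0x80 // clear the continuation bit on the last octet
	return out
}

func main() {
	// 2^28 needs 29 bits, i.e. five base-128 octets.
	fmt.Printf("% X\n", encodeBase128(1<<28)) // 81 80 80 80 00
}
```

Five octets carry 35 bits of payload, which is why a 29-bit arc already exceeds the old four-octet (28-bit) decoder limit.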

However, since ObjectIdentifier is a simple defined type over []int, converting between the two is trivial, and any change to the underlying type would therefore be backwards incompatible. I will leave it to the Go team to decide if that is the case.

We have run into a real-world case where an individual was assigned an OID component that requires 29 bits to represent. At this time they cannot use their certificates, because the current Go implementation is limited to 28 bits. It should be possible to use the full 31+ bits of an int; this would be fully backwards compatible and consistent on all machines. Alternatively, by accumulating into a temporary int64, one could store larger values on machines where int is wider than 32 bits, though that would not be consistent across architectures.
Both of these mitigations are simple, but neither would help in cases such as docker/distribution#1370, where an OID component was used to store a 128-bit UUID. They are also not tolerant of implementations that pad the integer with unnecessary leading zeros.

@bradfitz bradfitz changed the title `encoding/asn1.ObjectIdentifier` + `crypto/x509.ParseCertificate` does not support `int` > 28 bits encoding/asn1: ObjectIdentifier + crypto/x509.ParseCertificate does not support int > 28 bits Apr 11, 2017
@bradfitz bradfitz added this to the Unplanned milestone Apr 11, 2017
@agl agl commented Apr 11, 2017

We would not change the type from int to something larger, and an OID with a value > 2^28 is certainly a little odd. However, I can change the code to be able to exploit the full range of an int.

@gopherbot gopherbot commented Apr 11, 2017

CL mentions this issue.

@gopherbot gopherbot commented Apr 12, 2017

CL mentions this issue.

@gopherbot gopherbot closed this in 94aba76 Apr 13, 2017
lparth added a commit to lparth/go that referenced this issue Apr 13, 2017
The current implementation uses a max of 28 bits when decoding an
ObjectIdentifier.  This change makes it so that an int64 is used to
accumulate up to 35 bits.  If the resulting data would not overflow
an int32, it is used as an int.  Thus up to 31 bits may be used to
represent each subidentifier of an ObjectIdentifier.

Fixes golang#19933

Change-Id: I95d74b64b24cdb1339ff13421055bce61c80243c
Reviewed-by: Adam Langley <>
Run-TryBot: Adam Langley <>
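The approach described in the commit message can be sketched as follows. This is a simplified illustration of the idea (accumulate into int64, cap at five octets, accept only values that fit an int32), with hypothetical names rather than the actual parseBase128Int code:

```go
package main

import (
	"errors"
	"fmt"
	"math"
)

// parseBase128 accumulates base-128 octets into an int64, reading at
// most 5 octets (35 bits of payload), and accepts the result only if
// it fits in an int32. This yields up to 31 bits per subidentifier,
// consistently on both 32-bit and 64-bit architectures.
func parseBase128(bytes []byte) (value int, offset int, err error) {
	var ret64 int64
	for shifted := 0; offset < len(bytes); shifted++ {
		if shifted == 5 {
			// A sixth octet would exceed the 35-bit accumulator budget.
			return 0, 0, errors.New("base 128 integer too large")
		}
		b := bytes[offset]
		ret64 <<= 7
		ret64 |= int64(b & 0x7F)
		offset++
		if b&0x80 == 0 { // bit 8 clear: last octet of the series
			if ret64 > math.MaxInt32 {
				return 0, 0, errors.New("base 128 integer too large")
			}
			return int(ret64), offset, nil
		}
	}
	return 0, 0, errors.New("truncated base 128 integer")
}

func main() {
	v, n, err := parseBase128([]byte{0x81, 0x80, 0x80, 0x80, 0x00})
	fmt.Println(v, n, err) // 268435456 5 <nil>
}
```

Because the int32 check runs on every platform, a 29-bit arc like 2^28 now decodes everywhere, while anything over 31 bits still fails deterministically rather than varying with the width of int.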
@golang golang locked and limited conversation to collaborators Apr 13, 2018