Fix OOM when deserializing UTXO entries with invalid length #7933
Thanks to @pstratem for finding this.
The normal vector deserializer reads data in chunks of at most 5 MB, preventing OOM when insane vector lengths are encoded. This protection is not present in CScriptCompressor's specialized deserializer, however, resulting in a potential OOM when very large length descriptors exist, as the target CScript is resized before attempting to read that much data.
However, CScripts have a maximum length above which they are always invalid. We can treat scriptPubKeys longer than that as unspendable, preventing them from even entering the UTXO set, and skipping over their data when deserializing.
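A minimal sketch of the guard described above, not the actual patch: the constant values, function name, and stream types here are illustrative stand-ins for Bitcoin Core's internals (`MAX_SCRIPT_SIZE`, `OP_RETURN`, `CScriptCompressor::Unserialize`). The point is that an over-long claimed length is replaced by a short unspendable script and the encoded bytes are skipped, instead of resizing the target to the claimed length before reading.

```cpp
#include <cstdint>
#include <istream>
#include <vector>

// Illustrative stand-ins for the codebase's constants/types (assumptions).
static const uint64_t MAX_SCRIPT_SIZE = 10000; // scripts longer than this are always invalid
static const unsigned char OP_RETURN = 0x6a;   // makes a script provably unspendable

using Script = std::vector<unsigned char>;

// Deserialize a scriptPubKey whose claimed length is nSize.
// If nSize exceeds the maximum valid script size, do NOT allocate nSize
// bytes; instead store a short unspendable placeholder and skip the data.
void UnserializeScript(std::istream& s, uint64_t nSize, Script& script)
{
    if (nSize > MAX_SCRIPT_SIZE) {
        script.assign(1, OP_RETURN);                    // unspendable placeholder
        s.ignore(static_cast<std::streamsize>(nSize));  // skip, don't allocate
        return;
    }
    script.resize(nSize);                               // bounded, safe allocation
    s.read(reinterpret_cast<char*>(script.data()),
           static_cast<std::streamsize>(nSize));
}
```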
Note that none of this is exposed to the network, as the P2P code uses normal (pre)vectors, which do have this OOM protection directly in serialize.h.
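For contrast, a sketch of the chunked-read protection the normal vector deserializer provides (names and the exact chunk size here are assumptions; the real logic lives in serialize.h): the vector grows in bounded increments, so a bogus length fails on the stream read rather than triggering one huge up-front allocation.

```cpp
#include <algorithm>
#include <cstdint>
#include <ios>
#include <istream>
#include <vector>

static const uint64_t MAX_CHUNK = 5 * 1024 * 1024; // read at most 5 MB per step

// Grow the vector incrementally; a wrong/corrupted length exhausts the
// stream and throws long before an insane allocation can happen.
void UnserializeVector(std::istream& s, uint64_t nSize,
                       std::vector<unsigned char>& v)
{
    v.clear();
    uint64_t nRead = 0;
    while (nRead < nSize) {
        uint64_t blk = std::min(nSize - nRead, MAX_CHUNK);
        v.resize(nRead + blk);
        s.read(reinterpret_cast<char*>(v.data() + nRead),
               static_cast<std::streamsize>(blk));
        if (!s) throw std::ios_base::failure("vector data truncated");
        nRead += blk;
    }
}
```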
Can confirm the test fails: