MetadataRenderer: Block hash will provide poor randomness post-Merge #417

Comments
Always was bad RNG

duplicate #151

This looks like the best, making it primary
I want to point out #242 as another good finding here. Out of everyone who looked at this line and identified weak randomness, only 0xA5DF and Lambda noticed the biggest issue:
My specific criticism against the PRNG findings being of Medium severity is that, in the context of this codebase, randomness is more a tool for storytelling than a tool for scarcity. The distribution of properties is linear, meaning the PRNG is the only way scarcity in features can arise. If any specific individual or group were to monitor the chain, they could predictably decide when to settle an auction and trigger the next, meaning that a recognizably rare token (where rarity is perceived or actualized) can be created. At that point the system would hold an auction where most willing participants recognize the value of the new token, so by "gaming" the system, the system is gaining. If this were abused to always mint the scarcest traits, then over time those traits would simply lose their rarity. And because all features have a linear distribution, over enough time no specific trait is expected to emerge as dominant, although, per the above, some traits may become rarer or scarcer.

All in all, I think rarity is secondary here, as no promise of fair distribution is made. I will confirm on a technical level that there are better ways to source randomness (the obvious one being VRF); however, because traits are secondary rather than the main point of the collection (as in, there are no weights to determine whether a feature should be scarce or common), I think Low severity is more appropriate. In a different contest where scarcity is essential (see Meebits, for example), this finding may even be of High severity.
L
Lines of code
https://github.com/code-423n4/2022-09-nouns-builder/blob/7e9fddbbacdd7d7812e912a369cfd862ee67dc03/src/token/metadata/MetadataRenderer.sol#L249-L252
Vulnerability details
The MetadataRenderer uses the concatenation of the token ID and several block attributes to generate a pseudo-random seed. Although these block attributes can be manipulated by motivated adversaries, this is a common pattern for "good enough" pseudorandomness: MetadataRenderer#_generateSeed
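The linked lines define a seed derivation along these lines (a sketch for context; the permalink above is the authoritative version and may differ in detail):

```solidity
// Sketch of MetadataRenderer#_generateSeed as referenced above.
// The seed is a hash of the token ID concatenated with block attributes,
// all of which are visible to, and partially influenced by, block producers.
function _generateSeed(uint256 _tokenId) private view returns (uint256) {
    return uint256(
        keccak256(abi.encode(_tokenId, blockhash(block.number), block.coinbase, block.timestamp))
    );
}
```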
However, post-Merge, the pseudorandomness provided by blockhash will be much weaker, since it will no longer contain the output of proof-of-work hashing. Instead, the recommended source of onchain pseudorandomness is the prevRandao value from beacon chain state.

Post-Merge, the current DIFFICULTY opcode (0x44) will become PREVRANDAO and return the prevRandao value from the beacon chain. In current versions of Solidity, block.difficulty will return this value, since it uses opcode 0x44 under the hood.

See EIP-4399 and Solidity issue ethereum/solidity#13512 for more information on changes to DIFFICULTY and block.difficulty.

Impact
Random seeds may be more easily manipulable post-Merge, granting attackers control over which token attributes are generated at specific times.
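As a rough illustration of that impact, a settler contract could precompute the candidate seed inside the settlement transaction and only proceed when the predicted attributes are desirable. The interface, function names, and exact seed formula below are assumptions for illustration only, not the project's actual API:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical settlement entry point; stand-in for whatever the deployed
// auction contract actually exposes.
interface IAuction {
    function settleCurrentAndCreateNewAuction() external;
}

contract SelectiveSettler {
    IAuction public immutable auction;

    constructor(IAuction _auction) {
        auction = _auction;
    }

    // Settle (and so mint the next token) only when the predicted seed falls in
    // the bucket the caller wants; otherwise revert and retry in a later block.
    // The seed formula mirrors the sketch above, not necessarily the deployed code.
    function settleIfDesirable(uint256 nextTokenId, uint256 wantedBucket, uint256 bucketCount) external {
        uint256 seed = uint256(
            keccak256(abi.encode(nextTokenId, blockhash(block.number), block.coinbase, block.timestamp))
        );
        require(seed % bucketCount == wantedBucket, "undesirable seed, try later");
        auction.settleCurrentAndCreateNewAuction();
    }
}
```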
Recommendation
Add block.difficulty as a component of the seed value: