
MetadataRenderer: Block hash will provide poor randomness post-Merge #417

Closed
code423n4 opened this issue Sep 15, 2022 · 6 comments
Labels
bug Something isn't working duplicate This issue or pull request already exists QA (Quality Assurance) Assets are not at risk. State handling, function incorrect as to spec, issues with clarity, syntax

Comments

@code423n4
Contributor

Lines of code

https://github.com/code-423n4/2022-09-nouns-builder/blob/7e9fddbbacdd7d7812e912a369cfd862ee67dc03/src/token/metadata/MetadataRenderer.sol#L249-L252

Vulnerability details

The MetadataRenderer uses the concatenation of token ID and several block attributes to generate a pseudo-random seed. Although these block attributes can be manipulated by motivated adversaries, this is a common pattern for "good enough" pseudorandomness:

MetadataRenderer#_generateSeed

    /// @dev Generates a pseudo-random seed for a token id
    function _generateSeed(uint256 _tokenId) private view returns (uint256) {
        return uint256(keccak256(abi.encode(_tokenId, blockhash(block.number), block.coinbase, block.timestamp)));
    }

However, post-Merge, the pseudorandomness provided by blockhash will be much weaker, since it will no longer contain the output of proof-of-work hashing. Instead, the recommended source of onchain pseudorandomness is the prevRandao value from beacon chain state.

Post-Merge, the current DIFFICULTY opcode (0x44) will become PREVRANDAO and return the prevRandao value from the beacon chain. In current versions of Solidity, block.difficulty returns this value, since it uses opcode 0x44 under the hood.

See EIP-4399 and Solidity issue ethereum/solidity#13512 for more information on changes to DIFFICULTY and block.difficulty.
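For reference, a seed that reads the PREVRANDAO value by name might look like the sketch below. This is illustrative only, not code from the audited repository, and it assumes Solidity >= 0.8.18, where the opcode is exposed as `block.prevrandao` (on earlier 0.8.x versions the same value is read via `block.difficulty`):

```solidity
// Illustrative sketch, not project code. Solidity >= 0.8.18 exposes the
// beacon chain randomness as block.prevrandao; earlier 0.8.x compilers
// read the same opcode (0x44) through block.difficulty.
function _generateSeed(uint256 _tokenId) private view returns (uint256) {
    return uint256(
        keccak256(
            abi.encode(_tokenId, block.prevrandao, block.coinbase, block.timestamp)
        )
    );
}
```

Note that prevRandao is still only resistant to manipulation by non-proposers; a block proposer retains limited influence, so it remains "good enough" pseudorandomness rather than a VRF substitute.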

Impact

Random seeds may be more easily manipulable post-Merge, granting attackers control over which token attributes are generated at specific times.

Recommendation

Add block.difficulty as a component of the seed value:

    /// @dev Generates a pseudo-random seed for a token id
    function _generateSeed(uint256 _tokenId) private view returns (uint256) {
        return uint256(keccak256(abi.encode(_tokenId, block.difficulty, blockhash(block.number), block.coinbase, block.timestamp)));
    }
@code423n4 code423n4 added 2 (Med Risk) Assets not at direct risk, but function/availability of the protocol could be impacted or leak value bug Something isn't working labels Sep 15, 2022
code423n4 added a commit that referenced this issue Sep 15, 2022
@GalloDaSballo
Collaborator

Always was bad RNG

@eugenioclrc

duplicate #151

@GalloDaSballo
Collaborator

This looks like the best report; making it primary.

@horsefacts

I want to point out #242 as another good finding here. Out of everyone who looked at this line and identified weak randomness, only 0xA5DF and Lambda noticed the biggest issue: blockhash(block.number) always returns zero and doesn't provide any randomness at all!
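To illustrate that point (sketch only, not from the codebase): the EVM's BLOCKHASH opcode is only defined for the 256 most recent blocks excluding the block currently being built, so hashing the current block number contributes a constant zero to the seed:

```solidity
// Sketch: blockhash returns a non-zero value only for the previous
// 256 blocks, never for the block currently being produced.
function demonstrate() external view returns (bytes32 current, bytes32 prior) {
    current = blockhash(block.number);     // always bytes32(0)
    prior = blockhash(block.number - 1);   // actual hash of the parent block
}
```

Using `blockhash(block.number - 1)` would at least mix in a real hash, though post-Merge even that value is far more predictable than under proof-of-work.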

@GalloDaSballo
Collaborator

My specific criticism of the PRNG findings being rated Medium severity is that, in the context of this codebase, randomness is more a tool for storytelling than a tool for scarcity.

Distribution of properties is linear, meaning the PRNG is the only mechanism that creates scarcity in features.

If any specific individual or group were to monitor the chain, they could predictably decide when to settle an auction and trigger the next, meaning that a recognizably rare token (where rarity is perceived or actual) can be created.

At that point the system would have an auction where most willing participants recognize the value of the new token, meaning that rather than "gaming" the system, the system is gaining.

If this were abused to always mint the scarcest trait, then over time that scarce trait would simply lose its rarity.

And because all features have a linear distribution, over enough time no specific trait is expected to dominate, although, per the above, some traits may become rarer or scarcer.

All in all I think the rarity is secondary here, as no promise of fair distribution is made.

Because of that, I will confirm on a technical level that there are better ways to source randomness (the obvious one being a VRF); however, because traits are secondary rather than the main point of the collection (i.e., there are no weights to make a feature scarce or common), I think Low severity is more appropriate.

In a different contest, where scarcity is essential (see Meebits, for example), this finding might even be High severity. However, in this codebase, where scarcity takes a backseat, I don't think gaming the system can do anything besides create more incentive for fundraising, which actually helps the project rather than harming it.

@GalloDaSballo GalloDaSballo added QA (Quality Assurance) Assets are not at risk. State handling, function incorrect as to spec, issues with clarity, syntax and removed 2 (Med Risk) Assets not at direct risk, but function/availability of the protocol could be impacted or leak value labels Sep 25, 2022
@GalloDaSballo
Collaborator

L

@JeeberC4 JeeberC4 added the duplicate This issue or pull request already exists label Oct 5, 2022
@JeeberC4 JeeberC4 closed this as completed Oct 5, 2022