Implements ShardLayout and ShardConfig #4614
Conversation
Still adding tests
if is_top_level_account(fixed_account, account_id) {
    return shard_id as ShardId;
}
I am a bit confused by this. Could you explain the motivation here?
The comments are in ShardLayoutV1.
For this shard layout, there are some fixed shards; each fixed shard is mapped to an account and all subaccounts of that account. The rest of the accounts are divided by boundary_accounts. For example, for ShardLayoutV1 { fixed_shards: ["aurora"], boundary_accounts: ["near"] }:
Account "aurora" and all its subaccounts will be mapped to shard_id 0. For the rest of the accounts, accounts <= "near" will be mapped to shard_id 1 and accounts > "near" will be mapped to shard_id 2.
Account "aurora" and all its subaccounts will be mapped to shard_id 0
Is this actually what we want? Also, adding a comment here would be helpful :)
Sorry, I missed this comment.
I thought this is what we want, but maybe I misunderstood something. Is your question here regarding whether we should map all subaccounts of "aurora" to shard_id 0? I'll add a comment here.
Is your question here regarding whether we should map all subaccounts of "aurora" to shard_id 0?
Yes. I don't think we've agreed on that. Might require more discussion. @djsatok is there a plan to create aurora subaccounts? If so, is it important that they are on the same shard as aurora?
I don't have enough context about sharding, so I cannot review this PR properly. Let me know if GitHub requires my review as a codeowner of some files, but the decision on whether the implemented sharding mechanics make sense should be made before that.
    (epoch_info.protocol_version(), epoch_info.seat_price())
};
let config = self.config.for_protocol_version(protocol_version);
let stake_divisor = { config.minimum_stake_divisor as Balance };
Nit, not related to this PR, but the divisor being a Balance seems weird: if we divide NEAR by NEAR, we get just a number, not something measured in NEAR.
Yeah, might be better to write it as u128 instead.
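For illustration only, a rough sketch of what the suggested tweak could look like; the function name, the use of seat_price, and the types are assumptions rather than the actual nearcore code.

```rust
// Hypothetical sketch: keep the divisor as a plain u128 ratio rather than a
// Balance, since dividing a Balance by a Balance yields a unitless number.
type Balance = u128; // assumed alias, matching how Balance is used above

fn minimum_stake(seat_price: Balance, minimum_stake_divisor: u64) -> Balance {
    let stake_divisor = minimum_stake_divisor as u128;
    seat_price / stake_divisor
}
```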
I'll do that in a separate PR
@frol, yes, the sharding mechanics implemented in this PR are recorded in NEP near/NEPs#241; see the sections on protocol change and protocol-level sharding representation. You don't have to read it, of course, but in case you are interested.
Approving to unblock, but please address the comments
let mut config = genesis_epoch_config.clone();
if let Some(ShardConfig {
    num_block_producer_seats_per_shard,
    avg_hidden_validator_seats_per_shard,
    shard_layout,
}) = simple_nightshade_shard_config
{
    config.num_block_producer_seats_per_shard = num_block_producer_seats_per_shard;
    config.avg_hidden_validator_seats_per_shard = avg_hidden_validator_seats_per_shard;
    config.shard_layout = shard_layout;
}
Suggested change (replacing the block above):

let config = match simple_nightshade_shard_config {
    Some(config) => config.clone(),
    None => genesis_epoch_config.clone()
};
ShardConfig is different from EpochConfig here: simple_nightshade_shard_config is a ShardConfig, while config is an EpochConfig.
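To illustrate the type mismatch, here is a hedged sketch of the two shapes; the field lists are partly placeholders rather than the exact nearcore definitions.

```rust
// Why the suggested `match` would not type-check: simple_nightshade_shard_config
// is an Option<ShardConfig>, while `config` is an EpochConfig, so the shard-related
// fields are copied into the EpochConfig rather than replacing it wholesale.
pub struct ShardLayout; // stand-in for the real type

pub struct ShardConfig {
    pub num_block_producer_seats_per_shard: Vec<u64>,
    pub avg_hidden_validator_seats_per_shard: Vec<u64>,
    pub shard_layout: ShardLayout,
}

pub struct EpochConfig {
    pub num_block_producer_seats_per_shard: Vec<u64>,
    pub avg_hidden_validator_seats_per_shard: Vec<u64>,
    pub shard_layout: ShardLayout,
    pub epoch_length: u64, // ...plus other epoch-wide fields that ShardConfig lacks
}
```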
@bowenwang1996 thanks! Will address all comments before I merge the PR.
@mzhangmzz are we ready to merge this one?
Yes, merge now.
Part of NEP near/NEPs#241