
add BlockBatchLimit comment: Maximum 128 #13502

Merged: 3 commits into prysmaticlabs:develop on Feb 15, 2024

Conversation

@tmors (Contributor) commented Jan 23, 2024

What type of PR is this?

Documentation

What does this PR do? Why is it needed?
As mentioned in the issue, this makes users aware that BlockBatchLimit has a range limit, especially since it was changed in Dencun.

Which issues(s) does this PR fix?

Fixes #13499

Other notes for review

@tmors tmors requested a review from a team as a code owner January 23, 2024 05:38
@CLAassistant commented Jan 23, 2024

CLA assistant check: All committers have signed the CLA.

@james-prysm james-prysm added the Cleanup Code health! label Jan 23, 2024
@@ -158,7 +158,7 @@ var (
 	// BlockBatchLimit specifies the requested block batch size.
 	BlockBatchLimit = &cli.IntFlag{
 		Name:  "block-batch-limit",
-		Usage: "The amount of blocks the local peer is bounded to request and respond to in a batch.",
+		Usage: "The amount of blocks the local peer is bounded to request and respond to in a batch. Maximum 128",
Review comment (Contributor):
This number changes depending on the network, so just saying "Maximum 128" might not be great; we should think of a better way to word this.

@tmors (Contributor, author) replied:

Got it. It seems it's not ideal to word this here because the range is dynamic.
What about adding a parameter check in fetchBlocksFromPeer so users can notice the limit without enabling Debug? Otherwise the user would keep waiting with no error message at all. Something like this:

maxRequest := params.MaxRequestBlock(slots.ToEpoch(start))
if count > maxRequest {
    log.WithField("count", count).WithField("maxRequest", maxRequest).Error("Requested count is too high, reducing")
}
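The idea in the snippet above amounts to clamping a requested block count to the network's per-request maximum. A minimal, self-contained sketch of that logic follows; `clampBatchCount` and the hard-coded limits are illustrative only (in Prysm the limit comes from `params.MaxRequestBlock(slots.ToEpoch(start))`, which is network- and epoch-dependent):

```go
package main

import "fmt"

// clampBatchCount bounds a requested block count to the per-request
// maximum. Hypothetical helper, not part of the Prysm API.
func clampBatchCount(count, maxRequest uint64) uint64 {
	if count > maxRequest {
		// In a real client this is where a warning would be logged,
		// so the user sees why fewer blocks come back than requested.
		return maxRequest
	}
	return count
}

func main() {
	// Deneb lowered the max blocks per request to 128 (it was 1024 before).
	fmt.Println(clampBatchCount(256, 128)) // prints 128
	fmt.Println(clampBatchCount(64, 128))  // prints 64
}
```

The key point, matching the review discussion, is that the cap is not a single constant: it varies by network and by fork epoch, so any user-facing wording or validation has to look the limit up rather than hard-code 128.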

@prestonvanloon (Member) left a comment:

Let's merge this as-is for deneb and we can iterate / improve it later as needed.

@prestonvanloon prestonvanloon added this pull request to the merge queue Feb 15, 2024
@github-merge-queue github-merge-queue bot removed this pull request from the merge queue due to failed status checks Feb 15, 2024
@prestonvanloon prestonvanloon added this pull request to the merge queue Feb 15, 2024
Merged via the queue into prysmaticlabs:develop with commit 05b2795 Feb 15, 2024
17 checks passed
Labels: Cleanup Code health!
Projects: None yet
Successfully merging this pull request may close these issues:
- It's better add comments on param BlockBatchLimit
5 participants