
feat: Exponential backoff factor for HTTP request retries #3358

Merged
merged 1 commit into ethereum:main on Apr 24, 2024

Conversation

@hr98w hr98w commented Apr 23, 2024

What was wrong?

Related to Issue #3210

How was it fixed?

Make backoff_factor on the ExceptionRetryConfiguration class work as an actual exponential backoff factor, and update the docs.

Reuse backoff_factor as the initial back off time

Introduce a new field exponential_factor
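
A minimal sketch of this initially proposed shape. The exponential_factor field, its default, and the sleep_time helper are illustrative assumptions for this discussion, not the API that was ultimately merged:

```python
from dataclasses import dataclass


@dataclass
class ExceptionRetryConfiguration:
    retries: int = 5
    backoff_factor: float = 0.125      # reused as the initial backoff time, in seconds
    exponential_factor: float = 2.0    # hypothetical per-retry multiplier


def sleep_time(config: ExceptionRetryConfiguration, attempt: int) -> float:
    # the first retry sleeps for backoff_factor; each further retry
    # multiplies the previous sleep by exponential_factor
    return config.backoff_factor * config.exponential_factor ** attempt
```

Setting exponential_factor to 1.0 here would disable exponential growth entirely, which is the configurability trade-off discussed below.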

Todo:

  • Clean up commit history
  • Add or update documentation related to these changes
  • Add entry to the release notes

Cute Animal Picture

[Screenshot 2024-04-24 at 10:40:03]

@hr98w
Author

hr98w commented Apr 23, 2024

@fselmo as per this issue #3210
I was considering adding a new field initial_backoff_time and reusing backoff_factor as the exponential factor. But that would affect existing usage of backoff_factor and might cause backward compatibility issues. Please help review the PR and share your thoughts.

@fselmo
Collaborator

fselmo commented Apr 23, 2024

Hey @hr98w. Thanks for getting this started. I'm not sure that we need to add yet another configuration option here. If we make the sleep time itself exponential, the backoff factor can apply directly to that. Something like:

for i in range(self.exception_retry_configuration.retries):
    try:
        return make_post_request(
            self.endpoint_uri, request_data, **self.get_request_kwargs()
        )
    except tuple(self.exception_retry_configuration.errors) as e:
        if i < self.exception_retry_configuration.retries - 1:
            time.sleep(
                # the `backoff_factor` works as a factor on the inherent
                # exponential setup of 2**i
                self.exception_retry_configuration.backoff_factor * 2**i
            )
            continue
        else:
            raise e
return None

Thoughts on that?
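
As a rough sanity check on this shape (the helper name here is illustrative, not web3.py API), the cumulative time spent sleeping before the final attempt is a geometric sum:

```python
def total_backoff(backoff_factor: float, retries: int) -> float:
    # a sleep happens after every failed attempt except the last,
    # so there are retries - 1 sleeps of backoff_factor * 2**i each
    return sum(backoff_factor * 2 ** i for i in range(retries - 1))


print(total_backoff(0.5, 5))  # 0.5 + 1 + 2 + 4 = 7.5 seconds
```

This is why the choice of backoff_factor matters: it scales the whole worst-case wait linearly.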

@hr98w
Author

hr98w commented Apr 23, 2024

@fselmo sure, this also works.

I introduced a new variable to let users determine the exponential factor. Keeping that factor at 1 means no exponential backoff. We can keep it simple here; I'll follow your advice.

@fselmo
Collaborator

fselmo commented Apr 23, 2024

I introduced a new variable to let users determine the exponential factor. Keeping that factor at 1 means no exponential backoff. We can keep it simple here; I'll follow your advice.

I hear you. That would allow for more configuration for sure. Exponential is pretty standard and keeping that factor low still allows for pretty quick succession if desired. If you're good with my simpler approach then let's go for it 👍🏼

@fselmo
Collaborator

fselmo commented Apr 23, 2024

We should maybe decrease the default value here to 0.125? That would yield 0.125 -> 0.25 -> 0.5 -> 1 -> 2. That feels pretty good for JSON-RPC calls. Thoughts there?
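
That progression can be checked directly against the backoff_factor * 2**i formula from the earlier snippet:

```python
backoff_factor = 0.125  # proposed default
delays = [backoff_factor * 2 ** i for i in range(5)]
print(delays)  # [0.125, 0.25, 0.5, 1.0, 2.0]
```

Since 0.125 and its doublings are exact binary fractions, the sequence incurs no floating-point drift.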

@hr98w
Author

hr98w commented Apr 23, 2024

Starting from 0.5 seems too large to me. But I'm not sure about JSON-RPC calls; any latency stats you can share?

@hr98w
Author

hr98w commented Apr 23, 2024

@fselmo Updated the PR; please review.

Collaborator

@fselmo fselmo left a comment


With a few tweaks to the wording this LGTM; I left some suggestions. The important thing here is clarity, and I don't think we need to be overly technical in the documentation description since we can describe the doubling strategy conceptually. A curious user can find the technical details in the implementation. I'd love your thoughts on it, though. Thanks again for getting the ball rolling on this... this is one of the final pieces for getting v7 wrapped up 👍🏼

docs/internals.rst (outdated; resolved)
docs/internals.rst (outdated; resolved)
docs/internals.rst (outdated; resolved)
newsfragments/3358.breaking.rst (outdated; resolved)
@hr98w
Author

hr98w commented Apr 24, 2024

@fselmo Thanks for your comments! I got your point about keeping it easy to understand, and I've resolved them.

@hr98w hr98w requested a review from fselmo April 24, 2024 02:58
docs/internals.rst (outdated; resolved)
…iders.

- update docs
- decrease default backoff
- fix lint
- add newsfragment
@fselmo fselmo merged commit b8c1f82 into ethereum:main Apr 24, 2024
57 checks passed
@fselmo
Collaborator

fselmo commented Apr 24, 2024

Thanks again 👍🏼
