
Easier way to simply wait for Rate-Limit instead of throwing RateLimitExceptions #936

Closed
danbopes opened this issue Apr 16, 2022 · 7 comments



danbopes commented Apr 16, 2022

Summary: What are you wanting to achieve?

I'm currently using ASP.NET Core dependency injection with AddHttpClient. When I make calls to certain APIs, I know I can only make so many calls before being rate-limited. When I add the RateLimitAsync policy to my HttpClient builder, it throws an exception instead of simply waiting until more spots free up before making further calls.

builder.Services
    .AddHttpClient<GitHubAPI>()
    .AddPolicyHandler(
        Policy.RateLimitAsync<HttpResponseMessage>(10, TimeSpan.FromMinutes(1)));

Within my controllers/classes, I may make 500 outgoing calls to my API at once. Ideally, as spots free up, I'd like more calls to execute, but NOT throw errors when the policy's rate limit is hit (just wait until a spot frees up, with something like Task.Delay(), and then retry).

Right now, with this code, when I attempt to make a call with _httpClient.GetStringAsync(), I get a RateLimitRejectedException. I don't know how I can simply catch this exception, wait for the duration in its RetryAfter property, and then try again forever.
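In other words, the behavior I'm after is roughly this hand-rolled loop (a sketch only; `url` is a hypothetical variable, and this assumes the rate-limit policy is what throws the RateLimitRejectedException):

```csharp
// Desired behavior: on rejection, wait out RetryAfter and retry forever
// instead of letting the exception propagate to the caller.
while (true)
{
    try
    {
        return await _httpClient.GetStringAsync(url);
    }
    catch (RateLimitRejectedException ex)
    {
        // RetryAfter tells us how long until the limiter will admit a call.
        await Task.Delay(ex.RetryAfter);
    }
}
```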

@martincostello
Member

If you wish to throttle, rather than rate-limit, consider using the bulkhead policy, which will queue requests until there is a free "slot".
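A sketch of what that could look like (parameter values are illustrative, assuming Polly v7's BulkheadAsync):

```csharp
// At most 8 requests execute concurrently; the rest queue until a
// "slot" frees up. int.MaxValue makes the queue effectively unbounded.
var bulkhead = Policy.BulkheadAsync<HttpResponseMessage>(
    maxParallelization: 8,
    maxQueuingActions: int.MaxValue);
```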

@martincostello
Member

There's also some relevant discussion about layering/wrapping policies for rate limits and retries in #930.

@danbopes
Author

I wish to throttle based on the rate limit. The bulkhead only ensures that I make at most X requests at one time. Using my original example above: if I set it to 8 and make 500 requests, it will execute 8, and then as slots free up it will continue executing until I reach 11 requests. Then I get the RateLimitRejectedException. The goal is for the 11th (and subsequent) requests to queue up until the window frees and another request can be made.

@martincostello
Member

The best I can suggest is something based on the linked issue where you combine retries with a rate-limit policy.

@danbopes
Author

The "wrapper" seems closer to what I'm looking for, but I'd like to wait and retry forever based on the RetryAfter value of the RateLimitRejectedException.

var retryPolicy =
    Policy
        .Handle<RateLimitRejectedException>()
        .WaitAndRetryAsync
        (
            retryCount: 3,
            retryNumber => TimeSpan.FromSeconds(intervalInSeconds) // intervalInSeconds defined elsewhere
        )
        .WrapAsync(rateLimitPolicy); // retry is the outer policy, the rate limit the inner

Is this possible?
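Something like the following might do it. This is a sketch, not tested, and it assumes the Polly v7 WaitAndRetryForeverAsync overload whose sleepDurationProvider receives the exception:

```csharp
var rateLimitPolicy =
    Policy.RateLimitAsync<HttpResponseMessage>(10, TimeSpan.FromMinutes(1));

var waitForever = Policy
    .Handle<RateLimitRejectedException>()
    .WaitAndRetryForeverAsync(
        // Sleep for exactly the duration the rate limiter reported.
        sleepDurationProvider: (retryAttempt, exception, context) =>
            (exception as RateLimitRejectedException)?.RetryAfter
                ?? TimeSpan.FromSeconds(1),
        onRetryAsync: (exception, retryAttempt, timespan, context) =>
            Task.CompletedTask);

// Retry (outer) wraps the rate limit (inner), so each rejection is
// absorbed, waited out, and retried rather than propagated to the caller.
var policy = waitForever.WrapAsync(rateLimitPolicy);
```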

@danbopes
Author

Funnily enough, I did wrap the exception as described, and it's working better. I'm just not sure why the devs made the rate limiter treat an allowance of 60 requests per minute as 1 request per second. It's not 1 request per second; it's 60 requests per minute. If I queue up 500 requests, 60 should execute (say these take 5 seconds), and then all further requests should throw RateLimitRejectedException with a roughly 55-second RetryAfter. Setting the burst capacity to 60 doesn't produce that behavior either. The logic just seems awkward and, unfortunately, will force me to write my own throttler logic.
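For context, the behavior described above comes from the policy being a token bucket: the allowance is refilled at a steady rate (here, one token per second), and the burst parameter only caps how many unused tokens can accumulate. A hedged sketch, assuming the Polly v7 RateLimitAsync overload with a maxBurst parameter:

```csharp
// Token bucket: 60 tokens are refilled over one minute (one per second),
// and up to 60 unused tokens can be stored for bursts. After an idle
// minute, 60 calls can fire at once; under sustained load, calls are
// admitted at roughly one per second, not 60 at the top of each minute.
var rateLimitPolicy = Policy.RateLimitAsync<HttpResponseMessage>(
    numberOfExecutions: 60,
    perTimeSpan: TimeSpan.FromMinutes(1),
    maxBurst: 60);
```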
