Add rate limiting #57

Open
bpennypacker opened this issue Jun 21, 2016 · 5 comments · May be fixed by #108

Comments

@bpennypacker

bpennypacker commented Jun 21, 2016

It would be extremely helpful if this tool included rate limiting. We've used this a few times, but without rate limiting it's almost useless to us. Most of our web properties are managed through Akamai, and their application firewall will block IP addresses if it detects a high rate of errors. When using Check My Links against a page that has problems with its links, it can quickly trigger the application firewall and block our entire office as a result.

The following rate limit options would be hugely useful to us:

  1. Specify a rate limit for all requests.
  2. Fall back to a lower rate when an error is encountered. If a 4xx or 5xx error is returned, wait for this period of time to elapse before issuing the next request.
  3. Stop checking links altogether if the number of 4xx or 5xx errors reaches a certain threshold.

Using the above options you could set up a scenario along these lines:

  • Wait 0.5 seconds between each request
  • If an error is returned then wait 2 seconds before the next request
  • If more than 10 errors are returned then stop processing any more requests

These sorts of options would make this plug-in infinitely more valuable to people who use it in environments with application firewalls that may rate-limit based on the amount of traffic seen from a given client, the number of errors caused by a client, etc. Rate limiting in application firewalls typically involves rules along the lines of "if you see 10 errors in the span of 5 seconds then block the offending IP for 30 minutes", so the ability to rate limit would let you ensure you never trigger these sorts of rules.
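A minimal sketch of how those three options could fit together, assuming a hypothetical `checkLinksWithRateLimit` helper; the option names and values are illustrative, and nothing here exists in the extension today:

```typescript
// Hypothetical pacing options; none of these exist in the extension yet.
interface RateLimitOptions {
  delayMs: number;       // pause between requests (e.g. 500 ms)
  errorDelayMs: number;  // longer pause after a 4xx/5xx response (e.g. 2000 ms)
  maxErrors: number;     // stop checking once this many errors have been seen
}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Checks links one at a time, pausing between requests and backing off on errors.
async function checkLinksWithRateLimit(urls: string[], opts: RateLimitOptions) {
  let errorCount = 0;
  for (const url of urls) {
    if (errorCount >= opts.maxErrors) {
      console.warn(`Stopping after ${errorCount} errors`);
      return;
    }
    const response = await fetch(url, { method: "HEAD" });
    if (response.status >= 400) {
      errorCount++;
      await sleep(opts.errorDelayMs); // back off after an error
    } else {
      await sleep(opts.delayMs);      // normal pacing between requests
    }
  }
}

// The scenario above: 0.5 s between requests, 2 s after an error, stop after 10 errors.
// checkLinksWithRateLimit(links, { delayMs: 500, errorDelayMs: 2000, maxErrors: 10 });
```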

@jcaloger

jcaloger commented Nov 9, 2016

👍

@gingerling

+1 - same issue with Cloudflare; it registers these plugins as a DDoS.

@oboote

oboote commented Jan 24, 2018

+1; CloudFront starts returning 504s after the first ~50 requests.

The workaround is to enable caching and refresh a few times, validating 50 links at a time.

@revelt

revelt commented May 13, 2019

#92 is related

@Wowfunhappy

Wowfunhappy commented Dec 10, 2021

To add to this, it would be ideal if the extension could back off upon receiving a 429 error in particular.

So, if example.com/a, example.com/b, and example.com/c are in the queue to check:

  1. Extension requests example.com/a (response: 200 Success)
  2. Extension requests example.com/b (response: 429 Too Many Requests)
  3. Extension waits X seconds due to the 429.
  4. Extension requests example.com/b again, since the prior error was a 429. (response: 200 Success)
  5. Extension requests example.com/c.

IMO, this should be the default behavior. A 429 status doesn't really indicate a "broken" link unless it happens repeatedly. The server is just asking the extension to please back off a little, and the extension should politely comply.
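A rough sketch of that retry flow, assuming a hypothetical `checkWithBackoff` helper; the `Retry-After` handling, the 5-second default wait, and the retry cap are assumptions, not current extension behavior:

```typescript
// Retry the same URL after a delay when the server answers 429, instead of
// marking the link broken. The retry cap and 5 s default wait are made-up values.
async function checkWithBackoff(url: string, maxRetries = 3): Promise<Response> {
  for (let attempt = 0; ; attempt++) {
    const response = await fetch(url, { method: "HEAD" });
    if (response.status !== 429 || attempt >= maxRetries) {
      return response; // success, a genuine error, or we give up retrying
    }
    // Honor the server's Retry-After header when present; otherwise wait a default.
    const retryAfter = Number(response.headers.get("Retry-After"));
    const waitMs = Number.isFinite(retryAfter) && retryAfter > 0 ? retryAfter * 1000 : 5000;
    await new Promise((resolve) => setTimeout(resolve, waitMs));
  }
}
```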

@liudongmiao linked a pull request Oct 20, 2022 that will close this issue