Use a client side rate limit to reduce chance of getting banned #4637
Conversation
Is this valid for the server to do? Is it a similar magnitude of work to check for empty slots as it is to serve slots with blocks in them? Or should the rate limiting be based on the actual blocks served?
@mcdee I think it's valid. The client is knowingly requesting a high rate of blocks/slots per second, even though it does not know whether there are actually blocks in those slots.
True, but the only reason it is requesting more blocks soon after the last request is that few or no blocks were returned last time around. Given this situation only really occurs when we have finality issues, I'm wondering if being more lenient on the server side here would help the network get back on its feet if such a thing happened in production. I'm not suggesting the server hurt itself in the process, but if it is cheap and easy to return 0 blocks I'd be inclined to reflect that in the rate limiting.
Supporting that type of rate limiting would require some database changes to determine how many blocks are present in a given range without unmarshalling them. As it stands, we check the rate limit and, if OK, we fetch the requested range of blocks. We could enhance this process to charge the limiter only for blocks actually served.
For now it's simpler on the server side to check whether the requested count exceeds the limit and exit early without a database read. Given that the limit allows bursts of 320 blocks per second, I think this is a reasonable allowance for any given peer.
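The early-exit check described above can be sketched with a simple token bucket. This is a minimal illustration, not Prysm's actual implementation; the names `tokenBucket` and `serveBlocksByRange` and the capacity numbers are assumptions for the example.

```go
package main

import (
	"errors"
	"fmt"
	"sync"
	"time"
)

// tokenBucket is a minimal token-bucket rate limiter (illustrative only).
type tokenBucket struct {
	mu       sync.Mutex
	tokens   float64
	capacity float64
	rate     float64 // tokens refilled per second
	last     time.Time
}

func newTokenBucket(capacity, rate float64) *tokenBucket {
	return &tokenBucket{tokens: capacity, capacity: capacity, rate: rate, last: time.Now()}
}

// allow consumes n tokens if available, refilling based on elapsed time.
func (b *tokenBucket) allow(n float64) bool {
	b.mu.Lock()
	defer b.mu.Unlock()
	now := time.Now()
	b.tokens += now.Sub(b.last).Seconds() * b.rate
	if b.tokens > b.capacity {
		b.tokens = b.capacity
	}
	b.last = now
	if b.tokens < n {
		return false
	}
	b.tokens -= n
	return true
}

var errRateLimited = errors.New("rate limited: requested count exceeds allowance")

// serveBlocksByRange rejects over-limit requests before touching the database,
// charging the limiter by the requested count, not by blocks actually present.
func serveBlocksByRange(limiter *tokenBucket, count uint64) error {
	if !limiter.allow(float64(count)) {
		return errRateLimited // exit early, no database read
	}
	// ... fetch and stream the requested range of blocks here ...
	return nil
}

func main() {
	limiter := newTokenBucket(320, 320) // burst of 320 blocks per second
	fmt.Println(serveBlocksByRange(limiter, 320)) // within the burst: <nil>
	fmt.Println(serveBlocksByRange(limiter, 64))  // bucket drained: rate limited
}
```

Because the limiter is charged by the requested count before any database read, a peer asking for huge empty ranges still spends its allowance, which is exactly the behavior debated above.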
Codecov Report
@@           Coverage Diff           @@
##           master    #4637   +/-  ##
======================================
  Coverage     6.9%     6.9%
======================================
  Files         192      192
  Lines       13374    13374
======================================
  Hits          923      923
  Misses      12313    12313
  Partials      138      138
…maticlabs#4637) * Use a client side rate limit to reduce chance of getting banned * fix test Co-authored-by: prylabs-bulldozer[bot] <58059840+prylabs-bulldozer[bot]@users.noreply.github.com>
Resolves #4588
Resolves #4621
Maybe helps #4631
Part of the issue is that when there are large ranges of skip slots, we might be asking the peer for many blocks very quickly. Even when there are no blocks present in that range, the server increments its rate limiter by the requested count. Adding this client-side rate limit should reduce the risk of bombarding peers with requests.