
Request flooding bypasses rate limit to some extent #1064

Closed
zhorakaroy opened this issue Dec 30, 2022 · 6 comments · Fixed by #1081
Assignees: kkoomen
Labels: bug, enhancement, good first issue

Comments


zhorakaroy commented Dec 30, 2022

The service doesn't work properly when there are more than 5 requests in a second, for example when the limit is 5 requests per second.

If we make the requests with Postman or a browser, everything works fine.

But if we use a script to make, for example, 1000 requests per second, the guard does not work and forwards the requests to the controller. The route uses @Throttle(5, 1).
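
For context, a minimal controller along these lines (my own reconstruction of the reported setup, not code taken from the issue; the route path and return value are assumptions):

```ts
import { Controller, Get } from '@nestjs/common';
import { Throttle } from '@nestjs/throttler';

@Controller('v1/test')
export class TestController {
  // Allow at most 5 requests per 1-second window for this route.
  // The ThrottlerGuard still has to be bound (e.g. globally) for this
  // metadata to be enforced.
  @Throttle(5, 1)
  @Get()
  getTest(): string {
    return 'ok';
  }
}
```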

I changed the storage from Redis to in-memory and it works fine for 1000 requests per second: after 5 requests the guard kicks in.
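
For completeness, this is roughly how the Redis storage is wired in at the module level (a sketch based on the package READMEs, not the reporter's actual configuration; the no-argument constructor and connection defaults are assumptions):

```ts
import { Module } from '@nestjs/common';
import { APP_GUARD } from '@nestjs/core';
import { ThrottlerModule, ThrottlerGuard } from '@nestjs/throttler';
import { ThrottlerStorageRedisService } from 'nestjs-throttler-storage-redis';

@Module({
  imports: [
    ThrottlerModule.forRoot({
      ttl: 1,
      limit: 5,
      // Omitting `storage` falls back to the built-in in-memory storage,
      // which is the variant that throttled correctly in this test.
      storage: new ThrottlerStorageRedisService(),
    }),
  ],
  providers: [
    // Bind the guard globally so @Throttle() metadata is enforced on every route.
    { provide: APP_GUARD, useClass: ThrottlerGuard },
  ],
})
export class AppModule {}
```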

As I understand it, the logic in the addRecord and getRecord functions is too slow, which is why it works when we make requests with Postman but not when we use a script for testing.

kkoomen (Owner) commented Jan 3, 2023

Which version of nestjs-throttler-storage-redis are you currently using? This should not be a problem with the latest 0.2.2 version.

zhorakaroy (Author) commented

The version is 0.2.2.

The issue is related to the Redis storage, because without RedisStorage (i.e. with the in-memory storage) the guard works perfectly.

I think the logic in addRecord and getRecord is too slow, so it can't keep up with the accumulating requests. @kkoomen

zhorakaroy (Author) commented

This is the script I use for testing.

```js
const axios = require('axios');

for (let i = 0; i < 1000; i++) {
  axios
    .get('http://localhost:9000/v1/test')
    .then(({ data }) => console.count())
    .catch((err) => {
      console.log(err.toString());
    });
}
```

kkoomen pinned this issue Jan 4, 2023
kkoomen self-assigned this Jan 4, 2023
kkoomen added the bug label Jan 4, 2023
kkoomen (Owner) commented Jan 4, 2023

Okay so, this is a tougher problem (to me) than it initially seemed. I found out that express-rate-limit with rate-limit-redis does not have this problem, and I know why.

Have a look at the express-rate-limit: https://github.com/express-rate-limit/express-rate-limit/blob/master/source/lib.ts#L230-L369
and also the rate-limit-redis: https://github.com/wyattjoh/rate-limit-redis/blob/main/src/lib.ts

So, how the @nestjs/throttler guard (see here) works is by first calling getRecord() in order to get the current TTLs (partially based on the req.ip), and then, if the limit has not been exceeded, it calls addRecord().

The way express-rate-limit works is by calling an increment function (see here) - which is similar to addRecord() in @nestjs/throttler - where it increases the number of requests made (based on the req.ip) and returns the new count along with the remaining time. So after running the above script to fire 1000 requests, the value in Redis will eventually be ±1000. If this is >= the specified limit, a 429 error is thrown. rate-limit-redis implements its own increment using EVALSHA with a custom Lua script that does all of this in a single call. I believe this is the reason they don't face this issue, because everything is done in a single call.
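
For illustration, a minimal sketch of what such an atomic increment can look like with ioredis (this is not the actual rate-limit-redis script; the key handling and the use of EVAL instead of EVALSHA are my own simplifications):

```ts
import Redis from 'ioredis';

const redis = new Redis();

// Increment the hit counter for a key and start its TTL on the first hit.
// Everything runs inside one Lua script, so two concurrent requests can
// never both observe the same stale count.
const INCREMENT_SCRIPT = `
  local totalHits = redis.call('INCR', KEYS[1])
  if totalHits == 1 then
    redis.call('PEXPIRE', KEYS[1], ARGV[1])
  end
  return { totalHits, redis.call('PTTL', KEYS[1]) }
`;

async function increment(key: string, ttlMs: number) {
  const [totalHits, timeToExpire] = (await redis.eval(
    INCREMENT_SCRIPT,
    1, // number of keys
    key,
    ttlMs,
  )) as [number, number];
  return { totalHits, timeToExpire };
}
```

The guard then only has to compare totalHits against the limit; the read and the write can no longer be interleaved by another request.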

To make it clearer: express-rate-limit gets triggered, does +1 on the number of requests recorded in Redis, and if another request comes in, it already sees the +1 from the previous request, because that value is already in Redis.

The problem with @nestjs/throttler is that one request reads the record and then calls addRecord, while the next request does the same at almost the same time. Let's say that at this point we have 4 recorded requests and 5 are allowed: only the first one should be allowed, but that isn't the case, because both requests see 4 recorded requests and will therefore both execute.
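
To illustrate the race, here is a simplified, hypothetical sketch of that flow (not the actual @nestjs/throttler code; the interface below only mirrors the getRecord/addRecord shape described above):

```ts
interface ThrottlerStorageLike {
  // Returns the TTLs of the hits currently recorded for this key.
  getRecord(key: string): Promise<number[]>;
  // Records a new hit that expires after ttlSeconds.
  addRecord(key: string, ttlSeconds: number): Promise<void>;
}

async function handleRequest(
  storage: ThrottlerStorageLike,
  key: string,        // e.g. derived from req.ip
  limit: number,      // e.g. 5
  ttlSeconds: number, // e.g. 1
): Promise<void> {
  // Step 1: read. Two concurrent requests can both read "4 hits" here
  // before either of them has written anything.
  const ttls = await storage.getRecord(key);

  // Step 2: check. With limit = 5 and 4 recorded hits, both concurrent
  // requests pass this check.
  if (ttls.length >= limit) {
    throw new Error('Too Many Requests'); // results in a 429
  }

  // Step 3: write. Both requests add a record, so 6 requests get through
  // even though only 5 were allowed. With 1000 requests fired at once,
  // far more slip past the limit.
  await storage.addRecord(key, ttlSeconds);

  // ...request is forwarded to the controller
}
```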

@jmcdo29 I need your input on this one.

@Zizitto @zhorakaroy if you guys have a possible solution that doesn't involve rewriting the @nestjs/throttler logic and only requires changes in this package, please feel free to comment below.

EDIT: A solution has been implemented, see my next comment.

kkoomen added the enhancement, help wanted and good first issue labels Jan 4, 2023
kkoomen changed the title from "The service doesn't work properly when there are for example more than 5 requests in a second" to "Request flooding bypasses rate limit to some extent" Jan 4, 2023
kkoomen (Owner) commented Jan 5, 2023

Changes have been made in the @nestjs/throttler package. Once nestjs/throttler#1303 has been merged, I will adjust this package accordingly, merge those changes to master and release a new version.

kkoomen added a commit that referenced this issue Jan 5, 2023
…limits

These changes have been made based on the new PR changes from @nestjs/throttler#1304,
see: #1064
kkoomen removed the help wanted label Jan 5, 2023
kkoomen added a commit that referenced this issue Jan 5, 2023
…limits (fixes #1064)

These changes have been made based on the new PR changes from @nestjs/throttler#1304,
see: nestjs/throttler#1304
kkoomen unpinned this issue Jan 19, 2023
kkoomen (Owner) commented Jan 19, 2023

This has been released in version v0.3.0 on NPM. Special thanks to @zhorakaroy for discovering this nasty bug. This was a good improvement for the nestjs throttler core package as well as for this package.
