
Rate limit based on API Key tier #6

Open
murtuza-kolasavala opened this issue Nov 23, 2022 · 1 comment
Labels
enhancement New feature or request

Comments

@murtuza-kolasavala

Hi, I am trying to get this module to work with API-key-based rate limiting. We want to rate limit each API key according to its tier: some API keys should be limited to 50 RPS and others to 100 RPS. Our rate-limiting implementation is below.

map $ms_apikey $ratelimit_tier10000 {
    include /etc/nginx/ratelimit_tier10000.map;
    default '';
}

map $ms_apikey $ratelimit_tier1000 {
    include /etc/nginx/ratelimit_tier1000.map;
    default '';
}

map $ms_apikey $ratelimit_tier500 {
    include /etc/nginx/ratelimit_tier500.map;
    default '';
}

map $ms_apikey $ratelimit_tier200 {
    include /etc/nginx/ratelimit_tier200.map;
    default '';
}

map $ms_apikey $ratelimit_tier100 {
    include /etc/nginx/ratelimit_tier100.map;
    default '';
}

map $ms_apikey $ratelimit_tier50 {
    include /etc/nginx/ratelimit_tier50.map;
    default '';
}

map $ratelimit_tier10000$ratelimit_tier1000$ratelimit_tier500$ratelimit_tier50$ratelimit_tier200$ratelimit_tier100 $ratelimit_default {
    ''      $ms_apikey;
    default '';
}
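For reference, each included file uses the usual nginx `map` key/value syntax. A hypothetical `/etc/nginx/ratelimit_tier50.map` (the keys below are placeholders, not real keys) could map each API key in the tier to itself, so the tier variable is non-empty only when the key matches:

```nginx
# /etc/nginx/ratelimit_tier50.map -- illustrative contents; keys are placeholders.
# Each matched API key maps to itself, so $ratelimit_tier50 holds the key
# for tier-50 clients and stays '' (the map default) for everyone else.
"api-key-tier50-example-1" "api-key-tier50-example-1";
"api-key-tier50-example-2" "api-key-tier50-example-2";
```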

Here we store API keys in files according to their tiers and map them in the nginx.conf file. This mapping works well with the standard rate-limiting module. The location block implementation looks like this:

location /somepath {
    rate_limit $ratelimit_default requests=5 period=1s burst=5;
    rate_limit $ratelimit_tier50 requests=50 period=1s burst=50;
    rate_limit $ratelimit_tier100 requests=100 period=1s burst=110;
    rate_limit_pass redis;
    proxy_pass http://www.example.com;
}

I have stored an API key in the /etc/nginx/ratelimit_tier50.map file, so that key should be limited to 50 RPS. However, when running a load test at 150 RPS, requests are not blocked, nor do I see any activity in Redis. Please guide me if I am doing something wrong in the implementation.

@kleisauke
Member

kleisauke commented Nov 28, 2022

Hi @murtuza-kolasavala,

It is not possible to define multiple rate_limit directives in the same block; if you do, each directive simply overrides the previous one, so only the last one takes effect.
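To illustrate with the directives from your own location block (a minimal sketch), only the last declared rate_limit applies:

```nginx
location /somepath {
    # Both directives are declared, but the second overrides the first,
    # so every request is evaluated against the tier-100 limit only:
    rate_limit $ratelimit_tier50  requests=50  period=1s burst=50;   # overridden
    rate_limit $ratelimit_tier100 requests=100 period=1s burst=110;  # effective
    rate_limit_pass redis;
    proxy_pass http://www.example.com;
}
```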

For this use-case, you could use the following approach:

Details
upstream local {
    server 127.0.0.1:80;

    # cache up to 200 idle keepalive connections to this upstream
    keepalive 200;
}

upstream redis {
    server unix:/var/run/redis/redis.sock;
}

server {
    listen 80;

    server_name auth_backend.local;

    allow 127.0.0.1;
    allow ::1;
    deny all;

    # Note: this is for demonstration only! It should be handled in a dedicated auth server instead
    # https://www.nginx.com/resources/wiki/start/topics/depth/ifisevil/
    location / {
        if ($http_x_api_key = '') {
            return 401; # unauthorized
        }

        # Generated with:
        # openssl rand -base64 18
        if ($http_x_api_key = 'uBE59IZPTp0lbrCMZ9pYLRQZ') {
            add_header X-Route 'tier_1' always;
            return 200;
        }

        if ($http_x_api_key = 'EgWqmoZzRHdUDQ0p6tDfvcAe') {
            add_header X-Route 'tier_2' always;
            return 200;
        }

        return 403; # forbidden
    }
}

server {
    listen 80;

    server_name server.local;

    allow 127.0.0.1;
    allow ::1;
    deny all;

    rate_limit_headers on;

    location /tier_1 {
        rate_limit $http_x_api_key requests=50 period=1s burst=50;
        rate_limit_pass redis;
    }

    location /tier_2 {
        rate_limit $http_x_api_key requests=100 period=1s burst=110;
        rate_limit_pass redis;
    }
}

server {
    listen 80;
    listen [::]:80;

    server_name localhost;

    location / {
        # Any request to this block will first be sent to this URL
        auth_request /_auth;

        # The auth handler sets this header as a way to specify the route
        auth_request_set $route $upstream_http_x_route;

        proxy_pass http://local/$route$uri;
        proxy_set_header Host server.local;
        proxy_set_header X-Route $route;
        proxy_set_header X-Original-Host $http_host;
        proxy_set_header X-Original-Scheme $scheme;
        proxy_set_header X-Forwarded-For $remote_addr;
    }

    # API key validation
    location /_auth {
        internal;

        proxy_pass http://local;
        proxy_set_header Host auth_backend.local;
        proxy_set_header Content-Length '';
        proxy_set_header X-Original-Host $http_host;
        proxy_set_header X-Original-Scheme $scheme;
        proxy_set_header X-Original-URI $request_uri;
        proxy_set_header X-Forwarded-For $remote_addr;

        proxy_pass_request_body off; # no need to send the POST body
    }
}

(Note that this uses the auth_request module, which is not enabled by default; nginx must be compiled with the --with-http_auth_request_module flag. Running nginx -V and checking the configure arguments will show whether it is present.)

For example:

# Sanity-check the Redis module
$ for run in {1..5}; do redis-cli RATER.LIMIT api_key 110 100 1 | sed -n '3~5p'; done
110
109
108
107
106

# Sanity-check 401
$ curl -s -o /dev/null -w "%{http_code}\n" localhost
401

# Sanity-check 403
$ curl -s -H "X-API-Key: foo" -o /dev/null -w "%{http_code}\n" localhost
403

# Check API tier
$ curl -s -I -H "X-API-Key: uBE59IZPTp0lbrCMZ9pYLRQZ" localhost | grep "X-RateLimit-Remaining"
X-RateLimit-Remaining: 50
$ curl -s -I -H "X-API-Key: EgWqmoZzRHdUDQ0p6tDfvcAe" localhost | grep "X-RateLimit-Remaining"
X-RateLimit-Remaining: 110

# Basic load testing
$ ab -n 200 -c 10 -k -H "X-API-Key: uBE59IZPTp0lbrCMZ9pYLRQZ" localhost/load_test | grep "Failed requests"
Completed 100 requests
Completed 200 requests
Finished 200 requests
Failed requests:        149

I'll look into supporting variables in the requests=, period= and burst= parameters, which would make this possible:

map $http_x_api_key $requests {
  default                  50;
  EgWqmoZzRHdUDQ0p6tDfvcAe 100;
}

map $http_x_api_key $burst {
  default                  50;
  EgWqmoZzRHdUDQ0p6tDfvcAe 110;
}

[...]

rate_limit $http_x_api_key requests=$requests period=1s burst=$burst;
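Under that proposal (not yet implemented; the location and upstream below are taken from the question for illustration), the maps and the directive would combine into a single location without needing auth_request routing:

```nginx
# Hypothetical usage of the proposed variable support -- this does NOT work today.
location /somepath {
    # $requests and $burst resolve per API key via the maps above;
    # unknown keys fall back to the map defaults (50/50).
    rate_limit $http_x_api_key requests=$requests period=1s burst=$burst;
    rate_limit_pass redis;
    proxy_pass http://www.example.com;
}
```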

Let's tag this as an enhancement.

@kleisauke kleisauke added the enhancement New feature or request label Nov 28, 2022