Rate limit based on API Key tier #6
Comments
It is not possible to define multiple rate limits for the same location. For this use-case, you could use the following approach:

upstream local {
    server 127.0.0.1:80;
    # a pool with at most 200 connections
    keepalive 200;
}

upstream redis {
    server unix:/var/run/redis/redis.sock;
}

# Internal auth backend: validates the API key and reports the tier via the X-Route header
server {
    listen 80;
    server_name auth_backend.local;
    allow 127.0.0.1;
    allow ::1;
    deny all;

    # Note: this is for demonstration only! It should be handled in a dedicated auth server instead
    # https://www.nginx.com/resources/wiki/start/topics/depth/ifisevil/
    location / {
        if ($http_x_api_key = '') {
            return 401; # unauthorized
        }

        # Generated with:
        # openssl rand -base64 18
        if ($http_x_api_key = 'uBE59IZPTp0lbrCMZ9pYLRQZ') {
            add_header X-Route 'tier_1' always;
            return 200;
        }
        if ($http_x_api_key = 'EgWqmoZzRHdUDQ0p6tDfvcAe') {
            add_header X-Route 'tier_2' always;
            return 200;
        }

        return 403; # forbidden
    }
}

# Backend with one location per tier, each with its own rate limit checked against Redis
server {
    listen 80;
    server_name server.local;
    allow 127.0.0.1;
    allow ::1;
    deny all;

    rate_limit_headers on;

    location /tier_1 {
        rate_limit $http_x_api_key requests=50 period=1s burst=50;
        rate_limit_pass redis;
    }

    location /tier_2 {
        rate_limit $http_x_api_key requests=100 period=1s burst=110;
        rate_limit_pass redis;
    }
}

# Public entry point: authenticates the key via auth_request, then proxies to the matching tier location
server {
    listen 80;
    listen [::]:80;
    server_name localhost;

    location / {
        # Any request to this block will first be sent to this URL
        auth_request /_auth;
        # The auth handler sets this header as a way to specify the route
        auth_request_set $route $upstream_http_x_route;

        proxy_pass http://local/$route$uri;
        proxy_set_header Host server.local;
        proxy_set_header X-Route $route;
        proxy_set_header X-Original-Host $http_host;
        proxy_set_header X-Original-Scheme $scheme;
        proxy_set_header X-Forwarded-For $remote_addr;
    }

    # API key validation
    location /_auth {
        internal;
        proxy_pass http://local;
        proxy_set_header Host auth_backend.local;
        proxy_set_header Content-Length '';
        proxy_set_header X-Original-Host $http_host;
        proxy_set_header X-Original-Scheme $scheme;
        proxy_set_header X-Original-URI $request_uri;
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_pass_request_body off; # no need to send the POST body
    }
}

(Note that this uses the auth_request module.)

For example:

# Sanity-check Redis module
$ for run in {1..5}; do redis-cli RATER.LIMIT api_key 110 100 1 | sed -n '3~5p'; done
110
109
108
107
106
# Sanity-check 401
$ curl -s -o /dev/null -w "%{http_code}\n" localhost
401
# Sanity-check 403
$ curl -s -H "X-API-Key: foo" -o /dev/null -w "%{http_code}\n" localhost
403
# Check API tier
$ curl -s -I -H "X-API-Key: uBE59IZPTp0lbrCMZ9pYLRQZ" localhost | grep "X-RateLimit-Remaining"
X-RateLimit-Remaining: 50
$ curl -s -I -H "X-API-Key: EgWqmoZzRHdUDQ0p6tDfvcAe" localhost | grep "X-RateLimit-Remaining"
X-RateLimit-Remaining: 110
# Basic load testing
$ ab -n 200 -c 10 -k -H "X-API-Key: uBE59IZPTp0lbrCMZ9pYLRQZ" localhost/load_test | grep "Failed requests"
Completed 100 requests
Completed 200 requests
Finished 200 requests
Failed requests: 149

I'll look into supporting variables in the requests and burst parameters, so that a configuration like the following could be used:

map $http_x_api_key $requests {
    default 50;
    EgWqmoZzRHdUDQ0p6tDfvcAe 100;
}

map $http_x_api_key $burst {
    default 50;
    EgWqmoZzRHdUDQ0p6tDfvcAe 110;
}

[...]
rate_limit $http_x_api_key requests=$requests period=1s burst=$burst;

Let's tag this as an enhancement.
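If that enhancement lands, the two per-tier locations from the example above could presumably collapse into a single location. A minimal sketch of the intended usage, assuming the directive syntax stays as it is today (this does not work with the current release):

server {
    listen 80;
    server_name server.local;

    rate_limit_headers on;

    location / {
        # requests= and burst= would be resolved per request from the map blocks above,
        # keyed on the client's API key
        rate_limit $http_x_api_key requests=$requests period=1s burst=$burst;
        rate_limit_pass redis;
    }
}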
Hi, I am trying to get this module to work with API-key-based rate limiting. We want to rate limit API keys based on their tier: some API keys need a 50 RPS limit and some need 100 RPS. Our implementation of rate limiting is below.
Here we store the API keys in files based on their tier and map them in the nginx.conf file. This mapping works well with the default rate-limiting module. The implementation in the location block is as below.
I have stored the API key in the /etc/nginx/ratelimit_tier50.map file, so that key should get a 50 RPS rate limit. While running a load test at 150 RPS, nothing is being blocked, nor do I see any activity in Redis. Please guide me if I am doing something wrong in the implementation.
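For reference, the map-plus-include tier setup described above, combined with the stock limit_req module, typically looks something like the following. This is only a sketch under assumed names (the /api/ location, zone name, backend upstream, and the contents of the included map file are illustrative, not necessarily the poster's actual configuration):

# http context
# /etc/nginx/ratelimit_tier50.map would contain lines such as:
#   "some-api-key" "$http_x_api_key";
# so keys listed in the file map to themselves and everything else maps to "".
map $http_x_api_key $tier50_key {
    default "";
    include /etc/nginx/ratelimit_tier50.map;
}

# Requests with an empty key are not accounted, so only tier-50 keys hit this zone.
limit_req_zone $tier50_key zone=tier50:10m rate=50r/s;

server {
    listen 80;

    location /api/ {
        limit_req zone=tier50 burst=50 nodelay;
        proxy_pass http://backend; # hypothetical upstream
    }
}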