
Pull Request: Setup Redis for Distributed Rate Limiting & Caching (#171, #85) #116

Merged

AlAfiz merged 4 commits into BETAIL-BOYS:main from JerryIdoko:feature/redis-scaling-171-85 on Mar 26, 2026

Conversation

@JerryIdoko
Contributor

📝 Description
This PR transitions the TradeFlow-API from a single-instance architecture to a horizontally scalable, distributed system. By replacing local in-memory stores with Redis, we ensure that all API nodes share a "single source of truth" for rate-limiting counters and frequently accessed token prices.

This prevents "Limit Reset" bugs that occur when a load balancer rotates a user between different server instances.

🎯 Key Changes
Redis Client Integration: Created config/redis.js using ioredis with built-in retry strategies and error handling.

Distributed Rate Limiting: Refactored the express-rate-limit middleware to use rate-limit-redis. This centralizes request counting across the entire server cluster.

Env Configuration: Added REDIS_URL to the environment template to support local development (Docker/localhost) and production cloud clusters.

Performance Gains: Set the stage for high-speed caching of Stellar asset prices, reducing redundant database lookups and external API calls.

🌐 Scaling Architecture
By moving state out of Node.js process memory and into Redis, we achieve horizontal scalability: any number of API instances behind the load balancer share the same rate-limit counters and cached prices, so adding or restarting nodes never resets or forks that state.

✅ Acceptance Criteria Checklist
[x] Centralized State: Rate limits are now enforced globally across all server processes.

[x] Production Ready: ioredis is configured to handle connection drops without crashing the API.

[x] Consistency: Cached token prices are now synced across all instances.

[x] Non-Blocking: Redis operations are asynchronous, keeping their impact on request latency negligible.

🚀 How to Verify
Start Redis: Ensure you have a Redis instance running (e.g., docker run -p 6379:6379 redis).

Verify Connection: Start the API; you should see a "Redis Connected" log in the terminal.

Test Rate Limiting:

* Spam an endpoint until you hit the 429 "Too Many Requests" error.

* Restart the API server.

* Try again immediately; you should still be blocked, proving the limit is stored in Redis, not in local memory.
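The manual steps above can be scripted along these lines; the port (3000) and endpoint path are assumptions, so adjust them to your local setup:

```shell
# Hammer an endpoint until the limiter returns 429 "Too Many Requests".
for i in $(seq 1 120); do
  curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000/api/prices
done

# Restart the API process, then try once more:
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000/api/prices
# A 429 immediately after the restart shows the counter lives in Redis,
# not in the restarted process's memory.
```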

🔗 Linked Issues
Closes #171
Closes #85


drips-wave bot commented Mar 26, 2026

@JerryIdoko Great news! 🎉 Based on an automated assessment of this PR, the linked Wave issue(s) no longer count against your application limits.

You can now already apply to more issues while waiting for a review of this PR. Keep up the great work! 🚀

Learn more about application limits

@AlAfiz AlAfiz merged commit 073d7bc into BETAIL-BOYS:main on Mar 26, 2026
1 check failed


Development

Successfully merging this pull request may close these issues.

Issue 171: architecture: Setup Redis for high-speed rate limiting and distributed caching

2 participants