The ultimate transparent HTTP proxy for no-code platforms. It sits between your automations and expensive APIs, caching responses based on MD5 hashes so identical requests return instantly.
Get Started • Key Features • Usage & Examples • Configuration • Why This Slaps
FastAPI Transparent Proxy is the caching layer your no-code automations wish they had. Stop making the same API calls over and over. This proxy sits between your n8n/Make/Zapier workflows and expensive third-party APIs, returning cached responses for identical requests: saving bandwidth, reducing latency, and cutting your API bills.
| MD5 Deduplication | Sub-ms Response | Zero Config |
|---|---|---|
| Same request = same cache key | Cache hits are instant | Works without Redis too |
How it slaps:
- You: Point your n8n HTTP Request node at this proxy
- Proxy: Hashes the request, checks the cache, returns or forwards
- Result: The first call hits the API; the next 1000 identical calls return instantly
- Your wallet: 💰
Manually deduplicating API calls in no-code is a nightmare. This proxy makes other approaches look ancient.
We're not just forwarding requests. We're building deterministic cache keys from MD5 hashes of method + URL + headers + body, so identical business requests always hit the same cache entry, even across different workflow runs.
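The key construction above can be sketched in a few lines of Python. This is an illustrative reimplementation, not the repo's exact code: the function name, the header canonicalization (sorted items), and the JSON serialization choices are assumptions.

```python
import hashlib
import json

def cache_key(method, url, headers, body):
    """Build a deterministic MD5 cache key from the request parts.

    Headers are sorted so that dict ordering never changes the hash.
    (Illustrative; the real proxy may canonicalize differently.)
    """
    canonical = json.dumps(
        {"method": method.upper(), "url": url, "headers": sorted(headers.items())},
        separators=(",", ":"),
    ).encode() + body
    return hashlib.md5(canonical).hexdigest()

# Identical business requests -> identical key, regardless of header order.
k1 = cache_key("post", "https://expensive-api.com/data", {"B": "2", "A": "1"}, b'{"query":"foo"}')
k2 = cache_key("POST", "https://expensive-api.com/data", {"A": "1", "B": "2"}, b'{"query":"foo"}')
assert k1 == k2
```

Because the key depends only on the request's content, it survives restarts and is shared across workflow runs.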
| Platform | One-liner |
|---|---|
| Docker | `docker run -p 8000:8000 ghcr.io/yigitkonur/fastapi-proxy` |
| Python | `pip install -r requirements.txt && uvicorn main:app` |
| Railway/Render | Deploy from GitHub, set `REDIS_URL` env var |
```bash
# Clone and enter
git clone https://github.com/yigitkonur/fastapi-http-proxy-with-caching.git
cd fastapi-http-proxy-with-caching

# Set up a virtual environment
python3 -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate
pip install -r requirements.txt

# Run (works immediately, even without Redis!)
uvicorn main:app --host 0.0.0.0 --port 8000
```

**Zero Config:** The proxy starts in degraded mode without Redis: requests still work, just without caching. Add Redis when you're ready for the full experience.
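What "degraded mode" means in practice can be pictured with a tiny sketch. The class and method names here are hypothetical, not the repo's actual cache service:

```python
class CacheBackend:
    """Optional-Redis cache: every operation no-ops when no client is configured."""

    def __init__(self, client=None):
        self.client = client  # None => degraded mode, caching silently skipped

    def get(self, key):
        if self.client is None:
            return None  # always a cache miss: the proxy just forwards upstream
        return self.client.get(key)

    def set(self, key, value, ttl=3600):
        if self.client is None:
            return  # skip caching, but the request itself still succeeds
        self.client.set(key, value, ex=ttl)

# No Redis configured: gets miss, sets no-op, nothing raises.
cache = CacheBackend()
cache.set("proxy:cache:abc", b"response")
assert cache.get("proxy:cache:abc") is None
```

The point of this pattern is that the proxy never fails a request just because the cache is down; it only loses the speedup.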
```bash
# Instead of calling the API directly...
curl -X POST "https://expensive-api.com/data" -d '{"query": "foo"}'

# Route through the proxy:
curl -X POST "http://localhost:8000/proxy?url=https://expensive-api.com/data" \
  -H "Content-Type: application/json" \
  -d '{"query": "foo"}'
```

```json
{
  "success": true,
  "cached": true,
  "cache_key": "a1b2c3d4e5f67890",
  "status_code": 200,
  "data": { "your": "api response" }
}
```

The `cached: true` field means you just saved an API call.
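Note that the upstream response arrives wrapped in this envelope, so downstream steps should read the `data` field rather than the top-level body. A quick sketch of unwrapping it, using the sample payload shown above:

```python
import json

# The envelope exactly as the proxy returns it (sample from above).
raw = """{
  "success": true,
  "cached": true,
  "cache_key": "a1b2c3d4e5f67890",
  "status_code": 200,
  "data": { "your": "api response" }
}"""

envelope = json.loads(raw)
if envelope["cached"]:
    pass  # served from cache: no upstream API call was made
payload = envelope["data"]  # the actual upstream response body
assert payload == {"your": "api response"}
```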
1. Add an HTTP Request node
2. Set the URL to `http://your-proxy:8000/proxy?url=https://actual-api.com/endpoint`
3. Configure method, headers, and body as normal
4. Every identical request now returns from cache
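If you're calling the proxy from a script instead of a no-code node, the same URL shape applies. A small helper for building proxied URLs; the helper itself is hypothetical, only the `/proxy` endpoint and its `url`, `bypass_cache`, and `cache_ttl` query parameters come from this README:

```python
from urllib.parse import urlencode

PROXY = "http://localhost:8000/proxy"  # adjust to your deployment

def proxied_url(target, bypass_cache=False, cache_ttl=None):
    """Return the proxy URL that forwards to (and caches) `target`."""
    params = {"url": target}
    if bypass_cache:
        params["bypass_cache"] = "true"
    if cache_ttl is not None:
        params["cache_ttl"] = str(cache_ttl)
    return PROXY + "?" + urlencode(params)

print(proxied_url("https://actual-api.com/endpoint", cache_ttl=7200))
```

`urlencode` percent-encodes the target URL, which keeps its own query string from colliding with the proxy's parameters.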
```bash
# Force a fresh request (bypass the cache)
curl "http://localhost:8000/proxy?url=https://api.com/data&bypass_cache=true"

# Custom cache TTL (2 hours instead of the default 1 hour)
curl "http://localhost:8000/proxy?url=https://api.com/data&cache_ttl=7200"
```

```bash
# Health check (great for load balancers)
curl http://localhost:8000/health
# → {"status": "healthy", "redis_connected": true, "version": "2.0.0"}

# Cache statistics
curl http://localhost:8000/cache/stats
# → {"total_keys": 1547, "memory_usage": "2.3M", "prefix": "proxy:cache:"}

# Nuclear option: clear all cache
curl -X DELETE http://localhost:8000/cache
# → {"deleted": 1547, "message": "Cleared 1547 cached entries"}
```

| Feature | What It Does | Why You Care |
|---|---|---|
| **MD5 Hashing** (deterministic keys) | Hashes method + URL + headers + body into the cache key | Identical requests always return the same cached response |
| **Graceful Degradation** (no Redis? no problem) | Starts without Redis, just skips caching | Deploy anywhere, add Redis later |
| **All HTTP Methods** (not just POST) | GET, POST, PUT, DELETE, PATCH all supported | Works with any API pattern |
| **Flexible TTL** (per-request control) | Default 1 hour, override per request | Cache static data longer, dynamic data shorter |
| **Cache Bypass** (when you need fresh) | `bypass_cache=true` skips the cache | Force a refresh when needed |
| **Health Checks** (production ready) | `/health` endpoint with Redis status | Perfect for k8s liveness probes |
| **Legacy Support** (drop-in replacement) | `/webhook-test/post-response` still works | Migrate existing workflows gradually |
All settings are read from environment variables. Copy `.env.example` to `.env`:

```bash
cp .env.example .env
```

| Variable | Default | Description |
|---|---|---|
| `REDIS_URL` | `redis://localhost:6379/0` | Redis connection string (or Upstash URL) |
| `CACHE_TTL_SECONDS` | `3600` | Default cache lifetime (1 hour) |
| `CACHE_PREFIX` | `proxy:cache:` | Redis key prefix |
| `PROXY_TIMEOUT_SECONDS` | `30` | Timeout for proxied requests |
| `DEBUG` | `false` | Enable verbose logging |
Upstash is perfect for this: pay only for what you use.

1. Create a database at console.upstash.com
2. Copy your Redis URL
3. Set it in `.env`:

```bash
REDIS_URL=redis://default:YOUR_PASSWORD@YOUR_ENDPOINT.upstash.io:6379
```
Cost: ~$0.20 per 100K cached requests. If you're making 1M duplicate calls/month, that's $2 vs whatever you're paying now.
```
├── main.py              # Entry point (thin wrapper)
├── app/
│   ├── __init__.py      # Package metadata
│   ├── main.py          # FastAPI app factory + lifespan
│   ├── config.py        # Pydantic settings from env
│   ├── models.py        # Request/response schemas
│   ├── dependencies.py  # Service injection
│   ├── services/
│   │   ├── cache.py     # Redis + MD5 hashing logic
│   │   └── proxy.py     # HTTP forwarding logic
│   └── routes/
│       ├── proxy.py     # /proxy endpoint
│       └── health.py    # /health, /cache/stats
├── requirements.txt     # Pinned dependencies
├── Dockerfile           # Multi-stage production build
├── .env.example         # Configuration template
└── README.md
```
```bash
# Build
docker build -t fastapi-proxy .

# Run (without Redis - degraded mode)
docker run -p 8000:8000 fastapi-proxy

# Run with Redis
docker run -p 8000:8000 -e REDIS_URL=redis://host:6379 fastapi-proxy
```

```yaml
version: '3.8'
services:
  proxy:
    build: .
    ports:
      - "8000:8000"
    environment:
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - redis
  redis:
    image: redis:alpine
    volumes:
      - redis_data:/data
volumes:
  redis_data:
```

```ini
[Unit]
Description=FastAPI Transparent Proxy
After=network.target

[Service]
User=www-data
WorkingDirectory=/opt/fastapi-proxy
Environment="PATH=/opt/fastapi-proxy/venv/bin"
ExecStart=/opt/fastapi-proxy/venv/bin/uvicorn main:app --host 0.0.0.0 --port 8000
Restart=always

[Install]
WantedBy=multi-user.target
```

Troubleshooting tips:
| Problem | Solution |
|---|---|
| "Redis unavailable" warning | Expected without Redis. The proxy still works, just without caching. Add `REDIS_URL` when ready. |
| Cache not working | Check for `redis_connected: true` in `/health`. Verify your `REDIS_URL` is correct. |
| Timeout errors | Increase `PROXY_TIMEOUT_SECONDS`. Some APIs are slow. |
| Cache key collisions | Shouldn't happen: MD5 hashing is deterministic. If you're seeing wrong cached responses, check whether you're modifying headers unintentionally. |
| High memory usage | Lower `CACHE_TTL_SECONDS`, or clear the cache with the `DELETE /cache` endpoint. |
```bash
# Clone
git clone https://github.com/yigitkonur/fastapi-http-proxy-with-caching.git
cd fastapi-http-proxy-with-caching

# Setup
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt

# Run with hot reload
uvicorn main:app --reload

# Run tests (coming soon)
pytest
```

Built with 🔥 because paying for duplicate API calls is a soul-crushing waste of money.

MIT © Yiğit Konur