API Key Rotator is a powerful and flexible API key management and request proxy solution built with Go (Gin). It is designed to centralize the management of all your third-party API keys and provide automatic rotation, load balancing, and secure isolation through a unified proxy endpoint.
Whether you need to provide high availability for traditional RESTful APIs or a unified, SDK-compatible access point for large model APIs like OpenAI, this project offers an elegant and scalable solution.
The project includes a high-performance Go backend and a simple, easy-to-use Vue 3 admin panel, with "one-click" deployment via Docker Compose.
- Centralized Key Management: Manage API key pools for all services in a unified web interface.
- Dynamic Key Rotation: Atomic rotation based on Redis to effectively distribute API request quotas.
- Type-Safe Proxies:
  - Generic API Proxy (`/proxy`): Provides proxy services for any RESTful API.
  - LLM API Proxy (`/llm`): Offers native streaming support and an SDK-friendly `base_url` for large model APIs compatible with OpenAI's format. Supported providers include OpenAI, Gemini, Anthropic, etc.
- Highly Extensible Architecture: The backend uses an adapter pattern, making it easy to extend support for new types of proxy services in the future.
- Secure Isolation: All proxy requests are authenticated with global proxy keys (multiple keys supported), so the real backend keys are never exposed to clients.
- Dockerized Deployment: Provides a complete Docker Compose configuration for one-click startup of the backend, frontend, database, and Redis.
This project is fully containerized, and it is recommended to use Docker Compose for one-click deployment and development.
Ensure that Docker and Docker Compose are installed on your system.
After cloning the project, create a .env file from the .env.example template in the project root directory.
```bash
# Copy the configuration file template
cp .env.example .env
```

Then, edit the `.env` file according to your needs, at least setting sensitive information such as the database password and administrator password.
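For reference, a minimal `.env` sketch is shown below. Only the two variables referenced elsewhere in this README appear here, their values are placeholders, and the remaining settings (database password, administrator password, etc.) follow whatever names `.env.example` defines.

```bash
# Minimal .env sketch — values are placeholders; see .env.example for the full variable list.
GLOBAL_PROXY_KEYS=change_me_key1,change_me_key2   # proxy authentication key(s), comma-separated
PROXY_PUBLIC_BASE_URL=http://localhost:8000       # assumed public address of the proxy, used in the client examples below
```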
This project uses the `GLOBAL_PROXY_KEYS` environment variable to configure proxy authentication keys, supporting a single key or multiple keys:

- Single Key:

  ```bash
  GLOBAL_PROXY_KEYS=your_secret_key
  ```

- Multiple Keys (recommended for multi-client scenarios):

  ```bash
  GLOBAL_PROXY_KEYS=key1,key2,key3
  ```

The multiple keys feature allows you to assign different authentication keys to different clients or services, improving security and management flexibility.
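For example, two clients can each be given their own key from the list. The sketch below uses the Python `requests` library and the `X-Proxy-Key` header from the Generic API Proxy example later in this README; the `weather` service slug is assumed to be configured in the admin panel.

```python
import requests

PROXY_BASE = "http://PROXY_PUBLIC_BASE_URL"

# Client A authenticates with its own key from GLOBAL_PROXY_KEYS...
resp_a = requests.get(
    f"{PROXY_BASE}/proxy/weather/current",
    params={"query": "London"},
    headers={"X-Proxy-Key": "key1"},
)

# ...while client B uses a different key, so each client's access can be managed independently.
resp_b = requests.get(
    f"{PROXY_BASE}/proxy/weather/current",
    params={"query": "Paris"},
    headers={"X-Proxy-Key": "key2"},
)

print(resp_a.status_code, resp_b.status_code)
```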
We provide standard Docker Compose configurations for development and production environments.
Development Environment

```bash
# Start with the development environment configuration
docker-compose up --build -d
```

Production Environment

```bash
# Start with the production environment configuration
docker-compose -f docker-compose.prod.yml up --build -d
```

Development Environment (with Vite and Hot Reload):

- Frontend Dev Server: `http://localhost:5173`
- Backend API Root: `http://localhost:8000/`
Production Environment (with Nginx):
- Web Application (Frontend + Backend API): `http://localhost` (or `http://localhost:80`, depending on your `.env` configuration)
If you prefer to run and debug the source code directly on your local machine without using Docker, you can follow these steps.
- Enter the Go backend directory

  ```bash
  cd backend/
  ```

- Install dependencies

  ```bash
  go mod download
  ```

- Configure environment variables: Create a `.env` file in the project root (refer to `.env.example`) and configure the connection information for the database and Redis.

- Start the backend server

  ```bash
  go run main.go
  ```

  The service will run at `http://127.0.0.1:8000`.
- Enter the frontend directory (in another terminal)

  ```bash
  cd frontend/
  ```

- Install dependencies

  ```bash
  npm install
  ```

- Start the frontend server

  ```bash
  npm run dev
  ```

  Vite will automatically handle API proxying. The service will run at `http://localhost:5173`.

Now, you can access the admin panel at `http://localhost:5173`.
Using the `openai` Python SDK as an example, combined with an OpenRouter model, you can use the proxy service by modifying the `base_url`.

```python
from openai import OpenAI

client = OpenAI(
    # Format: http://<PROXY_PUBLIC_BASE_URL>/llm/<Service Slug>
    base_url="http://PROXY_PUBLIC_BASE_URL/llm/openrouter-api",
    api_key="<GLOBAL_PROXY_KEY>",
)

completion = client.chat.completions.create(
    # Please refer to the specific provider's documentation for model names
    model="openai/gpt-4o",
    messages=[
        {
            "role": "user",
            "content": "What is the meaning of life?"
        }
    ]
)

print(completion.choices[0].message.content)
```

Where `PROXY_PUBLIC_BASE_URL` and `GLOBAL_PROXY_KEY` are the environment variables you configured in your `.env` file.
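Since the LLM proxy offers native streaming support, the same client can also consume streamed responses. A minimal sketch, reusing the assumed `openrouter-api` slug and model name from above:

```python
from openai import OpenAI

client = OpenAI(
    base_url="http://PROXY_PUBLIC_BASE_URL/llm/openrouter-api",
    api_key="<GLOBAL_PROXY_KEY>",
)

# Request a streamed response; the proxy passes the provider's stream through to the client.
stream = client.chat.completions.create(
    model="openai/gpt-4o",
    messages=[{"role": "user", "content": "Write one sentence about key rotation."}],
    stream=True,
)

# Print tokens as they arrive.
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```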
The Generic API Proxy can be used with any RESTful API. Here's an example of calling a weather API using the Python `requests` library:

```python
import requests

# Configure proxy parameters
proxy_url = "http://PROXY_PUBLIC_BASE_URL/proxy/weather/current"
proxy_key = "<GLOBAL_PROXY_KEY>"

# Query parameters
params = {
    "query": "London"
    # When forwarding the request to the target API, the system rotates through the real
    # API keys configured in the backend and appends the selected key as the
    # authorization parameter (access_key, as configured in the backend).
}

# Set headers
headers = {
    "X-Proxy-Key": proxy_key
}

# Make the request
response = requests.get(proxy_url, params=params, headers=headers)

# Handle the response
if response.status_code == 200:
    data = response.json()
    print(f"Weather information: {data}")
else:
    print(f"Request failed with status code: {response.status_code}")
```

In this example:

- `weather` is the service slug configured in the admin panel
- `current` is the path of the target API endpoint
- `PROXY_PUBLIC_BASE_URL` is your proxy service address
- `<GLOBAL_PROXY_KEY>` is one of the global proxy keys you configured
The proxy will automatically forward the request to the configured target URL, appending the path and query parameters to the target address.
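To make the forwarding concrete, here is a rough sketch of the request mapping for the example above; the upstream base URL and the `access_key` parameter name are whatever you configured for the `weather` service, so treat them as placeholders:

```text
Client request (what your code sends):
  GET http://PROXY_PUBLIC_BASE_URL/proxy/weather/current?query=London
  X-Proxy-Key: <GLOBAL_PROXY_KEY>

Forwarded upstream request (roughly):
  GET <configured target URL>/current?query=London&access_key=<next real key from the pool>
```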
If you want to dive deeper into the code, please refer to the following documents: