
Cortex


AI Provider Gateway — A unified API gateway to multiple LLM providers.

Provides OpenAI- and Anthropic-compatible APIs with intelligent routing, usage tracking, API key management, rate limiting, and an embedded React admin UI. Built with Rust and serdesAI.

WebMCP Support

Cortex is agent-native! All admin operations are callable via WebMCP tools.

Enable WebMCP in your browser:

localStorage.setItem('cortex_webmcp_enabled', 'true')

Use tools from the console:

const client = window.webmcp.context.createClient()
const providers = await client.executeTool('cortex_list_providers', {})

See the WebMCP Documentation for complete details.

Quick Start

Local Development

cargo run --release -- --port 8090

Docker

docker compose up -d
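
The repository ships its own docker-compose.yml; as a rough sketch of what a minimal service definition might look like (the volume layout, environment values, and key are illustrative assumptions, not the actual file):

```yaml
# Hypothetical sketch -- see the repository's docker-compose.yml for the real file.
services:
  cortex:
    build: .
    ports:
      - "8090:8090"
    environment:
      CORTEX_PORT: "8090"
      CORTEX_DB: /data/cortex.db
      CORTEX_ENCRYPTION_KEY: change-me   # use a real secret in production
    volumes:
      - cortex-data:/data
volumes:
  cortex-data:
```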

Verify

curl http://localhost:8090/api/health

Supported Providers

  • OpenAI — GPT-4, GPT-3.5, etc.
  • Anthropic — Claude 3.5, Claude 3, etc.
  • Google — Gemini models
  • Groq — Fast inference
  • Mistral — Mistral/Mixtral models
  • Ollama — Local models
  • Azure OpenAI — Azure-hosted OpenAI
  • AWS Bedrock — AWS-hosted models

API Endpoints

Endpoint                          Description
GET      /api/health              Health check
POST     /v1/chat/completions     OpenAI-compatible chat API
POST     /anthropic/v1/messages   Anthropic-compatible messages API
GET/POST /api/providers           Provider management
GET/POST /api/keys                API key management
GET/POST /api/aliases             Model alias management
GET      /api/usage               Usage tracking & analytics
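
As an illustration of calling the OpenAI-compatible endpoint, here is a hedged sketch; the model name, bearer key, and port are placeholders that depend on your configured providers and keys:

```shell
# Request body for the OpenAI-compatible chat endpoint.
BODY='{
  "model": "gpt-4",
  "messages": [{"role": "user", "content": "Hello!"}]
}'

# Send it to a locally running gateway (key created via /api/keys).
curl -s http://localhost:8090/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $CORTEX_API_KEY" \
  -d "$BODY"
```

The Anthropic-compatible `/anthropic/v1/messages` endpoint accepts requests in the same way, using Anthropic's message format instead.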

Configuration

Variable                Description                                            Default
CORTEX_PORT             Server port                                            8090
CORTEX_DB               SQLite database path                                   cortex.db
CORTEX_ENCRYPTION_KEY   Encryption key for sensitive data (API keys, tokens)   cortex-dev-key
CORTEX_CONFIG           Path to YAML config file                               (none)
RUST_LOG                Log level filter                                       cortex_server=info,tower_http=info
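
In shell form, a production-style environment might look like the following; the database path and key-generation command are illustrative, not prescribed by the project:

```shell
export CORTEX_PORT=8090
export CORTEX_DB=/var/lib/cortex/cortex.db
# Replace the dev default with a real secret in production.
export CORTEX_ENCRYPTION_KEY="$(openssl rand -hex 32)"
export RUST_LOG="cortex_server=info,tower_http=info"
```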

CLI Arguments

cortex [OPTIONS]

Options:
  -p, --port <PORT>                  Server port (overrides config)
      --db <DB>                      SQLite database path [default: cortex.db]
      --encryption-key <KEY>         Encryption key [env: CORTEX_ENCRYPTION_KEY]
  -c, --config <FILE>                YAML config file path
  -h, --help                         Print help
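
For example, overriding the port and pointing at a config file (the paths are placeholders; per the help text above, the CLI port flag takes precedence over the config file's value):

```shell
cortex --port 9090 --db /var/lib/cortex/cortex.db --config config/cortex.yaml
```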

Project Structure

Cortex/
├── Cargo.toml              # Workspace root
├── Cargo.lock
├── cortex-server/          # Main server crate
│   ├── Cargo.toml
│   ├── src/
│   │   ├── main.rs           # Entrypoint (CLI parsing, server start)
│   │   ├── lib.rs            # Library root
│   │   ├── config/           # Configuration loading (YAML, env, CLI)
│   │   ├── db/               # Database layer (SQLite, migrations)
│   │   ├── dispatch/         # Request routing & load balancing
│   │   ├── extensions/       # OpenAI/Anthropic API compatibility
│   │   ├── handlers/         # HTTP handlers (health, metrics, static)
│   │   ├── logging/          # Structured logging setup
│   │   ├── middleware/       # Auth, CORS, rate limiting, metrics
│   │   ├── providers/        # Provider registry & factory
│   │   ├── routes/           # Router assembly & API endpoints
│   │   └── state.rs          # Application state
│   └── tests/                # Integration tests
├── web/                     # React admin frontend
│   ├── src/
│   └── e2e/
├── Dockerfile               # Multi-stage Docker build
├── docker-compose.yml
├── .github/workflows/
├── docs/
├── examples/
└── config/

See .env.example for environment variable reference.
