Skip to content

Gamingstein/Project-Aether

Project Aether: The Sentient MCP-Discord Ecosystem

CI Python 3.12+ Tests Ruff Docker MCP LangGraph License: MIT

Project Aether is a production-grade, event-driven AI ecosystem for Discord β€” a portfolio showcase of modern AI architecture, software engineering, and observability practices.

The system is modelled as a biological organism: each microservice is an "organ" with a distinct function, communicating through a central nervous system powered by Redis Streams. The Brain reasons through a compiled LangGraph StateGraph, auto-discovers tools from an MCP server via the Streamable HTTP protocol, tracks mood through a sentiment engine, and retrieves conversation context from an async ChromaDB vector store β€” all observable through Prometheus metrics and Grafana dashboards.


Highlights

Category Details
AI Reasoning Multi-node LangGraph StateGraph with 8 discrete nodes, tool-calling loop, and intent-based routing
Tool Protocol Real MCP (Model Context Protocol) via Streamable HTTP β€” auto-discovery, zero Brain changes to add tools
Vector Memory Async ChromaDB with tiktoken-based token windowing for RAG context budgeting
Observability Prometheus metrics (counters, histograms, gauges) + Grafana dashboard (20 panels) + structlog JSON logging with trace-ID correlation
Security Prompt injection protection (10 pattern families), sliding-window rate limiting, MCP bearer token auth
Database Alembic-driven PostgreSQL migrations with async engine support
Testing 599 tests (unit + integration), ruff lint/format, GitHub Actions CI
Architecture 5 microservices, Redis Streams event bus, Docker Compose orchestration

System Design Overview

One sentence: A user talks to Discord β†’ the Gateway publishes an event β†’ the Brain reasons through an 8-node LangGraph β†’ MCP tools execute actions β†’ the Cerebellum adds human-like typing delay β†’ the Gateway delivers the reply.

flowchart LR
    User([fa:fa-user User])
    Discord([fa:fa-brands-discord Discord])

    subgraph Aether["Project Aether"]
        direction LR
        GW[Gateway]
        Brain[Brain<br/><small>LangGraph</small>]
        MCP[MCP Server<br/><small>FastMCP</small>]
        CB[Cerebellum]
        DB[(PostgreSQL)]
        VEC[(ChromaDB)]
        CACHE[(Redis)]
    end

    User <-->|messages| Discord
    Discord <-->|discord.py| GW
    GW -->|"aether:input"| CACHE
    CACHE -->|consume| Brain
    Brain <-->|"Streamable HTTP /mcp"| MCP
    Brain -->|"RAG query"| VEC
    MCP -->|SQL| DB
    Brain -->|"aether:reply_intent"| CB
    CB -->|"aether:output"| GW
    Brain -->|"aether:output<br/>(owner fast-path)"| GW

    style Aether fill:#1a1a2e,stroke:#e94560,stroke-width:2px,color:#eee
    style Brain fill:#0f3460,stroke:#e94560,color:#eee
    style MCP fill:#533483,stroke:#e94560,color:#eee
Loading

The diagram above shows the happy path for a regular user message. Owner messages skip the Cerebellum and go directly from the Brain to the Gateway. Every arrow labelled aether:* is a Redis Stream.


Architecture

graph TB
    subgraph Services["Microservices"]
        GW["<b>Gateway</b><br/>Discord bot Β· rate limiting<br/><i>:9090/metrics</i>"]
        BR["<b>Brain</b><br/>LangGraph StateGraph Β· MCP tools Β· mood<br/><i>:9091/metrics</i>"]
        MCP["<b>MCP Server</b><br/>FastMCP + FastAPI Β· 11 tools<br/><i>:8080 Β· :9092/metrics</i>"]
        CB["<b>Cerebellum</b><br/>Typing simulation<br/><i>:9093/metrics</i>"]
        DASH["<b>Dashboard</b><br/>FastAPI + WebSocket SPA<br/><i>:8501</i>"]
    end

    subgraph Data["Data Layer"]
        REDIS[("Redis 7<br/>Streams Β· Pub/Sub Β· Cache")]
        PG[("PostgreSQL 16<br/>Tasks Β· Mod logs Β· Reports")]
        CHROMA[("ChromaDB<br/>Vector embeddings Β· RAG")]
    end

    subgraph Obs["Observability"]
        PROM["Prometheus<br/><i>:9099</i>"]
        GRAF["Grafana<br/><i>:3000</i>"]
    end

    GW <-->|"aether:input / output"| REDIS
    BR <-->|"consume / publish"| REDIS
    CB <-->|"reply_intent / output"| REDIS
    DASH <-->|"status / persona_update"| REDIS

    BR <-->|"Streamable HTTP"| MCP
    BR -->|"RAG queries"| CHROMA
    MCP -->|"asyncpg"| PG
    DASH -->|"queries"| PG

    PROM -.->|"scrape /metrics"| GW
    PROM -.->|"scrape"| BR
    PROM -.->|"scrape"| MCP
    PROM -.->|"scrape"| CB
    GRAF -.->|"query"| PROM

    style Services fill:#16213e,stroke:#0f3460,color:#eee
    style Data fill:#1a1a2e,stroke:#533483,color:#eee
    style Obs fill:#1a1a2e,stroke:#e94560,color:#eee
Loading

Services

Service Role Stack
Gateway Discord bot β€” event ingestion, response delivery, rate limiting discord.py, Redis Streams
Brain Central reasoning engine β€” LangGraph StateGraph, MCP tools, mood engine LangGraph, LangChain, OpenRouter, ChromaDB (async), MCP SDK
MCP Server Tool execution via Model Context Protocol FastMCP, FastAPI, PostgreSQL, DuckDuckGo
Cerebellum Human-like typing delay simulation Redis Streams, asyncio
Dashboard Real-time monitoring & persona editor FastAPI, WebSocket, vanilla JS SPA, Redis, PostgreSQL
Prometheus Metrics collection & alerting prom/prometheus:v2.53.0
Grafana Metrics visualization grafana/grafana:11.1.0

Communication Streams

Stream Purpose Producer Consumer
aether:input Discord events β†’ Brain Gateway Brain
aether:output Actions β†’ Discord Brain, Cerebellum Gateway
aether:reply_intent Typing simulation triggers Brain Cerebellum
aether:status System status updates All Dashboard
aether:persona_update Persona hot-swap signals Dashboard Brain, Gateway

Microservice Topology

The following diagram shows every container, its network connections, and which ports are exposed to the host:

graph LR
    subgraph host["Host Machine"]
        direction TB

        subgraph net["aether_network (bridge)"]
            direction TB

            subgraph init["One-shot Init"]
                DB_INIT["db_init<br/><small>alembic upgrade head</small>"]
                REDIS_INIT["redis_init<br/><small>stream setup</small>"]
            end

            subgraph services["Long-lived Services"]
                GW["aether_gateway<br/>:9090"]
                BR["aether_brain<br/>:9091"]
                MCP["aether_mcp_server<br/>:8080 Β· :9092"]
                CB["aether_cerebellum<br/>:9093"]
                DASH["aether_dashboard<br/>:8501"]
            end

            subgraph data["Data Stores"]
                REDIS["aether_redis<br/>:6379"]
                PG["aether_postgres<br/>:5432"]
                CHROMA["aether_chromadb<br/>:8000"]
            end

            subgraph obs["Observability"]
                PROM["aether_prometheus<br/>:9099"]
                GRAF["aether_grafana<br/>:3000"]
            end
        end
    end

    DB_INIT -.->|depends| PG
    REDIS_INIT -.->|depends| REDIS
    GW -->|streams| REDIS
    BR -->|streams| REDIS
    BR -->|HTTP /mcp| MCP
    BR -->|async query| CHROMA
    MCP -->|asyncpg| PG
    CB -->|streams| REDIS
    DASH -->|queries| PG
    DASH -->|pub/sub| REDIS
    PROM -.->|scrape| GW
    PROM -.->|scrape| BR
    PROM -.->|scrape| MCP
    PROM -.->|scrape| CB
    GRAF -.->|datasource| PROM

    style host fill:#0d1117,stroke:#30363d,color:#c9d1d9
    style net fill:#161b22,stroke:#30363d,color:#c9d1d9
    style init fill:#1c2333,stroke:#8b949e,color:#c9d1d9
    style services fill:#16213e,stroke:#0f3460,color:#eee
    style data fill:#1a1a2e,stroke:#533483,color:#eee
    style obs fill:#1a1a2e,stroke:#e94560,color:#eee
Loading

Brain: LangGraph StateGraph

The Brain runs a compiled, multi-node reasoning graph. Each node is a pure async function of (state, config) β€” testable, observable, and composable.

flowchart TD
    START((START))
    IP["input_parser<br/><small>Hydrate state Β· sanitise Β· snapshot persona</small>"]
    RT["router<br/><small>Intent classification<br/>regex fast paths β†’ LLM fallback</small>"]
    RAG["rag_retriever<br/><small>Async ChromaDB + token windowing</small>"]
    SA["sentiment_analyzer<br/><small>Amygdala mood tracking</small>"]
    RSN["reasoner<br/><small>System prompt assembly Β· LLM invocation</small>"]
    TN["tool_node<br/><small>MCP tool execution</small>"]
    OF["output_formatter<br/><small>Persona formatting Β· Discord truncation</small>"]
    OR["output_router<br/><small>Redis publish β†’ Gateway or Cerebellum</small>"]
    END_NODE((END))

    START --> IP --> RT

    RT -->|"no_action"| END_NODE
    RT -->|"chat / command /<br/>moderation / task"| RAG

    RAG --> SA --> RSN

    RSN -->|"has tool_calls"| TN
    RSN -->|"no tool_calls"| OF

    TN -->|"loop back"| RSN

    OF --> OR --> END_NODE

    style START fill:#e94560,stroke:#e94560,color:#fff
    style END_NODE fill:#e94560,stroke:#e94560,color:#fff
    style RSN fill:#0f3460,stroke:#e94560,color:#eee
    style TN fill:#533483,stroke:#e94560,color:#eee
Loading

Graph nodes: input_parser β†’ router β†’ rag_retriever β†’ sentiment_analyzer β†’ reasoner ↔ tool_node β†’ output_formatter β†’ output_router

Every node is dependency-injected via config["configurable"] β€” the LLM, MCP client, Amygdala mood engine, ChromaDB collection, Redis client, persona, and feature flags are all passed through the config dict. No globals.

πŸ“– Full LangGraph documentation β†’


MCP Integration

The Brain auto-discovers tools from the MCP server via the Streamable HTTP transport β€” adding a new tool on the server makes it immediately available to the LLM with zero Brain changes.

sequenceDiagram
    participant B as Brain
    participant M as MCP Server

    Note over B,M: Startup β€” Tool Discovery
    B->>+M: POST /mcp (initialize)
    M-->>-B: 200 OK (capabilities)
    B->>+M: POST /mcp (tools/list)
    M-->>-B: 200 OK (11 tool schemas)

    Note over B: Convert β†’ LangChain StructuredTools
    Note over B: Bind tools to LLM
    Note over B: Build StateGraph with ToolNode
Loading

Available tools: task_crud, internet_search, discord_mod, discord_roles, server_report, file_ops, purge_messages, lock_channel, unlock_channel, set_slowmode, unban_user

πŸ“– Full MCP documentation β†’


Architectural Principles

Project Aether follows a set of deliberate design principles that inform every implementation decision:

Principle How It Manifests
Biological Metaphor Services are named after body parts (Brain, Gateway/Body, Cerebellum) β€” each has a single responsibility, just like an organism.
Event-Driven Decoupling Services never call each other directly. All communication flows through Redis Streams, making services independently deployable.
Dependency Injection Every graph node is a pure function of (state, config). No globals, no singletons β€” trivially testable with mocks.
Auto-Discovery over Config The Brain discovers MCP tools at startup. Adding a tool on the server requires zero Brain changes.
Graceful Degradation If ChromaDB is down β†’ RAG returns a placeholder. If MCP is unreachable β†’ bare-LLM mode. If Redis is down β†’ rate limiter fails open.
Observable by Default Every node emits Prometheus metrics. Every log line carries a trace_id. Every service exposes /metrics.
Schema-Driven Contracts Pydantic models define every inter-service message. Alembic manages every database schema change.
Security in Depth Prompt injection regex, message length limits, bearer token auth, parameterized queries, sandboxed file ops, network isolation.

Getting Started

Prerequisites

Quick Start

# 1. Clone the repository
git clone <repository-url>
cd Project_Aether

# 2. Configure environment
cp .env.example .env
# Edit .env β€” set DISCORD_TOKEN and OPENROUTER_API_KEY

# 3. Start everything
./scripts/start.sh

The startup script will:

  • βœ… Validate environment configuration
  • βœ… Build all Docker images
  • βœ… Start data stores (Redis, PostgreSQL, ChromaDB)
  • βœ… Run database migrations via Alembic (alembic upgrade head)
  • βœ… Initialize Redis streams
  • βœ… Start all services (Gateway, Brain, MCP Server, Cerebellum, Dashboard)
  • βœ… Start observability stack (Prometheus, Grafana)

Manual Setup

# Build and start
docker compose build --parallel
docker compose up -d

# Verify
docker compose ps

# View logs
docker compose logs -f brain

Accessing Interfaces

Interface URL Description
Dashboard http://localhost:8501 Real-time monitoring & persona editor
Grafana http://localhost:3000 Metrics dashboards (admin/aether)
Prometheus http://localhost:9099 Raw metrics & targets
MCP Server http://localhost:8080/health Health check endpoint

Stopping

# Graceful shutdown (preserves data)
./scripts/stop.sh

# Full cleanup (remove volumes)
./scripts/stop.sh --clean

# Or manually
docker compose down       # keep data
docker compose down -v    # remove volumes

Configuration

Environment Variables

Variable Description Default
DISCORD_TOKEN Discord bot token (required) β€”
OPENROUTER_API_KEY OpenRouter API key (required) β€”
OPENROUTER_MODEL LLM model to use meta-llama/llama-3.1-405b-instruct
POSTGRES_USER PostgreSQL username aether
POSTGRES_PASSWORD PostgreSQL password aether_secret
POSTGRES_DB PostgreSQL database name aether_db
MCP_AUTH_TOKEN Bearer token for MCP auth (unset = no auth)
LOG_LEVEL Logging verbosity INFO
GATEWAY_RATE_LIMIT Max messages per user per window 20
GATEWAY_RATE_WINDOW_SEC Rate limit window (seconds) 60
GRAFANA_ADMIN_PASSWORD Grafana admin password aether

Persona Customization

Edit shared/persona.json or use the Dashboard for live hot-swapping:

{
  "name": "Aether",
  "identity": "A sentient AI assistant with a calm, analytical demeanor",
  "system_prompt": "You are {name}. Your mood is {mood_description}...",
  "mood": 0.6,
  "energy": 0.8,
  "wpm": 160,
  "traits": ["Curious", "Self-aware", "Adaptable", "Evolving"]
}

Changes are hot-swapped via Redis pub/sub β€” no restart required.


Observability

Prometheus Metrics

Every service exposes a /metrics endpoint scraped by Prometheus:

Metric Type Description
aether_messages_processed_total Counter Messages processed by service/event type
aether_graph_latency_seconds Histogram End-to-end graph latency by routing decision
aether_llm_latency_seconds Histogram LLM inference latency by model
aether_graph_node_latency_seconds Histogram Per-node execution latency
aether_tool_calls_total Counter Tool invocations by tool name
aether_rag_latency_seconds Histogram ChromaDB query latency
aether_mood_current Gauge Current mood value (0.0–1.0)
aether_sentiment_classifications_total Counter Sentiment results by method/label
aether_rate_limit_hits_total Counter Rate limit triggers
aether_errors_total Counter Errors by service/component
aether_mcp_tools_discovered Gauge Number of MCP tools discovered
aether_service_up Gauge Service health indicator

Grafana Dashboard

The pre-provisioned Aether Overview dashboard includes 20 panels:

  • Service Health β€” up/down status for all services
  • Message Throughput β€” rates, dropped messages, hourly stats
  • LLM & Graph Performance β€” latency percentiles (p50/p95/p99), per-node breakdown
  • Tool Usage & MCP β€” tool call rates, tool latency, HTTP request latency
  • Mood & Sentiment β€” current mood gauge, mood over time, classification rates
  • Errors & Rate Limiting β€” error rates by component, rate limit triggers

Structured Logging

All services use structlog with:

  • JSON output in production (non-TTY), coloured console in development
  • Trace ID correlation β€” every request carries a trace_id across Gateway β†’ Brain β†’ MCP β†’ Cerebellum
  • stdlib compatibility β€” existing logging.getLogger() calls flow through structlog automatically

Security

Feature Implementation
Prompt Injection Protection 10 regex pattern families in sanitizer.py (role-override, ignore-instructions, jailbreak, base64 blobs, newline floods, etc.)
Message Length Limiting 4 000 char max before LLM, truncation with logging
Rate Limiting Redis sorted-set sliding window per user (configurable limit/window)
MCP Authentication Bearer token middleware on /mcp; /health always public
System Prompt Hardening Separate SystemMessage / HumanMessage boundaries + SECURITY RULES section
File Operations Sandboxing file_ops tool restricted to MCP_FILE_ROOT with path traversal blocking
Database Security Parameterized queries via asyncpg; network-isolated PostgreSQL

Database Migrations

Project Aether uses Alembic for schema management with async PostgreSQL support.

# Apply all pending migrations
alembic upgrade head

# Create a new migration
alembic revision --autogenerate -m "add users table"

# View migration history
alembic history

# Downgrade one revision
alembic downgrade -1

# Generate SQL without connecting (offline mode)
alembic upgrade head --sql

The Docker db_init container runs alembic upgrade head automatically at startup. The initial migration captures the full baseline schema: tasks, mod_logs, daily_reports tables, ENUM types, triggers, and the uuid-ossp extension.


Testing

# Install dev dependencies
uv sync --dev

# Run all tests (599 tests)
uv run pytest

# Run with verbose output
uv run pytest -v

# Run specific test file
uv run pytest tests/unit/test_async_chromadb.py

# Run with coverage
uv run pytest --cov=services --cov=shared

# Lint
uv run ruff check .

# Format check
uv run ruff format --check .

Test Suite Breakdown

File Tests Coverage
test_brain_graph.py 38 Graph routing, tool loops, owner bypass, mood tracking, edge cases
test_mcp_connectivity.py 6 MCP client connection, retry logic
test_mcp_tools.py 54 Individual tool execution, argument validation
test_amygdala.py 77 Mood engine, LLM sentiment, keyword fallback, mood scaling
test_async_chromadb.py 55 Async collections, token windowing, budget enforcement, metrics
test_alembic_setup.py 76 Migration structure, Alembic config, init_db refactoring
test_metrics.py 51 Prometheus counters, histograms, gauges, ASGI metrics app
test_sanitizer.py 56 Prompt injection patterns, length limits, whitespace normalization
test_models.py 57 Pydantic model serialization, validation, edge cases
test_redis_connector.py 38 Stream operations, consumer groups, error handling
test_cerebellum.py 25 Typing delay calculation, mood variance
test_logging.py 24 structlog setup, trace IDs, processor chain
test_rate_limiter.py 21 Sliding window, Redis sorted sets, fail-open
test_mcp_auth.py 21 Bearer token middleware, auth bypass
Total 599

Project Structure

Project_Aether/
β”œβ”€β”€ services/
β”‚   β”œβ”€β”€ brain/                    # LangGraph reasoning engine
β”‚   β”‚   β”œβ”€β”€ nodes/                # Graph node implementations
β”‚   β”‚   β”‚   β”œβ”€β”€ input_parser.py   #   Hydrate state, snapshot persona
β”‚   β”‚   β”‚   β”œβ”€β”€ router.py         #   Intent classification
β”‚   β”‚   β”‚   β”œβ”€β”€ rag_retriever.py  #   Async ChromaDB + token windowing
β”‚   β”‚   β”‚   β”œβ”€β”€ sentiment_analyzer.py  # Amygdala mood tracking
β”‚   β”‚   β”‚   β”œβ”€β”€ reasoner.py       #   LLM invocation + tool calls
β”‚   β”‚   β”‚   β”œβ”€β”€ output_formatter.py  # Persona formatting + truncation
β”‚   β”‚   β”‚   β”œβ”€β”€ output_router.py  #   Redis stream publishing
β”‚   β”‚   β”‚   └── sanitizer.py      #   Prompt injection protection
β”‚   β”‚   β”œβ”€β”€ graph.py              # StateGraph builder + compiler
β”‚   β”‚   β”œβ”€β”€ state.py              # AgentState TypedDict schema
β”‚   β”‚   β”œβ”€β”€ amygdala.py           # Mood engine (LLM + keyword sentiment)
β”‚   β”‚   β”œβ”€β”€ mcp_client.py         # MCPClientManager (Streamable HTTP)
β”‚   β”‚   β”œβ”€β”€ service.py            # BrainService orchestrator
β”‚   β”‚   └── config.py             # Pydantic-settings configuration
β”‚   β”œβ”€β”€ gateway/                  # Discord bot service
β”‚   β”‚   β”œβ”€β”€ service.py            # GatewayService
β”‚   β”‚   └── rate_limiter.py       # Redis sliding-window rate limiter
β”‚   β”œβ”€β”€ mcp_server/               # MCP tool server (FastMCP + FastAPI)
β”‚   β”‚   β”œβ”€β”€ server.py             # Server setup, auth middleware, /metrics
β”‚   β”‚   └── tools/                # Tool handlers
β”‚   β”œβ”€β”€ cerebellum/               # Typing simulation service
β”‚   └── dashboard/                # FastAPI + WebSocket dashboard
β”œβ”€β”€ shared/
β”‚   β”œβ”€β”€ config.py                 # Centralised Pydantic-settings configs
β”‚   β”œβ”€β”€ models.py                 # Pydantic data models (shim)
β”‚   β”œβ”€β”€ models/                   # Model sub-modules
β”‚   β”‚   β”œβ”€β”€ events.py             # DiscordEvent, InterServiceMessage
β”‚   β”‚   β”œβ”€β”€ persona.py            # PersonaModel, SentimentBand
β”‚   β”‚   β”œβ”€β”€ directives.py         # BrainOutput, ToolCallDirective
β”‚   β”‚   └── tasks.py              # TaskModel, TaskStatus, TaskPriority
β”‚   β”œβ”€β”€ redis_connector.py        # RedisStreamClient
β”‚   β”œβ”€β”€ logging_config.py         # structlog setup + trace ID context
β”‚   β”œβ”€β”€ metrics.py                # Prometheus metrics definitions
β”‚   β”œβ”€β”€ persona.py                # Persona loader + hot-swap listener
β”‚   └── db/
β”‚       └── init_db.py            # Alembic-driven database initialisation
β”œβ”€β”€ alembic/                      # Database migration framework
β”‚   β”œβ”€β”€ env.py                    # Async engine config (asyncpg)
β”‚   β”œβ”€β”€ script.py.mako            # Migration template
β”‚   └── versions/
β”‚       └── 2026_03_06_001_initial_schema.py  # Baseline schema
β”œβ”€β”€ tests/
β”‚   β”œβ”€β”€ conftest.py               # Shared fixtures (mocks, fakes)
β”‚   β”œβ”€β”€ integration/              # Graph, MCP, tool tests
β”‚   └── unit/                     # Node, model, service unit tests
β”œβ”€β”€ docker/
β”‚   β”œβ”€β”€ Dockerfile.*              # Per-service Dockerfiles
β”‚   β”œβ”€β”€ prometheus.yml            # Prometheus scrape config
β”‚   └── grafana/provisioning/     # Datasource + dashboard auto-provisioning
β”œβ”€β”€ docs/
β”‚   β”œβ”€β”€ langgraph_workflow.md     # LangGraph reference documentation
β”‚   └── mcp_integration.md        # MCP protocol reference
β”œβ”€β”€ docker-compose.yml            # Full stack orchestration
β”œβ”€β”€ pyproject.toml                # Dependencies + tool config (UV, ruff, pytest)
β”œβ”€β”€ alembic.ini                   # Alembic configuration
β”œβ”€β”€ ARCHITECTURE.md               # Architecture deep-dive
β”œβ”€β”€ CHANGELOG.md                  # Version history
└── .github/
    β”œβ”€β”€ workflows/ci.yml          # GitHub Actions (tests + lint)
    └── CONTRIBUTING.md           # Contribution guidelines

Event Flow

Message Processing (Full Pipeline)

sequenceDiagram
    actor User
    participant D as Discord
    participant GW as Gateway
    participant R as Redis Streams
    participant BR as Brain (StateGraph)
    participant MCP as MCP Server
    participant PG as PostgreSQL
    participant CB as Cerebellum

    User->>D: Send message
    D->>GW: on_message event
    GW->>GW: Validate Β· rate-limit Β· generate trace_id
    GW->>R: XADD aether:input

    R->>BR: XREADGROUP (consume)
    Note over BR: StateGraph invocation

    BR->>BR: input_parser β€” sanitise, snapshot persona + mood
    BR->>BR: router β€” classify intent
    BR->>BR: rag_retriever β€” async ChromaDB query + token windowing
    BR->>BR: sentiment_analyzer β€” Amygdala mood adjustment
    BR->>BR: reasoner β€” assemble system prompt, invoke LLM

    opt LLM requests tool calls
        BR->>MCP: POST /mcp (tools/call: task_crud)
        MCP->>PG: INSERT INTO tasks …
        PG-->>MCP: OK
        MCP-->>BR: 200 OK (tool result)
        BR->>BR: reasoner (2nd pass β€” produce final text)
    end

    BR->>BR: output_formatter β€” persona formatting + truncation
    BR->>BR: output_router β€” choose delivery path

    alt Owner message
        BR->>R: XADD aether:output
    else Regular user
        BR->>R: XADD aether:reply_intent
        R->>CB: XREADGROUP (consume)
        CB->>CB: Calculate typing delay (WPM + mood + variance)
        CB->>D: Show typing indicator
        CB->>R: XADD aether:output
    end

    R->>GW: XREADGROUP (consume)
    GW->>D: Send reply
    D->>User: Display response
Loading

Persona Hot-Swap

sequenceDiagram
    actor Op as Operator
    participant DASH as Dashboard
    participant VOL as Shared Volume
    participant R as Redis Pub/Sub
    participant BR as Brain
    participant GW as Gateway

    Op->>DASH: Edit persona in UI
    DASH->>VOL: Write persona.json
    DASH->>R: PUBLISH aether:persona_update

    par Brain reload
        R->>BR: Receive notification
        BR->>VOL: Reload persona.json
        BR->>BR: Reinitialise Amygdala
    and Gateway reload
        R->>GW: Receive notification
        GW->>VOL: Reload persona.json
    end

    Note over BR: Next graph invocation uses new persona<br/>(snapshotted by input_parser)
Loading

Key Dependencies

Package Version Purpose
langgraph β‰₯0.2.0 StateGraph framework for the reasoning pipeline
langchain / langchain-openai β‰₯0.2.0 / β‰₯0.1.0 LLM abstraction and OpenRouter integration
mcp β‰₯0.1.0 Model Context Protocol SDK (Streamable HTTP client/server)
chromadb β‰₯0.5.0 Vector store for RAG (AsyncHttpClient)
tiktoken β‰₯0.7.0 Token counting for RAG context budgeting
discord.py β‰₯2.4.0 Discord API client
fastapi / uvicorn β‰₯0.111.0 / β‰₯0.30.0 MCP server HTTP framework
pydantic / pydantic-settings β‰₯2.7 / β‰₯2.3 Data validation and configuration management
redis β‰₯5.0.4 Event streaming (Redis Streams) and caching
asyncpg β‰₯0.29.0 Async PostgreSQL driver
alembic / sqlalchemy β‰₯1.13.0 / β‰₯2.0.0 Database migration framework
structlog β‰₯25.5.0 Structured JSON logging with trace-ID correlation
prometheus-client β‰₯0.21.0 Prometheus metrics instrumentation
ddgs β‰₯9.0.0 DuckDuckGo search integration

Troubleshooting

Services Won't Start

docker info                          # Check Docker is running
docker compose build --no-cache      # Rebuild from scratch
docker compose logs db_init          # Check migration errors

Brain Not Responding

docker compose logs brain | grep -i error
docker exec aether_redis redis-cli GET heartbeat:brain
curl http://localhost:9091/metrics | grep aether_service_up

MCP Tool Errors

curl http://localhost:8080/health                          # Check MCP server
docker compose logs mcp_server | grep -i error
curl http://localhost:9091/metrics | grep mcp_tools        # Check discovery count

Rate Limiting Issues

docker compose logs gateway | grep "rate.limit"
curl http://localhost:9090/metrics | grep rate_limit

Grafana Not Loading

docker compose logs grafana
# Default credentials: admin / aether
# Dashboard: Aether Overview (auto-provisioned)

Documentation

Document Description
docs/langgraph_workflow.md Complete LangGraph reference β€” topology, state schema, node descriptions, conditional edges, tool-calling loop, observability, extension guide
docs/mcp_integration.md MCP protocol reference β€” Streamable HTTP transport, auth, tool discovery, protocol flow diagrams, adding new tools
ARCHITECTURE.md System architecture deep-dive β€” services, data layer, event flow, Docker setup, security, scalability
CHANGELOG.md Version history
.github/CONTRIBUTING.md Contribution guidelines

External References


Future Directions

The following roadmap captures planned enhancements that would deepen Aether's capabilities:

  • Multi-user personas β€” per-user personality profiles with independent mood tracking
  • Voice integration β€” Discord voice channel support via discord.py voice client
  • Distributed tracing β€” Jaeger/Tempo integration to correlate trace_id across services with full span trees
  • Alertmanager β€” Prometheus alerting rules with PagerDuty/Slack notifications for SLA breaches
  • Plugin system β€” hot-loadable MCP tool plugins without server restart
  • Multi-model A/B testing β€” route a percentage of traffic to a challenger model and compare latency/quality metrics
  • Token rotation β€” short-lived MCP_AUTH_TOKEN via HashiCorp Vault or AWS Secrets Manager
  • Persistent checkpointer β€” Redis-backed LangGraph checkpointer for cross-restart conversation memory
  • Streaming responses β€” SSE/WebSocket streaming from LLM to Discord for long-form replies
  • Horizontal Brain scaling β€” leader election or consistent hashing for multi-instance Brain deployment
%%{init: {'theme': 'dark'}}%%
timeline
    title Project Aether Roadmap
    section Foundation (Done)
        Microservices architecture : 5 services
        LangGraph StateGraph : 8-node reasoning pipeline
        MCP Streamable HTTP : Auto-discovery + 11 tools
        Prometheus + Grafana : 20-panel dashboard
        599 tests : Unit + integration
    section Next
        Persistent checkpointer : Redis-backed memory
        Distributed tracing : Jaeger/Tempo spans
        Alertmanager : SLA breach notifications
    section Future
        Voice integration : Discord voice channels
        Plugin system : Hot-loadable MCP tools
        Multi-model A/B : Challenger model routing
        Horizontal scaling : Multi-instance Brain
Loading

Contributing

This is a portfolio/showcase project, but contributions are welcome!

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Follow existing code style (type hints, docstrings, Pydantic models)
  4. Add tests for new functionality
  5. Ensure ruff check . and pytest pass
  6. Submit a pull request

See CONTRIBUTING.md for detailed guidelines.


License

MIT License β€” See LICENSE for details.


Built with: Python 3.12 β€’ LangGraph β€’ MCP β€’ ChromaDB β€’ Discord.py β€’ FastAPI β€’ Redis β€’ PostgreSQL β€’ Alembic β€’ Prometheus β€’ Grafana β€’ structlog β€’ Docker

Project Aether β€” autonomous, observable, and architecturally elegant.

About

🧠 A decentralized, event-driven AI Lifeform for Discord β€” featuring MCP (Model Context Protocol), microservices architecture, vector memory, and autonomous server management.

Topics

Resources

License

Code of conduct

Contributing

Security policy

Stars

Watchers

Forks

Contributors