🧠 AzerothLM

An AI research companion for World of Warcraft — context-aware, multi-provider, and journal-based.

AzerothLM bridges the game's sandboxed Lua environment with modern Large Language Models. It reads your character's actual state — gear, professions, quests, reputations, zone, gold — and makes that context available to an AI that answers specific, actionable questions about your character, not generic ones.


💡 What Can You Do With It?

Pre-raid gear planning

You're a level 60 Hunter sitting in Stormwind with a mix of dungeon blues and quest greens. AzerothLM reads every equipped slot and tells you exactly which pieces to upgrade first, which quest rewards you're about to miss, and which dungeon bosses to prioritize — without you typing a single item name.

Farming route optimization

Mining 300, Skinning 306, currently questing in Hellfire Peninsula with 14 active quests. Ask AzerothLM for the best gold-per-hour route and it answers with specifics — which nodes, which mobs, which respawn paths — tailored to your professions and current location.

Reputation & attunement planning

Hostile with The Aldor, Neutral with The Sha'tar, and trying to figure out what to do next. AzerothLM traces your attunement chain, maps out the rep grind, and identifies which of your active quests feed into it — all from your actual standings and quest log.


๐Ÿ—๏ธ Architecture: The Air-Gap Bridge

WoW addons run in a sandboxed Lua environment with no internet access. AzerothLM bridges this gap using a file-based relay:

  1. Context Collection — The addon scans your equipped gear, profession levels, active quests, reputation standings, and talent points into SavedVariables.
  2. Research Input — You create topics and ask questions through the relay CLI or MCP tools.
  3. AI Processing — The relay sends your question plus full character context to an LLM via LiteLLM, supporting multiple providers.
  4. Signal File — The relay writes the AI response back to AzerothLM_Signal.lua.
  5. In-Game Sync — Type /reload in-game to load the updated journal.

✨ Key Features

  • 🤖 Context-Aware AI — Reads your equipped gear (all 19 slots), profession levels, active quest log, reputation standings, talent distribution, zone, level, class, and gold to give character-specific answers.
  • 📖 Research Journal — Organize questions into named topics with full multi-turn Q&A history. The AI remembers the conversation within each topic.
  • 🔀 Multi-Provider Support — Switch between Google Gemini, OpenAI, Anthropic Claude, or local Ollama models at runtime. No restart required.
  • 🖥️ Interactive CLI — Rich terminal interface with commands for topic management, model switching, and diagnostics.
  • 🔌 MCP Server Mode — Run as a Model Context Protocol server for integration with Claude Code or any MCP-compatible AI agent.
  • 🎮 In-Game Journal Viewer — Draggable, scrollable frame with topic navigation, item quality colorization, right-click context menus, and mouse wheel support.
  • ⚙️ Runtime Configuration — Add API keys (/model add), switch models (/model switch), and toggle test mode (/test on|off) without editing files.
  • 🧪 Test Mode — Validate your configuration and test the full pipeline without consuming API credits.
  • 🔄 Response Caching — Identical queries return instantly from cache, saving API usage and latency.
  • 🐛 Debug Mode — --debug flag or DEBUG=true in .env enables detailed diagnostic logging to help troubleshoot any issues.

📋 Requirements

  • Game: World of Warcraft TBC Classic / Anniversary Edition (Interface version 20505)
  • Runtime: Python 3.10+
  • Python Libraries: litellm, python-dotenv, rich, mcp, pyfiglet
  • API Key: Google Gemini (recommended — free tier available), OpenAI, Anthropic, or a local Ollama instance

⚙️ Installation

1. Addon Setup

Copy the AzerothLM folder into your WoW AddOns directory:

World of Warcraft/_anniversary_/Interface/AddOns/AzerothLM/

2. Python Environment

pip install -r requirements.txt

3. Configuration

Copy .env.example to .env and configure:

cp .env.example .env

Edit .env with your paths and API key:

# Path to your account's SavedVariables file
WOW_SAVED_VARIABLES_PATH=C:\...\WTF\Account\YOUR_ACCOUNT\SavedVariables\AzerothLM.lua

# Path to the installed addon folder
WOW_ADDON_PATH=C:\...\Interface\AddOns\AzerothLM

# Add at least one API key (or use /model add at runtime)
GEMINI_API_KEY=your_key_here
MODEL_NAME=gemini/gemini-2.5-flash

4. First Run

  1. Log into WoW, type /alm scan to collect your character context, then /reload
  2. Start the relay: python AzerothLM_Relay.py
  3. Create your first topic: /new gear-upgrades
  4. Ask a question: /ask gear-upgrades What should I upgrade first?
  5. Type /reload in-game to see the response in the journal

🖥️ CLI Mode

python AzerothLM_Relay.py
| Command | Description |
| --- | --- |
| /new <title> | Create a new research topic |
| /ask <slug> <question> | Ask a question on a topic |
| /topics | List all topics |
| /view <slug> | View full Q&A history for a topic |
| /delete <slug> | Delete a topic |
| /model | Show configured providers and active model |
| /model add | Add a new provider API key interactively |
| /model switch | Switch to a different model at runtime |
| /test on\|off | Toggle test mode (mock responses, no API cost) |
| /context | Show current character context |
| /usage | Show API usage stats and token counts |
| /status | Show relay configuration |
| /help | Show all commands |
| /quit | Exit the relay |

AzerothLM CLI showing help output and command reference


🔌 MCP Server Mode

AzerothLM can run as a Model Context Protocol server, exposing its research journal tools to any MCP-compatible AI agent — Claude Code, custom apps, or your own tooling.

Starting the server

python AzerothLM_Relay.py --mcp

# With diagnostic logging:
python AzerothLM_Relay.py --mcp --debug 2>>debug.log

Claude Code integration

Add to your .mcp.json:

{
  "mcpServers": {
    "azerothlm": {
      "command": "python",
      "args": ["path/to/AzerothLM_Relay.py", "--mcp"]
    }
  }
}

Once connected, Claude Code can call all six tools directly in conversation — creating topics, asking character-aware questions, and managing your journal without the CLI.

Exposed Tools

| Tool | Description | Parameters |
| --- | --- | --- |
| create_topic | Create a new research topic in the journal | title: str |
| ask_question | Ask a question with full character context and topic history | topic_slug: str, question: str |
| list_topics | List all topics with entry counts and last-updated timestamps | — |
| get_topic | Retrieve the full Q&A history for a topic | topic_slug: str |
| get_character_context | Read live character data from WoW SavedVariables | — |
| delete_topic | Delete a topic and sync the removal to the in-game journal | topic_slug: str |

Character Context

get_character_context returns a JSON object with everything the addon has collected. Every ask_question call includes this automatically.

| Category | Data |
| --- | --- |
| Player | Level, class, race, current zone and subzone, gold on hand |
| Gear | All 19 equipment slots with item name and ID (Head through Tabard) |
| Professions | All professions and weapon skills with current rank and max rank |
| Quests | Active quest log with quest ID, title, level, and completion status |
| Reputations | All tracked factions with current standing (Hostile → Exalted) |
| Talents | Talent point distribution across all three trees |
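As an illustration of the categories above, the returned object might look like the following. Every field name and value here is made up for the example; the actual schema is defined by the addon:

```python
import json

# Illustrative only: hypothetical field names and invented values.
character_context = {
    "player": {"name": "Examplename", "level": 60, "class": "Hunter",
               "race": "Dwarf", "zone": "Hellfire Peninsula",
               "subzone": "Honor Hold", "gold": 412},
    "gear": {"Head": {"name": "Some Helm", "itemId": 12345}},
    "professions": {"Mining": {"rank": 300, "maxRank": 375}},
    "quests": [{"id": 12345, "title": "A hypothetical quest",
                "level": 61, "complete": False}],
    "reputations": {"The Aldor": "Hostile", "The Sha'tar": "Neutral"},
    "talents": {"Beast Mastery": 41, "Marksmanship": 20, "Survival": 0},
}
print(json.dumps(character_context, indent=2))
```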

🎮 In-Game Commands

After using the CLI to ask questions, type /reload to sync responses into the journal.

| Command | Description |
| --- | --- |
| /alm | Toggle the journal window |
| /alm scan | Refresh character context (gear, professions, quests, reputations) |
| /alm refresh | Reload UI shortcut |
| /alm topics | List all topics in chat |
| /alm delentry <N> | Delete entry N from the current topic |
| /alm reset | Clear and rebuild the journal from the latest relay data |
| /alm wipe | Wipe all journal data and queue deletions on the relay side |
| /alm help | Show all in-game commands |

AzerothLM in-game journal showing a character-aware gear upgrade response

AzerothLM in-game command reference printed to chat via /alm help


🧪 Test Mode

Validate your setup without consuming API credits:

/test on    — enable test mode, run configuration checks
/test off   — disable test mode

When active, all AI responses are replaced with mock data prefixed [TEST MODE]. The /test on command also runs a full diagnostic verifying your .env, API key, SavedVariables path, and addon path.
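The gating logic can be pictured as a simple wrapper around the LLM call; `answer` and `call_llm` are hypothetical names for this sketch, not the relay's real functions:

```python
def answer(question: str, call_llm, test_mode: bool = False) -> str:
    """Return a mock response in test mode, otherwise call the real LLM."""
    if test_mode:
        # Mock data keeps the full pipeline exercised without API cost.
        return f"[TEST MODE] Mock response for: {question}"
    return call_llm(question)
```

Because the mock path returns through the same code that writes the signal file, test mode still verifies the journal sync end to end.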


📋 Changelog

v0.1-beta.1 (in testing)

  • 🆕 MIT License added — AzerothLM is now open source
  • ✨ Stale model IDs updated in .env.example (Anthropic, OpenAI, Gemini)
  • ✨ Requirements pinned to minimum versions for reproducible installs
  • 🐛 Fixed user-facing "MCP tools" jargon — messages now say "relay CLI"

v0.1-alpha.3

  • 🆕 Context-aware prompts — questions route to relevant character data sections only
  • 🆕 First-run interactive path setup — relay guides new users through .env configuration
  • 🆕 Help system rework — /help shows category table, /help <cmd> shows detail panel
  • 🐛 Fixed response cache stale hits — history fingerprint included in cache key

v0.1-alpha.2

  • 🆕 In-game journal management commands: /alm reset, /alm wipe, /alm help
  • 🆕 Item quality colorization in the journal viewer — gear names display in their rarity color
  • 🆕 Richer character context: reputation standings and talent distribution now included
  • 🆕 Debug mode: --debug startup flag and DEBUG=true env var for diagnostic logging
  • ✨ CLI UX improvements: ASCII gradient header, loading spinners, cleaner output
  • ✨ Improved error messages and input validation throughout the CLI
  • ✨ Contextual hints when a topic slug isn't found
  • 🐛 Fixed signal sync edge cases — empty journal now correctly signals in-game cleanup
  • 🐛 Fixed item quality pattern matching — more lenient parsing with name-based fallback

v0.1-alpha.1

  • 🆕 Multi-provider LLM support: Google Gemini, OpenAI, Anthropic, local Ollama
  • 🆕 Interactive CLI with /model add, /model switch, /test on|off, /status, /usage
  • 🆕 MCP server mode for AI agent integration
  • 🆕 Research journal with named topics and full multi-turn Q&A history
  • 🆕 In-game journal viewer: draggable frame, topic tabs, right-click menus, mouse wheel scroll
  • 🆕 Response caching and built-in rate limiting with exponential backoff
  • 🆕 Air-gap bridge architecture: file-based relay between WoW sandbox and external AI

💬 Feedback

AzerothLM is in public beta — core features are stable but rough edges exist. If you hit a bug, have a suggestion, or want to share how you're using it, open an issue on GitHub.
