
fhillipgcastillo/agentic-memory


LLM Wiki

A persistent, compounding knowledge system powered by LLMs - plus a universal memory layer for agentic AI tools.

What This Is

Two systems in one:

  1. Wiki System - A persistent wiki where an LLM incrementally builds and maintains structured knowledge from sources
  2. Memory System - A shared memory layer any AI tool can use to store/retrieve past conversations, solutions, and context

Directory Structure

llm-memory-wiki/
├── raw/                   # Immutable sources (never modify)
│   ├── sources/           # Articles, documents
│   └── assets/            # Images
├── wiki/                  # LLM-generated wiki content
│   ├── entities/          # People, places
│   ├── concepts/          # Ideas, topics
│   ├── sources/           # Source summaries
│   ├── index.md           # Content catalog
│   └── log.md             # Activity log
├── .memory/               # Shared memory system
│   ├── wiki/              # Stored memories
│   └── scripts/           # MCP server, CLI
├── .wiki/                 # qmd collection data (wiki)
├── AGENTS.md              # Schema for OpenCode/Pi, Codex
├── CLAUDE.md              # Schema for Claude Code
└── IDEA.md                # Original concept document

Quick Start

1. Install Dependencies

# Install qmd (search engine)
npm install -g @tobilu/qmd

# Initialize wiki search
cd llm-memory-wiki
qmd collection add wiki --name wiki
qmd embed

2. Open in Your AI Tool

OpenCode / Pi:

  • Open the project folder in OpenCode or Pi
  • AGENTS.md provides the schema for ingest/query/lint workflows

Claude Code:

  • Open the folder
  • CLAUDE.md provides the schema

Other tools:

  • Reference AGENTS.md for the agent schema

Wiki System

Core Concept

Instead of RAG (rediscovering knowledge on every query), the LLM maintains a persistent wiki that compounds over time.

  • Raw sources → Immutable documents
  • Wiki → LLM-generated, cross-linked pages
  • Schema → Conventions for the LLM

Workflows

Ingest a source:

# 1. Add source to raw/sources/
# 2. Tell LLM to process it
# 3. LLM creates: summary, concept pages, updates index, logs
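
The ingest steps above can be sketched as a shell session. The file name and contents here are purely illustrative, and the `qmd embed` refresh is left as a comment since it assumes qmd is already installed:

```shell
# Stage a new source under raw/sources/ (filename is a made-up example)
mkdir -p raw/sources
cat > raw/sources/example-article.md <<'EOF'
# Example Article
Some source text for the LLM to ingest.
EOF

# Next: ask the LLM (per AGENTS.md) to summarize it into wiki/,
# then refresh the search index:
#   qmd embed
ls raw/sources/
```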

Query the wiki:

# Ask questions
# LLM searches wiki, synthesizes answer
# Can file valuable answers back as new pages

Lint the wiki:

# Check for contradictions
# Find orphaned pages
# Identify missing cross-references
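
One of these checks can be sketched directly in shell: an orphan scan that flags pages never mentioned in wiki/index.md. The fixture files below are invented for illustration; a real wiki follows the directory tree above.

```shell
# Fixture wiki (illustrative content only)
mkdir -p wiki/concepts
printf '# Index\n- [alpha](concepts/alpha.md)\n' > wiki/index.md
printf '# Alpha\n' > wiki/concepts/alpha.md
printf '# Beta\n' > wiki/concepts/beta.md    # deliberately left out of the index

# Flag any concept page whose filename never appears in index.md
for page in wiki/concepts/*.md; do
  name=$(basename "$page")
  grep -q "$name" wiki/index.md || echo "orphan: $page"
done
# → orphan: wiki/concepts/beta.md
```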

Search

# BM25 search
qmd search "keyword"

# Semantic search
qmd vsearch "concept"

# Hybrid (BM25 + vector + rerank)
qmd query "question"

Memory System

A persistent memory layer any AI tool can use - it stores conversations, solutions, and conclusions that compound over time.

Quick Start

# Initialize memory collection
cd llm-memory-wiki
qmd collection add .memory/wiki --name memory

# Start MCP server (for any agentic tool)
node .memory/scripts/mcp-server.js &

MCP Tools

The MCP server exposes these tools:

Tool              Description
memory_store      Store a memory (conversation, solution, conclusion, issue, context)
memory_get        Search past memories
memory_query      Natural language search via qmd
memory_context    Get current project context
memory_patterns   Get recurring solution patterns
memory_stats      Get memory statistics

Integration

Claude Code

Add to your project's MCP configuration:

{
  "mcpServers": {
    "memory": {
      "command": "node",
      "args": [".memory/scripts/mcp-server.js"]
    }
  }
}

Then in conversation:

memory_store --content "Fixed the auth bug by clearing token on logout" --type solution
memory_query "how did we handle auth tokens"

OpenCode / Pi

Option 1 - MCP (add to .opencode/config.json):

{
  "mcpServers": {
    "memory": {
      "command": "node",
      "args": [".memory/scripts/mcp-server.js"]
    }
  }
}

Option 2 - Skill (reference MEMORY_SKILL.md in your prompts)

Option 3 - CLI:

source .memory/scripts/memory.sh
memory store -c "remember this" -t solution

Codex / Other MCP Clients

{
  "mcpServers": {
    "memory": {
      "command": "node",
      "args": [".memory/scripts/mcp-server.js"]
    }
  }
}

Usage Examples

// Via MCP JSON-RPC (fetch requires an absolute URL with a scheme)
await fetch('http://localhost:8181/mcp', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    method: 'memory_store',
    params: {
      content: 'Fixed login bug by clearing localStorage',
      type: 'solution',
      tags: 'auth,bug'
    }
  })
});

// Query past solutions
await fetch('http://localhost:8181/mcp', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    method: 'memory_query',
    params: { query: 'how did we solve auth issue?' }
  })
});

CLI Alternative

# Source the shell functions
source .memory/scripts/memory.sh

# Store
memory store -c "remember this" -t solution

# Get
memory get "search term"

# Get patterns
memory patterns

# Query (uses qmd)
memory query "how did we X"

Files Reference

File                            Purpose
IDEA.md                         Original concept document
AGENTS.md                       Schema for OpenCode/Pi, Codex
CLAUDE.md                       Schema for Claude Code
.memory/MEMORY.md               Memory system docs
.memory/MEMORY_SKILL.md         Skill file for quick reference
.memory/scripts/mcp-server.js   MCP server
.memory/scripts/memory.sh       CLI functions

Requirements

  • Node.js 18+
  • npm or bun
  • qmd (@tobilu/qmd)

License

MIT - Use freely.
