diff --git a/INSTRUCTIONS.md b/INSTRUCTIONS.md
index b35a820..8d8a2cf 100644
--- a/INSTRUCTIONS.md
+++ b/INSTRUCTIONS.md
@@ -26,16 +26,20 @@ The MCP server is built with TypeScript and communicates over stdio using the Mo
- `static-analysis.ts` - Native linter runner (tsc, eslint, py_compile, cargo check, go vet).
- `propose-commit.ts` - Code gatekeeper validating headers, FEATURE tag, no inline comments, nesting, file length.
- `feature-hub.ts` - Obsidian-style feature hub navigator with bundled skeleton views.
+- `memory-tools.ts` - Memory graph MCP wrappers (upsert, relate, search, prune, interlink, traverse).
+
+The memory graph is a **Retrieval-Augmented Generation (RAG)** system. Agents MUST use `search_memory_graph` at the start of every task to retrieve prior context, and persist learnings with `upsert_memory_node` and `create_relation` after completing work. This prevents redundant exploration and builds cumulative knowledge across sessions.
**Core Layer** (continued):
- `hub.ts` - Wikilink parser for `[[path]]` links, cross-link tags, hub discovery, orphan detection.
+- `memory-graph.ts` - In-memory property graph with JSON persistence, decay scoring, and auto-similarity edges.
**Git Layer** (`src/git/`):
- `shadow.ts` - Shadow restore point system for undo without touching git history.
-**Entry Point**: `src/index.ts` registers 11 MCP tools and starts the stdio transport. Accepts an optional CLI argument for the target project root directory (defaults to `process.cwd()`).
+**Entry Point**: `src/index.ts` registers 17 MCP tools and starts the stdio transport. Accepts an optional CLI argument for the target project root directory (defaults to `process.cwd()`).
## Environment Variables
@@ -136,6 +140,12 @@ Strict order within every file:
| `list_restore_points` | See undo history. |
| `undo_change` | Revert a bad AI change without touching git. |
| `get_feature_hub` | Browse feature graph hubs. Find orphaned files. |
+| `upsert_memory_node` | Create/update memory nodes (concept, file, symbol, note) with auto-embedding. |
+| `create_relation` | Create typed edges between memory nodes (depends_on, implements, etc.). |
+| `search_memory_graph` | Semantic search + graph traversal across 1st/2nd-degree neighbors. |
+| `prune_stale_links` | Remove decayed edges (e^(-λt)) and orphan nodes periodically. |
+| `add_interlinked_context` | Bulk-add nodes with auto-similarity linking (cosine ≥ 0.72). |
+| `retrieve_with_traversal` | Start from a node, walk outward, return scored neighbors by decay and depth. |
## Anti-Patterns to Avoid
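The auto-similarity rule used by `add_interlinked_context` (cosine ≥ 0.72, per the table above) reduces to a pairwise threshold check. A minimal self-contained sketch follows; `autoLinkPairs` is an illustrative helper name, not an actual export of `memory-graph.ts`, though `cosine` mirrors the module's implementation.

```typescript
// Pairwise auto-linking sketch: any two embeddings with cosine
// similarity >= 0.72 would receive a similar_to edge.
const SIMILARITY_THRESHOLD = 0.72;

function cosine(a: number[], b: number[]): number {
  const len = Math.min(a.length, b.length);
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < len; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  const denom = Math.sqrt(normA) * Math.sqrt(normB);
  return denom === 0 ? 0 : dot / denom;
}

// Returns index pairs (i, j) that would be linked with a similar_to edge.
function autoLinkPairs(embeddings: number[][]): Array<[number, number]> {
  const pairs: Array<[number, number]> = [];
  for (let i = 0; i < embeddings.length; i++) {
    for (let j = i + 1; j < embeddings.length; j++) {
      if (cosine(embeddings[i], embeddings[j]) >= SIMILARITY_THRESHOLD) {
        pairs.push([i, j]);
      }
    }
  }
  return pairs;
}
```

Note the threshold is deliberately strict: two vectors at 45° score ≈ 0.707 and stay unlinked.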
diff --git a/README.md b/README.md
index 69d9540..13d7db9 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,7 @@
Semantic Intelligence for Large-Scale Engineering.
-Context+ is an MCP server designed for developers who demand 99% accuracy. By combining Tree-sitter AST parsing, Spectral Clustering, and Obsidian-style linking, Context+ turns a massive codebase into a searchable, hierarchical feature graph.
+Context+ is an MCP server designed for developers who demand 99% accuracy. By combining RAG, Tree-sitter AST, Spectral Clustering, and Obsidian-style linking, Context+ turns a massive codebase into a searchable, hierarchical feature graph.
https://github.com/user-attachments/assets/a97a451f-c9b4-468d-b036-15b65fc13e79
@@ -39,6 +39,17 @@ https://github.com/user-attachments/assets/a97a451f-c9b4-468d-b036-15b65fc13e79
| `list_restore_points` | List all shadow restore points created by `propose_commit`. Each captures file state before AI changes. |
| `undo_change` | Restore files to their state before a specific AI change. Uses shadow restore points. Does not affect git. |
+### Memory & RAG
+
+| Tool | Description |
+| -------------------------- | ----------------------------------------------------------------------------------------------------------------- |
+| `upsert_memory_node` | Create or update a memory node (concept, file, symbol, note) with auto-generated embeddings. |
+| `create_relation` | Create typed edges between nodes (relates_to, depends_on, implements, references, similar_to, contains). |
+| `search_memory_graph` | Semantic search with graph traversal — finds direct matches then walks 1st/2nd-degree neighbors. |
+| `prune_stale_links` | Remove decayed edges (e^(-λt) below threshold) and orphan nodes with low access counts. |
+| `add_interlinked_context` | Bulk-add nodes with auto-similarity linking (cosine ≥ 0.72 creates edges automatically). |
+| `retrieve_with_traversal` | Start from a node and walk outward — returns all reachable neighbors scored by decay and depth. |
+
## Setup
### Quick Start (npx / bunx)
@@ -125,9 +136,9 @@ npm run build
Three layers built with TypeScript over stdio using the Model Context Protocol SDK:
-**Core** (`src/core/`) - Multi-language AST parsing (tree-sitter, 43 extensions), gitignore-aware traversal, Ollama vector embeddings with disk cache, wikilink hub graph.
+**Core** (`src/core/`) - Multi-language AST parsing (tree-sitter, 43 extensions), gitignore-aware traversal, Ollama vector embeddings with disk cache, wikilink hub graph, in-memory property graph with decay scoring.
-**Tools** (`src/tools/`) - 11 MCP tools exposing structural, semantic, and operational capabilities.
+**Tools** (`src/tools/`) - 17 MCP tools exposing structural, semantic, operational, and memory graph capabilities.
**Git** (`src/git/`) - Shadow restore point system for undo without touching git history.
diff --git a/landing/src/app/api/instructions/route.ts b/landing/src/app/api/instructions/route.ts
index cb0db5f..6c771a2 100644
--- a/landing/src/app/api/instructions/route.ts
+++ b/landing/src/app/api/instructions/route.ts
@@ -28,16 +28,20 @@ The MCP server is built with TypeScript and communicates over stdio using the Mo
- \`static-analysis.ts\` - Native linter runner (tsc, eslint, py_compile, cargo check, go vet).
- \`propose-commit.ts\` - Code gatekeeper validating headers, FEATURE tag, no inline comments, nesting, file length.
- \`feature-hub.ts\` - Obsidian-style feature hub navigator with bundled skeleton views.
+- \`memory-tools.ts\` - Memory graph MCP wrappers (upsert, relate, search, prune, interlink, traverse).
+
+The memory graph is a **Retrieval-Augmented Generation (RAG)** system. Agents MUST use \`search_memory_graph\` at the start of every task to retrieve prior context, and persist learnings with \`upsert_memory_node\` and \`create_relation\` after completing work. This prevents redundant exploration and builds cumulative knowledge across sessions.
**Core Layer** (continued):
- \`hub.ts\` - Wikilink parser for \`[[path]]\` links, cross-link tags, hub discovery, orphan detection.
+- \`memory-graph.ts\` - In-memory property graph with JSON persistence, decay scoring, and auto-similarity edges.
**Git Layer** (\`src/git/\`):
- \`shadow.ts\` - Shadow restore point system for undo without touching git history.
-**Entry Point**: \`src/index.ts\` registers 11 MCP tools and starts the stdio transport. Accepts an optional CLI argument for the target project root directory (defaults to \`process.cwd()\`).
+**Entry Point**: \`src/index.ts\` registers 17 MCP tools and starts the stdio transport. Accepts an optional CLI argument for the target project root directory (defaults to \`process.cwd()\`).
## Environment Variables
@@ -138,6 +142,12 @@ Strict order within every file:
| \`list_restore_points\` | See undo history. |
| \`undo_change\` | Revert a bad AI change without touching git. |
| \`get_feature_hub\` | Browse feature graph hubs. Find orphaned files. |
+| \`upsert_memory_node\` | Create/update memory nodes (concept, file, symbol, note) with auto-embedding. |
+| \`create_relation\` | Create typed edges between memory nodes (depends_on, implements, etc.). |
+| \`search_memory_graph\` | Semantic search + graph traversal across 1st/2nd-degree neighbors. |
+| \`prune_stale_links\` | Remove decayed edges (e^(-λt)) and orphan nodes periodically. |
+| \`add_interlinked_context\` | Bulk-add nodes with auto-similarity linking (cosine ≥ 0.72). |
+| \`retrieve_with_traversal\` | Start from a node, walk outward, return scored neighbors by decay and depth. |
## Anti-Patterns to Avoid
diff --git a/landing/src/app/layout.tsx b/landing/src/app/layout.tsx
index 12f2b5d..5ff9e99 100644
--- a/landing/src/app/layout.tsx
+++ b/landing/src/app/layout.tsx
@@ -8,7 +8,7 @@ import "./globals.css";
export const metadata: Metadata = {
title: "Context+ // Semantic Intelligence for Large-Scale Engineering",
description:
- "MCP server designed for developers who demand 99% accuracy. Tree-sitter AST parsing, Spectral Clustering, and Obsidian-style linking.",
+ "MCP server designed for developers who demand 99% accuracy. RAG, Tree-sitter AST, Spectral Clustering, and Obsidian-style linking.",
icons: {
icon: "/icon.png",
},
diff --git a/landing/src/app/page.tsx b/landing/src/app/page.tsx
index eade757..8e9f959 100644
--- a/landing/src/app/page.tsx
+++ b/landing/src/app/page.tsx
@@ -158,7 +158,7 @@ export default async function Home() {
}}
>
Context+ is an MCP server designed for developers who demand 99%
- accuracy. By combining Tree-sitter AST parsing & Spectral
+ accuracy. By combining RAG, Tree-sitter AST & Spectral
Clustering, Context+ turns a massive codebase into a searchable,
hierarchical graph.
@@ -221,7 +221,7 @@ export default async function Home() {
Context+ guarantees minimal context bloat. It gives your agent deep
semantic understanding of your codebase, from AST parsing and symbol
navigation to blast radius analysis and commit validation. Nothing
- misses the context.
+ misses the context, RAG included.
Copy the instruction file into your project root to teach your agent
- fast execute mode, line-numbered symbol retrieval, strict formatting
- rules, and anti-patterns that keep context lean and precise.
+ RAG memory traversal & blast radius analysis that keep context
+ lean & precise. Or skip this step: newer versions of Context+
+ bundle the instructions by default.
diff --git a/landing/src/components/IsometricDiagram.tsx b/landing/src/components/IsometricDiagram.tsx
--- a/landing/src/components/IsometricDiagram.tsx
+++ b/landing/src/components/IsometricDiagram.tsx
- if (width >= 1800) return { cardSize: 360, stackDx: -28, stackDy: 28 };
- if (width >= 1400) return { cardSize: 300, stackDx: -24, stackDy: 24 };
- if (width >= 1025) return { cardSize: 250, stackDx: -20, stackDy: 20 };
- if (width >= 850) return { cardSize: 220, stackDx: -18, stackDy: 18 };
- if (width >= 500) return { cardSize: 240, stackDx: -18, stackDy: 18 };
- return { cardSize: 220, stackDx: -16, stackDy: 16 };
+ if (width >= 1800) return { cardSize: 360, stackDx: -24, stackDy: 24 };
+ if (width >= 1400) return { cardSize: 300, stackDx: -21, stackDy: 21 };
+ if (width >= 1025) return { cardSize: 250, stackDx: -18, stackDy: 18 };
+ if (width >= 850) return { cardSize: 220, stackDx: -16, stackDy: 16 };
+ if (width >= 500) return { cardSize: 240, stackDx: -16, stackDy: 16 };
+ return { cardSize: 220, stackDx: -14, stackDy: 14 };
}
export default function IsometricDiagram() {
diff --git a/landing/src/components/ToolDiagram.tsx b/landing/src/components/ToolDiagram.tsx
index c8db7e4..049cb63 100644
--- a/landing/src/components/ToolDiagram.tsx
+++ b/landing/src/components/ToolDiagram.tsx
@@ -314,6 +314,63 @@ export default function ToolDiagram() {
))}
+
+
+
+
+
+
+
+
+
+
+
+ RAG Functions
+
+
+
</>
);
}
diff --git a/src/core/memory-graph.ts b/src/core/memory-graph.ts
new file mode 100644
index 0000000..386b29b
--- /dev/null
+++ b/src/core/memory-graph.ts
@@ -0,0 +1,375 @@
+// In-memory property graph with JSON persistence for linking memory nodes
+// FEATURE: Memory Graph — traversal, decay scoring, auto-similarity edges
+
+import { readFile, writeFile } from "fs/promises";
+import { join } from "path";
+import { fetchEmbedding, ensureMcpDataDir } from "./embeddings.js";
+
+export type NodeType = "concept" | "file" | "symbol" | "note";
+export type RelationType = "relates_to" | "depends_on" | "implements" | "references" | "similar_to" | "contains";
+
+export interface MemoryNode {
+ id: string;
+ type: NodeType;
+ label: string;
+ content: string;
+ embedding: number[];
+ createdAt: number;
+ lastAccessed: number;
+ accessCount: number;
+ metadata: Record<string, string>;
+}
+
+export interface MemoryEdge {
+ id: string;
+ source: string;
+ target: string;
+ relation: RelationType;
+ weight: number;
+ createdAt: number;
+ metadata: Record<string, string>;
+}
+
+interface GraphStore {
+ nodes: Record<string, MemoryNode>;
+ edges: Record<string, MemoryEdge>;
+}
+
+export interface TraversalResult {
+ node: MemoryNode;
+ depth: number;
+ pathRelations: string[];
+ relevanceScore: number;
+}
+
+export interface GraphSearchResult {
+ direct: TraversalResult[];
+ neighbors: TraversalResult[];
+ totalNodes: number;
+ totalEdges: number;
+}
+
+const GRAPH_FILE = "memory-graph.json";
+const CACHE_DIR = ".mcp_data";
+const DECAY_LAMBDA = 0.05;
+const SIMILARITY_THRESHOLD = 0.72;
+const STALE_THRESHOLD = 0.15;
+
+let graphCache = new Map<string, GraphStore>();
+let savePending = new Map<string, boolean>();
+let saveTimeout = new Map<string, ReturnType<typeof setTimeout>>();
+
+function generateId(prefix: string): string {
+ return `${prefix}-${Date.now()}-${Math.random().toString(36).slice(2, 8)}`;
+}
+
+function cosine(a: number[], b: number[]): number {
+ const len = Math.min(a.length, b.length);
+ if (len === 0) return 0;
+ let dot = 0, normA = 0, normB = 0;
+ for (let i = 0; i < len; i++) {
+ dot += a[i] * b[i];
+ normA += a[i] * a[i];
+ normB += b[i] * b[i];
+ }
+ const denom = Math.sqrt(normA) * Math.sqrt(normB);
+ return denom === 0 ? 0 : dot / denom;
+}
+
+function decayWeight(edge: MemoryEdge): number {
+ const daysSinceCreation = (Date.now() - edge.createdAt) / 86_400_000;
+ return edge.weight * Math.exp(-DECAY_LAMBDA * daysSinceCreation);
+}
+
+async function loadGraph(rootDir: string): Promise<GraphStore> {
+ if (graphCache.has(rootDir)) return graphCache.get(rootDir)!;
+ try {
+ const raw = JSON.parse(await readFile(join(rootDir, CACHE_DIR, GRAPH_FILE), "utf-8"));
+ const store: GraphStore = {
+ nodes: raw?.nodes && typeof raw.nodes === "object" ? raw.nodes : {},
+ edges: raw?.edges && typeof raw.edges === "object" ? raw.edges : {},
+ };
+ graphCache.set(rootDir, store);
+ } catch {
+ graphCache.set(rootDir, { nodes: {}, edges: {} });
+ }
+ return graphCache.get(rootDir)!;
+}
+
+async function persistGraph(rootDir: string): Promise<void> {
+ const store = graphCache.get(rootDir);
+ if (!store) return;
+ await ensureMcpDataDir(rootDir);
+ await writeFile(join(rootDir, CACHE_DIR, GRAPH_FILE), JSON.stringify(store, null, 2));
+}
+
+function scheduleSave(rootDir: string): void {
+ const existing = saveTimeout.get(rootDir);
+ if (existing) clearTimeout(existing);
+ savePending.set(rootDir, true);
+ saveTimeout.set(rootDir, setTimeout(() => {
+ if (savePending.get(rootDir)) {
+ persistGraph(rootDir).catch(() => {}).finally(() => savePending.set(rootDir, false));
+ }
+ }, 500));
+}
+
+function getEdgesForNode(graph: GraphStore, nodeId: string): MemoryEdge[] {
+ return Object.values(graph.edges).filter(e => e.source === nodeId || e.target === nodeId);
+}
+
+function getNeighborId(edge: MemoryEdge, fromId: string): string {
+ return edge.source === fromId ? edge.target : edge.source;
+}
+
+export async function upsertNode(rootDir: string, type: NodeType, label: string, content: string, metadata?: Record<string, string>): Promise<MemoryNode> {
+ const graph = await loadGraph(rootDir);
+ const existing = Object.values(graph.nodes).find(n => n.label === label && n.type === type);
+
+ if (existing) {
+ existing.content = content;
+ existing.lastAccessed = Date.now();
+ existing.accessCount++;
+ if (metadata) Object.assign(existing.metadata, metadata);
+ existing.embedding = (await fetchEmbedding(`${label} ${content}`))[0];
+ scheduleSave(rootDir);
+ return existing;
+ }
+
+ const node: MemoryNode = {
+ id: generateId("mn"),
+ type,
+ label,
+ content,
+ embedding: (await fetchEmbedding(`${label} ${content}`))[0],
+ createdAt: Date.now(),
+ lastAccessed: Date.now(),
+ accessCount: 1,
+ metadata: metadata ?? {},
+ };
+ graph.nodes[node.id] = node;
+ scheduleSave(rootDir);
+ return node;
+}
+
+export async function createRelation(rootDir: string, sourceId: string, targetId: string, relation: RelationType, weight?: number, metadata?: Record<string, string>): Promise<MemoryEdge | null> {
+ const graph = await loadGraph(rootDir);
+ if (!graph.nodes[sourceId] || !graph.nodes[targetId]) return null;
+
+ const duplicate = Object.values(graph.edges).find(e =>
+ e.source === sourceId && e.target === targetId && e.relation === relation
+ );
+ if (duplicate) {
+ duplicate.weight = weight ?? duplicate.weight;
+ if (metadata) Object.assign(duplicate.metadata, metadata);
+ scheduleSave(rootDir);
+ return duplicate;
+ }
+
+ const edge: MemoryEdge = {
+ id: generateId("me"),
+ source: sourceId,
+ target: targetId,
+ relation,
+ weight: weight ?? 1.0,
+ createdAt: Date.now(),
+ metadata: metadata ?? {},
+ };
+ graph.edges[edge.id] = edge;
+ scheduleSave(rootDir);
+ return edge;
+}
+
+export async function searchGraph(rootDir: string, query: string, maxDepth: number = 1, topK: number = 5, edgeFilter?: RelationType[]): Promise<GraphSearchResult> {
+ const graph = await loadGraph(rootDir);
+ const nodes = Object.values(graph.nodes);
+ if (nodes.length === 0) return { direct: [], neighbors: [], totalNodes: 0, totalEdges: 0 };
+
+ const [queryVec] = await fetchEmbedding(query);
+ const scored = nodes.map(n => ({ node: n, score: cosine(queryVec, n.embedding) }))
+ .sort((a, b) => b.score - a.score);
+
+ const directHits = scored.slice(0, topK).map(({ node, score }) => {
+ node.lastAccessed = Date.now();
+ return {
+ node,
+ depth: 0,
+ pathRelations: [] as string[],
+ relevanceScore: Math.round(score * 1000) / 10,
+ };
+ });
+
+ const neighborResults: TraversalResult[] = [];
+ const visited = new Set(directHits.map(h => h.node.id));
+
+ for (const hit of directHits) {
+ traverseNeighbors(graph, hit.node.id, queryVec, 1, maxDepth, [hit.node.label], visited, neighborResults, edgeFilter);
+ }
+
+ neighborResults.sort((a, b) => b.relevanceScore - a.relevanceScore);
+
+ scheduleSave(rootDir);
+ return {
+ direct: directHits,
+ neighbors: neighborResults.slice(0, topK * 2),
+ totalNodes: nodes.length,
+ totalEdges: Object.keys(graph.edges).length,
+ };
+}
+
+function traverseNeighbors(
+ graph: GraphStore, nodeId: string, queryVec: number[], depth: number, maxDepth: number,
+ pathLabels: string[], visited: Set<string>, results: TraversalResult[], edgeFilter?: RelationType[],
+): void {
+ if (depth > maxDepth) return;
+
+ for (const edge of getEdgesForNode(graph, nodeId)) {
+ if (edgeFilter && !edgeFilter.includes(edge.relation)) continue;
+ const neighborId = getNeighborId(edge, nodeId);
+ if (visited.has(neighborId)) continue;
+
+ const neighbor = graph.nodes[neighborId];
+ if (!neighbor) continue;
+
+ visited.add(neighborId);
+ const similarity = cosine(queryVec, neighbor.embedding);
+ const edgeDecay = decayWeight(edge);
+ const relevance = similarity * 0.6 + (edgeDecay / Math.max(edge.weight, 0.01)) * 0.4;
+
+ results.push({
+ node: neighbor,
+ depth,
+ pathRelations: [...pathLabels, `--[${edge.relation}]-->`, neighbor.label],
+ relevanceScore: Math.round(relevance * 1000) / 10,
+ });
+
+ neighbor.lastAccessed = Date.now();
+ traverseNeighbors(graph, neighborId, queryVec, depth + 1, maxDepth, [...pathLabels, `--[${edge.relation}]-->`, neighbor.label], visited, results, edgeFilter);
+ }
+}
+
+export async function pruneStaleLinks(rootDir: string, threshold?: number): Promise<{ removed: number; remaining: number }> {
+ const graph = await loadGraph(rootDir);
+ const cutoff = threshold ?? STALE_THRESHOLD;
+ const toRemove: string[] = [];
+
+ for (const [edgeId, edge] of Object.entries(graph.edges)) {
+ if (decayWeight(edge) < cutoff) toRemove.push(edgeId);
+ }
+
+ for (const id of toRemove) delete graph.edges[id];
+
+ const orphanNodeIds = Object.keys(graph.nodes).filter(nodeId =>
+ getEdgesForNode(graph, nodeId).length === 0
+ && graph.nodes[nodeId].accessCount <= 1
+ && (Date.now() - graph.nodes[nodeId].lastAccessed) > 7 * 86_400_000
+ );
+ for (const id of orphanNodeIds) delete graph.nodes[id];
+
+ scheduleSave(rootDir);
+ return { removed: toRemove.length + orphanNodeIds.length, remaining: Object.keys(graph.edges).length };
+}
+
+export async function addInterlinkedContext(rootDir: string, items: Array<{ type: NodeType; label: string; content: string; metadata?: Record<string, string> }>, autoLink: boolean = true): Promise<{ nodes: MemoryNode[]; edges: MemoryEdge[] }> {
+ const createdNodes: MemoryNode[] = [];
+ for (const item of items) {
+ createdNodes.push(await upsertNode(rootDir, item.type, item.label, item.content, item.metadata));
+ }
+
+ const createdEdges: MemoryEdge[] = [];
+
+ if (autoLink && createdNodes.length > 1) {
+ for (let i = 0; i < createdNodes.length; i++) {
+ for (let j = i + 1; j < createdNodes.length; j++) {
+ const similarity = cosine(createdNodes[i].embedding, createdNodes[j].embedding);
+ if (similarity >= SIMILARITY_THRESHOLD) {
+ const edge = await createRelation(rootDir, createdNodes[i].id, createdNodes[j].id, "similar_to", similarity);
+ if (edge) createdEdges.push(edge);
+ }
+ }
+ }
+ }
+
+ const graph = await loadGraph(rootDir);
+ const existingNodes = Object.values(graph.nodes)
+ .filter(n => !createdNodes.find(cn => cn.id === n.id))
+ .slice(0, 200);
+ if (autoLink) {
+ for (const newNode of createdNodes) {
+ for (const existing of existingNodes) {
+ const similarity = cosine(newNode.embedding, existing.embedding);
+ if (similarity >= SIMILARITY_THRESHOLD) {
+ const edge = await createRelation(rootDir, newNode.id, existing.id, "similar_to", similarity);
+ if (edge) createdEdges.push(edge);
+ }
+ }
+ }
+ }
+
+ return { nodes: createdNodes, edges: createdEdges };
+}
+
+export async function retrieveWithTraversal(rootDir: string, startNodeId: string, maxDepth: number = 2, edgeFilter?: RelationType[]): Promise<TraversalResult[]> {
+ const graph = await loadGraph(rootDir);
+ const startNode = graph.nodes[startNodeId];
+ if (!startNode) return [];
+
+ startNode.lastAccessed = Date.now();
+ startNode.accessCount++;
+
+ const results: TraversalResult[] = [{
+ node: startNode,
+ depth: 0,
+ pathRelations: [startNode.label],
+ relevanceScore: 100,
+ }];
+
+ const visited = new Set([startNodeId]);
+ collectTraversal(graph, startNodeId, 1, maxDepth, [startNode.label], visited, results, edgeFilter);
+
+ scheduleSave(rootDir);
+ return results;
+}
+
+function collectTraversal(
+ graph: GraphStore, nodeId: string, depth: number, maxDepth: number,
+ pathLabels: string[], visited: Set<string>, results: TraversalResult[], edgeFilter?: RelationType[],
+): void {
+ if (depth > maxDepth) return;
+
+ for (const edge of getEdgesForNode(graph, nodeId)) {
+ if (edgeFilter && !edgeFilter.includes(edge.relation)) continue;
+ const neighborId = getNeighborId(edge, nodeId);
+ if (visited.has(neighborId)) continue;
+
+ const neighbor = graph.nodes[neighborId];
+ if (!neighbor) continue;
+
+ visited.add(neighborId);
+ neighbor.lastAccessed = Date.now();
+
+ const decayed = decayWeight(edge);
+ const depthPenalty = 1 / (1 + depth * 0.3);
+ const score = decayed * depthPenalty * 100;
+
+ results.push({
+ node: neighbor,
+ depth,
+ pathRelations: [...pathLabels, `--[${edge.relation}]-->`, neighbor.label],
+ relevanceScore: Math.round(score * 10) / 10,
+ });
+
+ collectTraversal(graph, neighborId, depth + 1, maxDepth, [...pathLabels, `--[${edge.relation}]-->`, neighbor.label], visited, results, edgeFilter);
+ }
+}
+
+export async function getGraphStats(rootDir: string): Promise<{ nodes: number; edges: number; types: Record<string, number>; relations: Record<string, number> }> {
+ const graph = await loadGraph(rootDir);
+ const types: Record<string, number> = {};
+ const relations: Record<string, number> = {};
+
+ for (const node of Object.values(graph.nodes)) types[node.type] = (types[node.type] ?? 0) + 1;
+ for (const edge of Object.values(graph.edges)) relations[edge.relation] = (relations[edge.relation] ?? 0) + 1;
+
+ return { nodes: Object.keys(graph.nodes).length, edges: Object.keys(graph.edges).length, types, relations };
+}
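The decay math above is worth seeing in isolation: edges lose weight as weight · e^(−λt) with λ = 0.05 per day, and `prune_stale_links` drops any edge whose decayed weight falls below 0.15. A minimal sketch using the module's constants; `decayWeight(weight, ageDays)` and `isStale` are illustrative reformulations, not the module's actual signatures (the real `decayWeight` takes a `MemoryEdge`).

```typescript
// Edge decay sketch: weight * e^(-lambda * ageDays), pruned below 0.15.
const DECAY_LAMBDA = 0.05;     // per-day decay rate (memory-graph.ts)
const STALE_THRESHOLD = 0.15;  // prune_stale_links default cutoff

function decayWeight(weight: number, ageDays: number): number {
  return weight * Math.exp(-DECAY_LAMBDA * ageDays);
}

function isStale(weight: number, ageDays: number): boolean {
  return decayWeight(weight, ageDays) < STALE_THRESHOLD;
}
```

Under these constants an untouched full-weight edge crosses the stale threshold between day 37 (e^−1.85 ≈ 0.157) and day 38 (e^−1.9 ≈ 0.150), so a graph that is never reinforced empties itself in about five weeks.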
diff --git a/src/index.ts b/src/index.ts
index 79fab22..a1fbf2c 100644
--- a/src/index.ts
+++ b/src/index.ts
@@ -20,6 +20,7 @@ import { proposeCommit } from "./tools/propose-commit.js";
import { listRestorePoints, restorePoint } from "./git/shadow.js";
import { semanticNavigate } from "./tools/semantic-navigate.js";
import { getFeatureHub } from "./tools/feature-hub.js";
+import { toolUpsertMemoryNode, toolCreateRelation, toolSearchMemoryGraph, toolPruneStaleLinks, toolAddInterlinkedContext, toolRetrieveWithTraversal } from "./tools/memory-tools.js";
type AgentTarget = "claude" | "cursor" | "vscode" | "windsurf" | "opencode";
@@ -392,6 +393,117 @@ server.tool(
}),
);
+server.tool(
+ "upsert_memory_node",
+ "Create or update a memory node in the linking graph. Nodes represent concepts, files, symbols, or notes with auto-generated embeddings. " +
+ "If a node with the same label and type exists, it updates content and increments access count. Returns the node ID for use in create_relation.",
+ {
+ type: z.enum(["concept", "file", "symbol", "note"]).describe("Node type: concept (abstract ideas), file (source files), symbol (functions/classes), note (free-form)."),
+ label: z.string().describe("Short identifier for the node. Used for deduplication with type."),
+ content: z.string().describe("Detailed content for the node. Used for embedding generation."),
+ metadata: z.record(z.string()).optional().describe("Optional key-value metadata pairs."),
+ },
+ async ({ type, label, content, metadata }) => ({
+ content: [{
+ type: "text" as const,
+ text: await toolUpsertMemoryNode({ rootDir: ROOT_DIR, type, label, content, metadata }),
+ }],
+ }),
+);
+
+server.tool(
+ "create_relation",
+ "Create a typed edge between two memory nodes. Supports relation types: relates_to, depends_on, implements, references, similar_to, contains. " +
+ "Edges have weights (0-1) that decay over time via e^(-λt). Duplicate edges update weight instead of creating new ones.",
+ {
+ source_id: z.string().describe("ID of the source memory node."),
+ target_id: z.string().describe("ID of the target memory node."),
+ relation: z.enum(["relates_to", "depends_on", "implements", "references", "similar_to", "contains"]).describe("Relationship type between nodes."),
+ weight: z.number().optional().describe("Edge weight 0-1. Higher = stronger relationship. Default: 1.0."),
+ metadata: z.record(z.string()).optional().describe("Optional key-value metadata for the edge."),
+ },
+ async ({ source_id, target_id, relation, weight, metadata }) => ({
+ content: [{
+ type: "text" as const,
+ text: await toolCreateRelation({ rootDir: ROOT_DIR, sourceId: source_id, targetId: target_id, relation, weight, metadata }),
+ }],
+ }),
+);
+
+server.tool(
+ "search_memory_graph",
+ "Search the memory graph by meaning with graph traversal. First finds direct matches via embedding similarity, " +
+ "then traverses 1st/2nd-degree neighbors to discover linked context. Returns both direct hits and graph-connected neighbors with relevance scores.",
+ {
+ query: z.string().describe("Natural language query to search the memory graph."),
+ max_depth: z.number().optional().describe("How many hops to traverse from direct matches. Default: 1."),
+ top_k: z.number().optional().describe("Number of direct matches to return. Default: 5."),
+ edge_filter: z.array(z.enum(["relates_to", "depends_on", "implements", "references", "similar_to", "contains"])).optional()
+ .describe("Only traverse edges of these types. Omit for all types."),
+ },
+ async ({ query, max_depth, top_k, edge_filter }) => ({
+ content: [{
+ type: "text" as const,
+ text: await toolSearchMemoryGraph({ rootDir: ROOT_DIR, query, maxDepth: max_depth, topK: top_k, edgeFilter: edge_filter }),
+ }],
+ }),
+);
+
+server.tool(
+ "prune_stale_links",
+ "Remove stale memory graph edges whose weight has decayed below threshold via e^(-λt) formula. " +
+ "Also removes orphan nodes with no edges, low access count, and >7 days since last access. Keeps the graph lean.",
+ {
+ threshold: z.number().optional().describe("Minimum decayed weight to keep an edge. Default: 0.15. Lower = keep more edges."),
+ },
+ async ({ threshold }) => ({
+ content: [{
+ type: "text" as const,
+ text: await toolPruneStaleLinks({ rootDir: ROOT_DIR, threshold }),
+ }],
+ }),
+);
+
+server.tool(
+ "add_interlinked_context",
+ "Bulk-add multiple memory nodes with automatic similarity linking. Computes embeddings for all items, " +
+ "then creates similarity edges between any pair (new-to-new and new-to-existing) with cosine similarity ≥ 0.72. " +
+ "Ideal for importing related concepts, files, or notes at once.",
+ {
+ items: z.array(z.object({
+ type: z.enum(["concept", "file", "symbol", "note"]),
+ label: z.string(),
+ content: z.string(),
+ metadata: z.record(z.string()).optional(),
+ })).describe("Array of nodes to add. Each needs type, label, and content."),
+ auto_link: z.boolean().optional().describe("Whether to auto-create similarity edges. Default: true."),
+ },
+ async ({ items, auto_link }) => ({
+ content: [{
+ type: "text" as const,
+ text: await toolAddInterlinkedContext({ rootDir: ROOT_DIR, items, autoLink: auto_link }),
+ }],
+ }),
+);
+
+server.tool(
+ "retrieve_with_traversal",
+ "Start from a specific memory node and traverse the graph outward. Returns the starting node plus all reachable neighbors " +
+ "within the depth limit, scored by edge weight decay and depth penalty. Use after search_memory_graph to explore a specific node's neighborhood.",
+ {
+ start_node_id: z.string().describe("ID of the memory node to start traversal from."),
+ max_depth: z.number().optional().describe("Maximum traversal depth from start node. Default: 2."),
+ edge_filter: z.array(z.enum(["relates_to", "depends_on", "implements", "references", "similar_to", "contains"])).optional()
+ .describe("Only traverse edges of these types. Omit for all."),
+ },
+ async ({ start_node_id, max_depth, edge_filter }) => ({
+ content: [{
+ type: "text" as const,
+ text: await toolRetrieveWithTraversal({ rootDir: ROOT_DIR, startNodeId: start_node_id, maxDepth: max_depth, edgeFilter: edge_filter }),
+ }],
+ }),
+);
+
async function main() {
const args = process.argv.slice(2);
if (args[0] === "init") {
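Two distinct scoring formulas back the tools registered above: `search_memory_graph` blends embedding similarity (60%) with edge freshness (40%), while `retrieve_with_traversal` multiplies decayed edge weight by a depth penalty of 1/(1 + 0.3·depth). A self-contained sketch of both, with illustrative helper names (the real expressions live in `src/core/memory-graph.ts`):

```typescript
// search_memory_graph neighbor score: 60% cosine similarity,
// 40% edge freshness (decayed weight / original weight).
function searchRelevance(similarity: number, decayed: number, weight: number): number {
  const relevance = similarity * 0.6 + (decayed / Math.max(weight, 0.01)) * 0.4;
  return Math.round(relevance * 1000) / 10; // 0-100 scale, one decimal
}

// retrieve_with_traversal score: decayed edge weight shrunk by hop depth.
function traversalScore(decayed: number, depth: number): number {
  const depthPenalty = 1 / (1 + depth * 0.3);
  return Math.round(decayed * depthPenalty * 100 * 10) / 10;
}
```

A perfectly similar neighbor on a fresh edge scores 100, while the same fresh edge two hops out drops to 62.5 purely from the depth penalty.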
diff --git a/src/tools/memory-tools.ts b/src/tools/memory-tools.ts
new file mode 100644
index 0000000..925ef3c
--- /dev/null
+++ b/src/tools/memory-tools.ts
@@ -0,0 +1,142 @@
+// MCP tool wrappers for memory graph operations and interlinked RAG
+// FEATURE: Memory Tools — upsert, relate, search, prune, interlink, traverse
+
+import type { NodeType, RelationType, TraversalResult } from "../core/memory-graph.js";
+import { upsertNode, createRelation, searchGraph, pruneStaleLinks, addInterlinkedContext, retrieveWithTraversal, getGraphStats } from "../core/memory-graph.js";
+
+export interface UpsertMemoryNodeOptions {
+ rootDir: string;
+ type: NodeType;
+ label: string;
+ content: string;
+ metadata?: Record<string, string>;
+}
+
+export interface CreateRelationOptions {
+ rootDir: string;
+ sourceId: string;
+ targetId: string;
+ relation: RelationType;
+ weight?: number;
+ metadata?: Record<string, string>;
+}
+
+export interface SearchMemoryGraphOptions {
+ rootDir: string;
+ query: string;
+ maxDepth?: number;
+ topK?: number;
+ edgeFilter?: RelationType[];
+}
+
+export interface PruneStaleLinksOptions {
+ rootDir: string;
+ threshold?: number;
+}
+
+export interface AddInterlinkedContextOptions {
+ rootDir: string;
+ items: Array<{ type: NodeType; label: string; content: string; metadata?: Record<string, string> }>;
+ autoLink?: boolean;
+}
+
+export interface RetrieveWithTraversalOptions {
+ rootDir: string;
+ startNodeId: string;
+ maxDepth?: number;
+ edgeFilter?: RelationType[];
+}
+
+function formatTraversalResult(result: TraversalResult): string {
+ return [
+ ` [${result.node.type}] ${result.node.label} (depth: ${result.depth}, score: ${result.relevanceScore})`,
+ ` Content: ${result.node.content.slice(0, 120)}${result.node.content.length > 120 ? "..." : ""}`,
+ result.pathRelations.length > 1 ? ` Path: ${result.pathRelations.join(" ")}` : "",
+ ` ID: ${result.node.id} | Accessed: ${result.node.accessCount}x`,
+ ].filter(Boolean).join("\n");
+}
+
+export async function toolUpsertMemoryNode(options: UpsertMemoryNodeOptions): Promise<string> {
+ const node = await upsertNode(options.rootDir, options.type, options.label, options.content, options.metadata);
+ const stats = await getGraphStats(options.rootDir);
+ return [
+ `✅ Memory node upserted: ${node.label}`,
+ ` ID: ${node.id}`,
+ ` Type: ${node.type}`,
+ ` Access count: ${node.accessCount}`,
+ `\nGraph: ${stats.nodes} nodes, ${stats.edges} edges`,
+ ].join("\n");
+}
+
+export async function toolCreateRelation(options: CreateRelationOptions): Promise<string> {
+ const edge = await createRelation(options.rootDir, options.sourceId, options.targetId, options.relation, options.weight, options.metadata);
+ if (!edge) return `❌ Failed: one or both node IDs not found (source: ${options.sourceId}, target: ${options.targetId})`;
+
+ const stats = await getGraphStats(options.rootDir);
+ return [
+ `✅ Relation created: ${options.sourceId} --[${edge.relation}]--> ${options.targetId}`,
+ ` Edge ID: ${edge.id}`,
+ ` Weight: ${edge.weight}`,
+ `\nGraph: ${stats.nodes} nodes, ${stats.edges} edges`,
+ ].join("\n");
+}
+
+export async function toolSearchMemoryGraph(options: SearchMemoryGraphOptions): Promise<string> {
+ const result = await searchGraph(options.rootDir, options.query, options.maxDepth, options.topK, options.edgeFilter);
+ if (result.direct.length === 0) return `No memory nodes found for: "${options.query}"\nGraph has ${result.totalNodes} nodes, ${result.totalEdges} edges.`;
+
+ const sections: string[] = [`Memory Graph Search: "${options.query}"`, `Graph: ${result.totalNodes} nodes, ${result.totalEdges} edges\n`];
+
+ sections.push("Direct Matches:");
+ for (const hit of result.direct) sections.push(formatTraversalResult(hit));
+
+ if (result.neighbors.length > 0) {
+ sections.push("\nLinked Neighbors:");
+ for (const neighbor of result.neighbors) sections.push(formatTraversalResult(neighbor));
+ }
+
+ return sections.join("\n");
+}
+
+export async function toolPruneStaleLinks(options: PruneStaleLinksOptions): Promise<string> {
+ const result = await pruneStaleLinks(options.rootDir, options.threshold);
+ return [
+ `🧹 Pruning complete`,
+ ` Removed: ${result.removed} stale links/orphan nodes`,
+ ` Remaining edges: ${result.remaining}`,
+ ].join("\n");
+}
+
+export async function toolAddInterlinkedContext(options: AddInterlinkedContextOptions): Promise<string> {
+ const result = await addInterlinkedContext(options.rootDir, options.items, options.autoLink);
+ const sections = [
+ `✅ Added ${result.nodes.length} interlinked nodes`,
+ result.edges.length > 0 ? ` Auto-linked: ${result.edges.length} similarity edges (threshold ≥ 0.72)` : " No auto-links above threshold",
+ "\nNodes:",
+ ];
+
+ for (const node of result.nodes) {
+ sections.push(` [${node.type}] ${node.label} → ${node.id}`);
+ }
+
+ if (result.edges.length > 0) {
+ sections.push("\nEdges:");
+ for (const edge of result.edges) {
+ sections.push(` ${edge.source} --[${edge.relation} w:${Math.round(edge.weight * 100) / 100}]--> ${edge.target}`);
+ }
+ }
+
+ const stats = await getGraphStats(options.rootDir);
+ sections.push(`\nGraph total: ${stats.nodes} nodes, ${stats.edges} edges`);
+ return sections.join("\n");
+}
+
+export async function toolRetrieveWithTraversal(options: RetrieveWithTraversalOptions): Promise<string> {
+ const results = await retrieveWithTraversal(options.rootDir, options.startNodeId, options.maxDepth, options.edgeFilter);
+ if (results.length === 0) return `❌ Node not found: ${options.startNodeId}`;
+
+ const sections = [`Traversal from: ${results[0].node.label} (depth limit: ${options.maxDepth ?? 2})\n`];
+ for (const result of results) sections.push(formatTraversalResult(result));
+
+ return sections.join("\n");
+}
diff --git a/test/main/memory-graph.test.mjs b/test/main/memory-graph.test.mjs
new file mode 100644
index 0000000..7097526
--- /dev/null
+++ b/test/main/memory-graph.test.mjs
@@ -0,0 +1,431 @@
+import { describe, it, before, after } from "node:test";
+import assert from "node:assert/strict";
+import { mkdir, rm } from "fs/promises";
+import { join, resolve } from "path";
+import { Ollama } from "ollama";
+
+const {
+ upsertNode,
+ createRelation,
+ searchGraph,
+ pruneStaleLinks,
+ addInterlinkedContext,
+ retrieveWithTraversal,
+ getGraphStats,
+} = await import("../../build/core/memory-graph.js");
+
+const {
+ toolUpsertMemoryNode,
+ toolCreateRelation,
+ toolSearchMemoryGraph,
+ toolPruneStaleLinks,
+ toolAddInterlinkedContext,
+ toolRetrieveWithTraversal,
+} = await import("../../build/tools/memory-tools.js");
+
+const FIXTURE = resolve("test/_memory_graph_fixtures");
+let embedCounter = 0;
+
+function mockEmbedding() {
+ embedCounter = 0;
+ const original = Ollama.prototype.embed;
+ Ollama.prototype.embed = async function ({ input }) {
+ const batch = Array.isArray(input) ? input : [input];
+ return {
+ embeddings: batch.map((text) => {
+ embedCounter++;
+ const vec = new Array(64).fill(0);
+ for (let i = 0; i < Math.min(text.length, 64); i++) {
+ vec[i] = (text.charCodeAt(i) % 100) / 100;
+ }
+ const norm = Math.sqrt(vec.reduce((s, v) => s + v * v, 0));
+ return norm > 0 ? vec.map((v) => v / norm) : vec;
+ }),
+ };
+ };
+ return () => { Ollama.prototype.embed = original; };
+}
+
+before(async () => {
+ await rm(FIXTURE, { recursive: true, force: true });
+ await mkdir(join(FIXTURE, ".mcp_data"), { recursive: true });
+});
+
+after(async () => {
+ await rm(FIXTURE, { recursive: true, force: true });
+});
+
+describe("memory-graph core", () => {
+ describe("upsertNode", () => {
+ it("creates a new node with embedding", async () => {
+ const restore = mockEmbedding();
+ try {
+ const node = await upsertNode(FIXTURE, "concept", "Auth Flow", "Handles user login and session management");
+ assert.ok(node.id.startsWith("mn-"));
+ assert.equal(node.type, "concept");
+ assert.equal(node.label, "Auth Flow");
+ assert.equal(node.accessCount, 1);
+ assert.ok(node.embedding.length > 0);
+ } finally {
+ restore();
+ }
+ });
+
+ it("updates existing node with same label+type", async () => {
+ const restore = mockEmbedding();
+ try {
+ const first = await upsertNode(FIXTURE, "note", "Test Note", "Original content");
+ const second = await upsertNode(FIXTURE, "note", "Test Note", "Updated content");
+ assert.equal(first.id, second.id);
+ assert.equal(second.content, "Updated content");
+ assert.equal(second.accessCount, 2);
+ } finally {
+ restore();
+ }
+ });
+
+ it("stores metadata on the node", async () => {
+ const restore = mockEmbedding();
+ try {
+ const node = await upsertNode(FIXTURE, "file", "config.ts", "Configuration loader", { language: "typescript" });
+ assert.equal(node.metadata.language, "typescript");
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("createRelation", () => {
+ it("creates edge between existing nodes", async () => {
+ const restore = mockEmbedding();
+ try {
+ const a = await upsertNode(FIXTURE, "concept", "Edge A", "Source concept");
+ const b = await upsertNode(FIXTURE, "concept", "Edge B", "Target concept");
+ const edge = await createRelation(FIXTURE, a.id, b.id, "relates_to", 0.9);
+ assert.ok(edge);
+ assert.ok(edge.id.startsWith("me-"));
+ assert.equal(edge.relation, "relates_to");
+ assert.equal(edge.weight, 0.9);
+ } finally {
+ restore();
+ }
+ });
+
+ it("returns null for nonexistent node IDs", async () => {
+ const restore = mockEmbedding();
+ try {
+ const edge = await createRelation(FIXTURE, "fake-id-1", "fake-id-2", "depends_on");
+ assert.equal(edge, null);
+ } finally {
+ restore();
+ }
+ });
+
+ it("updates duplicate edge weight instead of creating new", async () => {
+ const restore = mockEmbedding();
+ try {
+ const a = await upsertNode(FIXTURE, "symbol", "Dup A", "Function A");
+ const b = await upsertNode(FIXTURE, "symbol", "Dup B", "Function B");
+ const first = await createRelation(FIXTURE, a.id, b.id, "references", 0.5);
+ const second = await createRelation(FIXTURE, a.id, b.id, "references", 0.95);
+ assert.equal(first.id, second.id);
+ assert.equal(second.weight, 0.95);
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("searchGraph", () => {
+ it("returns results ranked by embedding similarity", async () => {
+ const restore = mockEmbedding();
+ try {
+ await upsertNode(FIXTURE, "concept", "Search Target", "Authentication and login");
+ const result = await searchGraph(FIXTURE, "authentication login", 0, 3);
+ assert.ok(result.direct.length > 0);
+ assert.ok(result.totalNodes > 0);
+ } finally {
+ restore();
+ }
+ });
+
+ it("returns empty for empty graph in fresh dir", async () => {
+ const emptyDir = resolve("test/_memory_empty");
+ await mkdir(join(emptyDir, ".mcp_data"), { recursive: true });
+ const restore = mockEmbedding();
+ try {
+ const result = await searchGraph(emptyDir, "anything", 1, 5);
+ assert.equal(result.direct.length, 0);
+ assert.equal(result.neighbors.length, 0);
+ } finally {
+ restore();
+ await rm(emptyDir, { recursive: true, force: true });
+ }
+ });
+
+ it("includes neighbors at depth 1", async () => {
+ const restore = mockEmbedding();
+ try {
+ const a = await upsertNode(FIXTURE, "concept", "Nav Root", "Root navigation");
+ const b = await upsertNode(FIXTURE, "concept", "Nav Child", "Child navigation link");
+ await createRelation(FIXTURE, a.id, b.id, "contains");
+ const result = await searchGraph(FIXTURE, "Nav Root navigation", 1, 1);
+ assert.ok(result.direct.length > 0 || result.neighbors.length > 0);
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("pruneStaleLinks", () => {
+ it("removes edges with decayed weight below threshold", async () => {
+ const restore = mockEmbedding();
+ try {
+ const a = await upsertNode(FIXTURE, "note", "Prune A", "Will be pruned");
+ const b = await upsertNode(FIXTURE, "note", "Prune B", "Will be pruned too");
+ const edge = await createRelation(FIXTURE, a.id, b.id, "relates_to", 0.01);
+ assert.ok(edge);
+ const result = await pruneStaleLinks(FIXTURE, 0.5);
+ assert.ok(result.removed >= 0);
+ assert.ok(typeof result.remaining === "number");
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("addInterlinkedContext", () => {
+ it("creates multiple nodes with auto-linking", async () => {
+ const restore = mockEmbedding();
+ try {
+ const result = await addInterlinkedContext(FIXTURE, [
+ { type: "concept", label: "Interlink A", content: "First interlinked concept about testing" },
+ { type: "concept", label: "Interlink B", content: "Second interlinked concept about testing" },
+ { type: "note", label: "Interlink Note", content: "A note about testing concepts" },
+ ], true);
+ assert.equal(result.nodes.length, 3);
+ assert.ok(Array.isArray(result.edges));
+ } finally {
+ restore();
+ }
+ });
+
+ it("skips auto-linking when disabled", async () => {
+ const restore = mockEmbedding();
+ try {
+ const result = await addInterlinkedContext(FIXTURE, [
+ { type: "concept", label: "No Link A", content: "Should not auto link" },
+ { type: "concept", label: "No Link B", content: "Should not auto link either" },
+ ], false);
+ assert.equal(result.nodes.length, 2);
+ assert.equal(result.edges.length, 0);
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("retrieveWithTraversal", () => {
+ it("returns start node and connected neighbors", async () => {
+ const restore = mockEmbedding();
+ try {
+ const root = await upsertNode(FIXTURE, "concept", "Traversal Root", "Starting point");
+ const child1 = await upsertNode(FIXTURE, "symbol", "Traversal Child 1", "First child");
+ const child2 = await upsertNode(FIXTURE, "symbol", "Traversal Child 2", "Second child");
+ await createRelation(FIXTURE, root.id, child1.id, "contains");
+ await createRelation(FIXTURE, root.id, child2.id, "contains");
+
+ const results = await retrieveWithTraversal(FIXTURE, root.id, 1);
+ assert.ok(results.length >= 1);
+ assert.equal(results[0].node.id, root.id);
+ assert.equal(results[0].depth, 0);
+ } finally {
+ restore();
+ }
+ });
+
+ it("returns empty for nonexistent node", async () => {
+ const restore = mockEmbedding();
+ try {
+ const results = await retrieveWithTraversal(FIXTURE, "nonexistent-id", 2);
+ assert.equal(results.length, 0);
+ } finally {
+ restore();
+ }
+ });
+
+ it("respects edge filter", async () => {
+ const restore = mockEmbedding();
+ try {
+ const a = await upsertNode(FIXTURE, "concept", "Filter Root", "Root for filtering");
+ const b = await upsertNode(FIXTURE, "symbol", "Filter Dep", "Dependency target");
+ const c = await upsertNode(FIXTURE, "note", "Filter Ref", "Reference target");
+ await createRelation(FIXTURE, a.id, b.id, "depends_on");
+ await createRelation(FIXTURE, a.id, c.id, "references");
+
+ const filtered = await retrieveWithTraversal(FIXTURE, a.id, 1, ["depends_on"]);
+ const depNodes = filtered.filter((r) => r.depth > 0);
+ for (const r of depNodes) {
+ assert.ok(r.pathRelations.some((p) => p.includes("depends_on")));
+ }
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("getGraphStats", () => {
+ it("returns node and edge counts with type breakdown", async () => {
+ const restore = mockEmbedding();
+ try {
+ const stats = await getGraphStats(FIXTURE);
+ assert.ok(typeof stats.nodes === "number");
+ assert.ok(typeof stats.edges === "number");
+ assert.ok(typeof stats.types === "object");
+ assert.ok(typeof stats.relations === "object");
+ } finally {
+ restore();
+ }
+ });
+ });
+});
+
+describe("memory-tools MCP wrappers", () => {
+ describe("toolUpsertMemoryNode", () => {
+ it("returns formatted success message with node ID", async () => {
+ const restore = mockEmbedding();
+ try {
+ const output = await toolUpsertMemoryNode({
+ rootDir: FIXTURE,
+ type: "concept",
+ label: "MCP Test Node",
+ content: "Testing the MCP wrapper",
+ });
+ assert.ok(output.includes("✅"));
+ assert.ok(output.includes("MCP Test Node"));
+ assert.ok(output.includes("mn-"));
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("toolCreateRelation", () => {
+ it("returns success for valid node IDs", async () => {
+ const restore = mockEmbedding();
+ try {
+ const a = await upsertNode(FIXTURE, "concept", "Rel MCP A", "A node");
+ const b = await upsertNode(FIXTURE, "concept", "Rel MCP B", "B node");
+ const output = await toolCreateRelation({
+ rootDir: FIXTURE,
+ sourceId: a.id,
+ targetId: b.id,
+ relation: "implements",
+ });
+ assert.ok(output.includes("✅"));
+ assert.ok(output.includes("implements"));
+ } finally {
+ restore();
+ }
+ });
+
+ it("returns error for invalid node IDs", async () => {
+ const restore = mockEmbedding();
+ try {
+ const output = await toolCreateRelation({
+ rootDir: FIXTURE,
+ sourceId: "bad-1",
+ targetId: "bad-2",
+ relation: "relates_to",
+ });
+ assert.ok(output.includes("❌"));
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("toolSearchMemoryGraph", () => {
+ it("returns formatted search results", async () => {
+ const restore = mockEmbedding();
+ try {
+ const output = await toolSearchMemoryGraph({
+ rootDir: FIXTURE,
+ query: "testing concepts",
+ maxDepth: 1,
+ topK: 3,
+ });
+ assert.ok(typeof output === "string");
+ assert.ok(output.length > 0);
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("toolPruneStaleLinks", () => {
+ it("returns pruning summary", async () => {
+ const restore = mockEmbedding();
+ try {
+ const output = await toolPruneStaleLinks({ rootDir: FIXTURE, threshold: 0.99 });
+ assert.ok(output.includes("🧹"));
+ assert.ok(output.includes("Removed"));
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("toolAddInterlinkedContext", () => {
+ it("returns formatted bulk-add results", async () => {
+ const restore = mockEmbedding();
+ try {
+ const output = await toolAddInterlinkedContext({
+ rootDir: FIXTURE,
+ items: [
+ { type: "note", label: "Bulk A", content: "First bulk item" },
+ { type: "note", label: "Bulk B", content: "Second bulk item" },
+ ],
+ autoLink: true,
+ });
+ assert.ok(output.includes("✅"));
+ assert.ok(output.includes("Bulk A"));
+ assert.ok(output.includes("Bulk B"));
+ } finally {
+ restore();
+ }
+ });
+ });
+
+ describe("toolRetrieveWithTraversal", () => {
+ it("returns error for nonexistent node", async () => {
+ const restore = mockEmbedding();
+ try {
+ const output = await toolRetrieveWithTraversal({
+ rootDir: FIXTURE,
+ startNodeId: "ghost-node",
+ maxDepth: 2,
+ });
+ assert.ok(output.includes("❌"));
+ } finally {
+ restore();
+ }
+ });
+
+ it("returns traversal results for valid node", async () => {
+ const restore = mockEmbedding();
+ try {
+ const node = await upsertNode(FIXTURE, "concept", "Trav MCP Root", "Root for MCP traversal");
+ const output = await toolRetrieveWithTraversal({
+ rootDir: FIXTURE,
+ startNodeId: node.id,
+ maxDepth: 1,
+ });
+ assert.ok(output.includes("Trav MCP Root"));
+ assert.ok(!output.includes("❌"));
+ } finally {
+ restore();
+ }
+ });
+ });
+});