Merged
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,17 @@
# Changelog

## v1.5.5 — Housekeeping (2026-04-10)

### Added
- **`auto_link` parameter** on `create` — set to `false` to skip automatic wikilink resolution. Applies to MCP, HTTP, and CLI. Discovered links still appear as suggestions in the response.
- **`reindex_file` MCP tool + HTTP endpoint** — re-indexes a single file after external edits. Reads from disk, re-embeds chunks, rebuilds edges. Available as MCP tool, `POST /api/reindex-file`, and OpenAPI operation.
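Both additions can be sketched over the HTTP API. The endpoint paths, the `auto_link` and `file` fields, and the `eg_` key prefix come from this diff; the host, port, and key value are placeholders, and the `title`/`content` field names on `create` are assumptions (that part of the request body is not visible in this diff). The live `curl` calls are commented out because they need a running `engraph serve --http`:

```shell
# Placeholders: adjust host, port, and API key for your setup.
BASE=http://localhost:3000
KEY=eg_example_key

# create with automatic wikilink resolution disabled
# ("title"/"content" are assumed field names, not confirmed by this diff)
CREATE_BODY='{"title":"Meeting notes","content":"See [[Project X]].","auto_link":false}'

# re-index a single file after an external edit
REINDEX_BODY='{"file":"07-Daily/2026-04-10.md"}'

echo "$CREATE_BODY"
echo "$REINDEX_BODY"

# Live calls (uncomment with a server running):
# curl -s -X POST "$BASE/api/create"       -H "Authorization: Bearer $KEY" -H 'Content-Type: application/json' -d "$CREATE_BODY"
# curl -s -X POST "$BASE/api/reindex-file" -H "Authorization: Bearer $KEY" -H 'Content-Type: application/json' -d "$REINDEX_BODY"
```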

### Changed
- **rmcp** bumped from 1.2.0 to 1.4.0 — host validation, non-Send handler support, transport fixes. Does not yet fix [#20](https://github.com/devwhodevs/engraph/issues/20) (protocol `2025-11-25` needed for Claude Desktop Cowork/Code modes — blocked upstream on [modelcontextprotocol/rust-sdk#800](https://github.com/modelcontextprotocol/rust-sdk/issues/800)).
- MCP tools: 22 → 23
- HTTP endpoints: 23 → 24
- OpenAPI version: 1.5.0 → 1.5.5

## v1.5.0 — ChatGPT Actions (2026-03-26)

### Added
10 changes: 5 additions & 5 deletions Cargo.lock

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions Cargo.toml
@@ -1,6 +1,6 @@
[package]
name = "engraph"
version = "1.5.4"
version = "1.5.5"
edition = "2024"
description = "Local knowledge graph for AI agents. Hybrid search + MCP server for Obsidian vaults."
license = "MIT"
@@ -31,7 +31,7 @@ rayon = "1"
time = { version = "0.3", features = ["parsing", "formatting", "macros"] }
strsim = "0.11"
ignore = "0.4"
rmcp = { version = "1.2", features = ["transport-io"] }
rmcp = { version = "1.4", features = ["transport-io"] }
tokio = { version = "1", features = ["macros", "rt-multi-thread", "process", "time", "net"] }
notify = "7.0"
notify-debouncer-full = "0.4"
11 changes: 7 additions & 4 deletions README.md
@@ -268,7 +268,7 @@ Returns orphan notes (no links in or out), broken wikilinks, stale notes, and ta

`engraph serve --http` adds a full REST API alongside the MCP server, exposing the same capabilities over HTTP for web agents, scripts, and integrations.

**23 endpoints:**
**24 endpoints:**

| Method | Endpoint | Permission | Description |
|--------|----------|------------|-------------|
@@ -292,6 +292,7 @@ Returns orphan notes (no links in or out), broken wikilinks, stale notes, and ta
| POST | `/api/unarchive` | write | Restore archived note |
| POST | `/api/update-metadata` | write | Update note metadata |
| POST | `/api/delete` | write | Delete note (soft or hard) |
| POST | `/api/reindex-file` | write | Re-index a single file after external edits |
| POST | `/api/migrate/preview` | write | Preview PARA migration (classify + suggest moves) |
| POST | `/api/migrate/apply` | write | Apply PARA migration (move files) |
| POST | `/api/migrate/undo` | write | Undo last PARA migration |
@@ -542,8 +543,8 @@ engraph is not a replacement for Obsidian — it's the intelligence layer that s
- LLM research orchestrator: query intent classification + query expansion + adaptive lane weights
- llama.cpp inference via Rust bindings (GGUF models, Metal GPU on macOS, CUDA on Linux)
- Intelligence opt-in: heuristic fallback when disabled, LLM-powered when enabled
- MCP server with 22 tools (8 read, 10 write, 1 diagnostic, 3 migrate) via stdio
- HTTP REST API with 23 endpoints, API key auth (`eg_` prefix), rate limiting, CORS — enabled via `engraph serve --http`
- MCP server with 23 tools (8 read, 10 write, 1 index, 1 diagnostic, 3 migrate) via stdio
- HTTP REST API with 24 endpoints, API key auth (`eg_` prefix), rate limiting, CORS — enabled via `engraph serve --http`
- Section-level reading and editing: target specific headings with replace/prepend/append modes
- Full note rewriting with automatic frontmatter preservation
- Granular frontmatter mutations: set/remove fields, add/remove tags and aliases
@@ -572,7 +573,9 @@ engraph is not a replacement for Obsidian — it's the intelligence layer that s
- [x] ~~HTTP/REST API — complement MCP with a standard web API~~ (v1.3)
- [x] ~~PARA migration — AI-assisted vault restructuring with preview/apply/undo~~ (v1.4)
- [x] ~~ChatGPT Actions — OpenAPI 3.1.0 spec + plugin manifest + `--setup-chatgpt` helper~~ (v1.5)
- [ ] Multi-vault — search across multiple vaults (v1.6)
- [ ] Identity — user context at session start, enhanced onboarding (v1.6)
- [ ] Timeline — temporal knowledge graph with point-in-time queries (v1.7)
- [ ] Mining — automatic fact extraction from vault notes (v1.8)

## Configuration

55 changes: 55 additions & 0 deletions src/http.rs
@@ -268,6 +268,7 @@ struct CreateBody {
#[serde(default)]
tags: Vec<String>,
folder: Option<String>,
auto_link: Option<bool>,
}

#[derive(Debug, Deserialize)]
@@ -326,6 +327,11 @@ struct DeleteBody {
mode: Option<String>,
}

#[derive(Debug, Deserialize)]
struct ReindexFileBody {
file: String,
}

// ---------------------------------------------------------------------------
// CORS
// ---------------------------------------------------------------------------
@@ -380,6 +386,8 @@ pub fn build_router(state: ApiState) -> Router {
.route("/api/unarchive", post(handle_unarchive))
.route("/api/update-metadata", post(handle_update_metadata))
.route("/api/delete", post(handle_delete))
// Index maintenance
.route("/api/reindex-file", post(handle_reindex_file))
// Migration endpoints
.route("/api/migrate/preview", post(handle_migrate_preview))
.route("/api/migrate/apply", post(handle_migrate_apply))
@@ -712,6 +720,7 @@ async fn handle_create(
tags: body.tags,
folder: body.folder,
created_by: "http-api".into(),
auto_link: body.auto_link,
};
let result = writer::create_note(
input,
@@ -1011,6 +1020,52 @@ async fn handle_delete(
})))
}

async fn handle_reindex_file(
State(state): State<ApiState>,
headers: HeaderMap,
Json(body): Json<ReindexFileBody>,
) -> Result<impl IntoResponse, ApiError> {
authorize(&headers, &state, true)?;
let store = state.store.lock().await;
let mut embedder = state.embedder.lock().await;
let full_path = state.vault_path.join(&body.file);

let content = std::fs::read_to_string(&full_path)
.map_err(|e| ApiError::internal(&format!("Cannot read file {}: {e}", body.file)))?;

let content_hash = {
use sha2::{Digest, Sha256};
let mut hasher = Sha256::new();
hasher.update(content.as_bytes());
format!("{:x}", hasher.finalize())
};

let config = crate::config::Config::load().unwrap_or_default();

let result = crate::indexer::index_file(
&body.file,
&content,
&content_hash,
&store,
&mut *embedder,
&state.vault_path,
&config,
)
.map_err(|e| ApiError::internal(&format!("{e:#}")))?;

store
.delete_edges_for_file(result.file_id)
.map_err(|e| ApiError::internal(&format!("{e:#}")))?;
crate::indexer::build_edges_for_file(&store, result.file_id, &content)
.map_err(|e| ApiError::internal(&format!("{e:#}")))?;

Ok(Json(serde_json::json!({
"file": body.file,
"chunks": result.total_chunks,
"docid": result.docid,
})))
}

// ---------------------------------------------------------------------------
// Tests
// ---------------------------------------------------------------------------
1 change: 1 addition & 0 deletions src/main.rs
@@ -1280,6 +1280,7 @@ async fn main() -> Result<()> {
tags,
folder,
created_by: "cli".into(),
auto_link: None,
};
let result = engraph::writer::create_note(
input,
26 changes: 24 additions & 2 deletions src/openapi.rs
@@ -27,6 +27,7 @@ pub fn build_openapi_spec(server_url: &str) -> serde_json::Value {
paths.insert("/api/unarchive".into(), build_unarchive());
paths.insert("/api/update-metadata".into(), build_update_metadata());
paths.insert("/api/delete".into(), build_delete());
paths.insert("/api/reindex-file".into(), build_reindex_file());

// Migration endpoints
paths.insert("/api/migrate/preview".into(), build_migrate_preview());
@@ -37,7 +38,7 @@
"openapi": "3.1.0",
"info": {
"title": "engraph",
"version": "1.5.0",
"version": "1.5.5",
"description": "AI-powered semantic search and management API for Obsidian vaults."
},
"servers": [{ "url": server_url }],
@@ -220,7 +221,8 @@ fn build_create() -> serde_json::Value {
"filename": { "type": "string", "description": "Filename without .md" },
"type_hint": { "type": "string", "description": "Type hint for placement" },
"tags": { "type": "array", "items": { "type": "string" }, "description": "Tags to apply" },
"folder": { "type": "string", "description": "Explicit folder (skips auto-placement)" }
"folder": { "type": "string", "description": "Explicit folder (skips auto-placement)" },
"auto_link": { "type": "boolean", "description": "Set to false to skip automatic wikilink resolution. Defaults to true." }
}
}}}
},
@@ -428,6 +430,26 @@ fn build_delete() -> serde_json::Value {
})
}

fn build_reindex_file() -> serde_json::Value {
serde_json::json!({
"post": {
"operationId": "reindexFile",
"summary": "Re-index a single file after external edits. Re-reads, re-embeds, and updates search index.",
"requestBody": {
"required": true,
"content": { "application/json": { "schema": {
"type": "object",
"required": ["file"],
"properties": {
"file": { "type": "string", "description": "File path relative to vault root" }
}
}}}
},
"responses": { "200": { "description": "Re-indexed file info (chunks, docid)" } }
}
})
}
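For reference, example payloads for the operation defined above. The request matches the schema (`file` relative to vault root), and the response shape mirrors the handler's JSON (`file`, `chunks`, `docid`); the `chunks` and `docid` values here are invented for illustration:

```shell
# Example request/response for POST /api/reindex-file.
# Response values (chunks, docid) are made up, not from a real run.
REQUEST='{"file":"07-Daily/2026-04-10.md"}'
RESPONSE='{"file":"07-Daily/2026-04-10.md","chunks":4,"docid":1042}'
echo "$REQUEST"
echo "$RESPONSE"
```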

fn build_migrate_preview() -> serde_json::Value {
serde_json::json!({
"post": {
69 changes: 69 additions & 0 deletions src/serve.rs
@@ -83,6 +83,8 @@ pub struct CreateParams {
pub tags: Option<Vec<String>>,
/// Explicit folder path (skips placement engine).
pub folder: Option<String>,
/// Set to false to skip automatic wikilink resolution. Defaults to true.
pub auto_link: Option<bool>,
}

#[derive(Debug, Deserialize, JsonSchema)]
Expand Down Expand Up @@ -184,6 +186,12 @@ pub struct DeleteParams {
pub mode: Option<String>,
}

#[derive(Debug, Deserialize, JsonSchema)]
pub struct ReindexFileParams {
/// File path relative to vault root (e.g. "07-Daily/2026-04-10.md").
pub file: String,
}
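A client would invoke this tool with a standard MCP `tools/call` request over stdio. The JSON-RPC envelope below is the generic MCP framing, not something defined in this PR; only the tool name and the `file` argument come from this diff:

```shell
# Hypothetical MCP tools/call message for the new reindex_file tool.
# Envelope (jsonrpc/id/method) is standard MCP; arguments match ReindexFileParams.
MSG='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"reindex_file","arguments":{"file":"07-Daily/2026-04-10.md"}}}'
echo "$MSG"
```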

// ---------------------------------------------------------------------------
// Server
// ---------------------------------------------------------------------------
@@ -198,6 +206,7 @@
embedder: Arc<Mutex<Box<dyn EmbedModel + Send>>>,
vault_path: Arc<PathBuf>,
profile: Arc<Option<VaultProfile>>,
#[allow(dead_code)] // Required by rmcp #[tool_router] macro infrastructure
tool_router: ToolRouter<Self>,
/// Query expansion orchestrator (None when intelligence is disabled or failed to load).
orchestrator: Option<Arc<Mutex<Box<dyn OrchestratorModel + Send>>>>,
@@ -512,6 +521,7 @@ impl EngraphServer {
tags: params.0.tags.unwrap_or_default(),
folder: params.0.folder,
created_by: "claude-code".into(),
auto_link: params.0.auto_link,
};
let result = crate::writer::create_note(
input,
@@ -818,6 +828,64 @@ impl EngraphServer {
});
to_json_result(&result)
}

#[tool(
name = "reindex_file",
description = "Re-index a single file after external edits. Reads the file from disk, re-embeds its chunks, and updates the search index. Use when a file was modified outside engraph and you need the index to reflect current content."
)]
async fn reindex_file(
&self,
params: Parameters<ReindexFileParams>,
) -> Result<CallToolResult, McpError> {
let store = self.store.lock().await;
let mut embedder = self.embedder.lock().await;
let rel_path = params.0.file;
let full_path = self.vault_path.join(&rel_path);

// Read file content from disk
let content = std::fs::read_to_string(&full_path).map_err(|e| {
McpError::new(
rmcp::model::ErrorCode::INVALID_PARAMS,
format!("Cannot read file {rel_path}: {e}"),
None::<serde_json::Value>,
)
})?;

let content_hash = {
use sha2::{Digest, Sha256};
let mut hasher = Sha256::new();
hasher.update(content.as_bytes());
format!("{:x}", hasher.finalize())
};

let config = crate::config::Config::load().unwrap_or_default();

// Re-index the file (handles cleanup of old entries automatically)
let result = crate::indexer::index_file(
&rel_path,
&content,
&content_hash,
&store,
&mut *embedder,
&self.vault_path,
&config,
)
.map_err(|e| mcp_err(&e))?;

// Rebuild edges for the re-indexed file
store
.delete_edges_for_file(result.file_id)
.map_err(|e| mcp_err(&e))?;
crate::indexer::build_edges_for_file(&store, result.file_id, &content)
.map_err(|e| mcp_err(&e))?;

let output = serde_json::json!({
"file": rel_path,
"chunks": result.total_chunks,
"docid": result.docid,
});
to_json_result(&output)
}
}

#[tool_handler]
@@ -829,6 +897,7 @@ impl rmcp::handler::server::ServerHandler for EngraphServer {
Write: create for new notes, append to add content, edit to modify a section, rewrite to replace body, \
edit_frontmatter for tags/properties, update_metadata for bulk tag/alias replacement. \
Lifecycle: move_note to relocate, archive to soft-delete, unarchive to restore, delete for permanent removal. \
Index: reindex_file to refresh a single file's index after external edits. \
Migration: migrate_preview to classify notes into PARA folders, migrate_apply to execute the migration, migrate_undo to revert.",
)
}