This README covers the major capabilities added to the mcu-debug VS Code extension, including editor integration for quick Live Watch additions, local recording to common file formats, a high-performance real-time graphing engine, and a Model Context Protocol (MCP) bridge for AI agent interoperability.
The extension hosts an embedded MCP server that exposes debugger state to external AI agents (Copilot, Antigravity, Cursor, Claude Desktop, etc.) without requiring Python environments, GDB scripting, or memory dump parsing.
- Open the Command Palette (Ctrl+Shift+P) and run `MCU-Debug: Generate MCP Configuration for AI Agents`.
- Choose a configuration format:
  - **VS Code Native MCP**: writes `.vscode/mcp.json` (VS Code-native MCP clients can auto-discover it) and `mcu-debug-mcp.md`.
  - **Generic MCP**: writes `.vscode/mcu-debug-mcp.json` and `mcu-debug-mcp.md`; copy the JSON into external AI agents' MCP settings (Antigravity, Cursor, Claude Desktop, etc.).
- Start a new conversation or dialog and send `.vscode/mcu-debug-mcp.md` to the agent to ensure it uses the built-in MCP tools.
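For orientation, a Generic MCP config typically launches the bridge script with Node. The server name and path below are illustrative assumptions, not the file the command writes; always copy the JSON the command actually generates:

```json
{
  "mcpServers": {
    "mcu-debug": {
      "command": "node",
      "args": ["<path-to-extension>/support/mcp-bridge.js"]
    }
  }
}
```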
- A TCP server listens on `127.0.0.1:51234` using VS Code's internal Node.js runtime (`ELECTRON_RUN_AS_NODE=1`).
- A bridge script (`support/mcp-bridge.js`) connects external MCP clients to this TCP server.
- The `MCU-Debug: Generate MCP Configuration` command auto-generates ready-to-use JSON configs and a comprehensive API reference document (`mcu-debug-mcp.md`).
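To illustrate the wire format, clients behind the bridge speak JSON-RPC 2.0; a minimal sketch of building a `tools/call` request follows the generic MCP convention (the helper itself is hypothetical, not taken from the extension's source):

```typescript
// Hypothetical sketch: an MCP "tools/call" request as an external client
// would send it through the bridge to the embedded server.
interface JsonRpcRequest {
    jsonrpc: "2.0";
    id: number;
    method: string;
    params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(id: number, tool: string, args: Record<string, unknown> = {}): string {
    const req: JsonRpcRequest = {
        jsonrpc: "2.0",
        id,
        method: "tools/call",
        params: { name: tool, arguments: args },
    };
    return JSON.stringify(req); // serialized request, ready for the transport
}
```

For example, `buildToolCall(1, "add_livewatch_variable", { expr: "motor.rpm" })` yields a request that targets the tool table below.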
| Tool | Input | Description |
|---|---|---|
| `get_livewatch_variables` | None | Returns a JSON snapshot of all watched variables and their current values. Unexpanded structs are flagged with `<STRUCT_OR_ARRAY_UNEXPANDED>`. |
| `add_livewatch_variable` | `{ "expr": "..." }` | Adds a C/C++ expression to the Live Watch panel. |
| `expand_livewatch_struct` | `{ "expr": "..." }` | Expands an unexpanded struct/array to reveal its children. |
| `record_livewatch_variables` | `{ "duration_ms": N }` | Automatically records all leaf variables for a fixed duration and returns timeseries data. |
| `record_livewatch_variables_manual` | None | Records with user-controlled Start/Stop buttons in VS Code. Designed for hardware-synchronized data capture. |
When `get_livewatch_variables` encounters an unexpanded struct or array, it returns the sentinel value `<STRUCT_OR_ARRAY_UNEXPANDED>` instead of attempting a potentially dangerous bulk memory read. The AI agent must explicitly call `expand_livewatch_struct` to reveal the struct's children, then re-query `get_livewatch_variables` to read the expanded members.
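The expand-then-re-query loop can be sketched as a small agent-side helper (hypothetical code; only the sentinel string comes from this README):

```typescript
// The sentinel the server uses to flag unexpanded aggregates.
const SENTINEL = "<STRUCT_OR_ARRAY_UNEXPANDED>";

// Hypothetical agent-side helper: scan a get_livewatch_variables snapshot
// (keyed by expression) and list the expressions that still need
// expand_livewatch_struct before their members can be read.
function unexpandedExprs(snapshot: Record<string, unknown>): string[] {
    return Object.entries(snapshot)
        .filter(([, value]) => value === SENTINEL)
        .map(([expr]) => expr);
}
```

An agent would call `expand_livewatch_struct` for each returned expression, then re-issue `get_livewatch_variables`.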
Automatic Mode (default): The agent specifies `duration_ms`, and a progress notification with a cancel button appears in VS Code. Duration is capped by `mcu-ai-debug.mcpRecordingMaxDuration` (default: 30s).

Manual Mode (opt-in via settings): When `mcu-ai-debug.mcpRequireManualRecording` is enabled, calling `record_livewatch_variables` returns a `MANUAL_MODE_REQUIRED` status, directing the agent to use `record_livewatch_variables_manual` instead. This tool presents two sequential VS Code prompts:

- "Start Recording" / "Cancel": the user clicks Start when they are physically ready.
- "Stop Recording": the user clicks Stop after completing their hardware operation.

This two-phase workflow allows engineers to precisely synchronize AI data capture with physical hardware operations (e.g., rotating a potentiometer, pressing a button, actuating a motor). Maximum wall-clock time is capped by `mcu-ai-debug.mcpManualRecordingMaxDuration` (default: 60s).
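Both caps amount to a simple clamp on the requested wall-clock time; a minimal sketch (the helper name is hypothetical, and the extension's actual enforcement lives in its MCP handlers):

```typescript
// Hypothetical duration cap: clamp a requested duration_ms to the configured
// maximum in seconds (mcpRecordingMaxDuration for automatic mode,
// mcpManualRecordingMaxDuration for manual mode).
function capDurationMs(requestedMs: number, maxSeconds: number): number {
    return Math.min(Math.max(requestedMs, 0), maxSeconds * 1000);
}
```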
The `MCU-AI-Debug: Generate MCP Configuration for AI Agents` command produces a `mcu-debug-mcp.md` file that serves as both a human setup guide and an AI system prompt. It contains:
- An exhaustive per-tool API reference with all possible `status` codes in table format.
- Behavioral rules that prevent agents from writing custom scrapers or parsing GDB output.
- Cross-referencing logic between automatic and manual recording tools.
- VS Code settings reference table.
Make debugging workflows faster by adding expressions directly from the editor, saving one-shot JSON snapshots, and recording Live Watch data to local files.
- While debugging with MCU-Debug, select a C/C++ expression in the editor, right-click, and choose "Add to Live Watch" to push the expression into the Live Watch panel instantly. The same action is available from the Command Palette via the `Add to Live Watch` command (command id: `mcu-ai-debug.liveWatch.addSelectionToLiveWatch`).
- The Live Watch view now includes a `Save Snapshot` action that first shows the same leaf-variable picker used by recording, then exports the selected values to a `.json` file.
- Snapshots include `timestamp`, `isoTime`, `variable_count`, and a `variables` object keyed by expression.
- Only selected leaf variables are exported, so structs/arrays must be expanded first, just like recording.
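The snapshot shape described above can be sketched with a small builder (hypothetical code; only the field names come from this README):

```typescript
// Hypothetical builder mirroring the documented snapshot fields; the
// extension assembles the real file itself, so treat this as illustrative.
function buildSnapshot(vars: Record<string, number | string>, now: number = Date.now()) {
    return {
        timestamp: now,                         // epoch milliseconds
        isoTime: new Date(now).toISOString(),   // human-readable form of the same instant
        variable_count: Object.keys(vars).length,
        variables: vars,                        // keyed by watch expression
    };
}
```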
- The Live Watch view supports recording selected leaf variables to a local file. Click the `Start Recording`/`Stop Recording` icon in the Live Watch view or run the command from the Command Palette; a Save dialog lets you choose `.csv` or `.jsonl` (newline-delimited JSON).
- CSV files include a header row with `Timestamp` plus quoted variable names; JSONL produces one JSON object per line with a `timestamp` field and the variables as keys. The recorder sanitizes common GDB value formats (char literals, enum assignments) to numeric/string values and caches last-known values so each recorded row contains a value for every selected column.
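As a sketch of the sanitization step, the concrete renderings below (`65 'A'` char literals and a `NAME = 2` enum form) are assumptions about GDB output, not code taken from the extension:

```typescript
// Hypothetical sanitizer for raw GDB value strings (assumed formats):
//   "65 'A'"   -> 65       (char literal)
//   "RED = 2"  -> 2        (assumed enum rendering)
//   "3.14"     -> 3.14     (plain number)
//   "IDLE"     -> "IDLE"   (non-numeric values pass through as strings)
function sanitizeGdbValue(raw: string): number | string {
    const charLit = raw.match(/^(-?\d+)\s+'.*'$/);
    if (charLit) return Number(charLit[1]);
    const enumVal = raw.match(/^\w+\s*=\s*(-?\d+)$/);
    if (enumVal) return Number(enumVal[1]);
    const n = Number(raw);
    // Guard against "" (which Number() coerces to 0) and NaN.
    return raw.trim() !== "" && !Number.isNaN(n) ? n : raw;
}
```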
- Add expressions from the editor (right-click → "Add to Live Watch").
- Run `Save Snapshot` from the Live Watch view, select the leaf variables you want, then export them to JSON.
- Run `Start Recording` from the Live Watch view; select variables and a file path when prompted.
- Stop recording with `Stop Recording` (command or UI). The file will contain a timestamped timeseries in your chosen format.
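The documented CSV layout, a `Timestamp` column followed by quoted variable names and one row per sample, can be sketched as (hypothetical helpers, not the logger's actual code):

```typescript
// Hypothetical formatters for the recorder's CSV output.
function csvHeader(names: string[]): string {
    // "Timestamp" column first, then each variable name in double quotes.
    return ["Timestamp", ...names.map((n) => `"${n}"`)].join(",");
}

function csvRow(timestamp: number, values: Array<number | string>): string {
    // One sample per row, values in the same order as the header columns.
    return [timestamp, ...values].join(",");
}
```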
The Live Watch system includes a hardware-accelerated HTML5 Canvas rendering engine for real-time variable visualization.
- Dual Display Modes: Switch between "Split Mode" (one graph per variable with independent Y-axes) and "Overlay Mode" (all variables on a shared Y-axis).
- Oscilloscope Auto-scroll: By default, the graph behaves like a real-time oscilloscope, automatically scrolling to show the latest data. Users can break into manual "PAN MODE" for free-form navigation of historical data.
- T and Y Sliders: Dedicated slider controls on the right and bottom edges allow precise zooming of the Time and Y axes. Mouse wheel zooming is also supported and synchronized with the sliders.
- Robust Rendering: Strict Canvas `ctx.clip()` clipping prevents label bleeding and border artifacts in both Split and Overlay modes.
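The auto-scroll vs. PAN MODE behavior above boils down to where the visible window's right edge is anchored; a hypothetical sketch of that computation (not the engine's actual code):

```typescript
// Hypothetical visible-window calculation: in oscilloscope (auto-scroll)
// mode the right edge tracks the newest sample; in PAN MODE it stays
// wherever the user dragged it.
function visibleWindow(
    latestT: number,   // timestamp of the newest sample (ms)
    spanMs: number,    // current zoom level: width of the visible window (ms)
    panMode: boolean,  // true once the user breaks into manual panning
    panEndT: number,   // right edge chosen by the user while panning
): [number, number] {
    const end = panMode ? panEndT : latestT;
    return [end - spanMs, end];
}
```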
| Setting | Type | Default | Description |
|---|---|---|---|
| `mcu-ai-debug.mcpRequireManualRecording` | boolean | `false` | When enabled, automatic recording returns `MANUAL_MODE_REQUIRED` and agents must use the manual recording tool. |
| `mcu-ai-debug.mcpRecordingMaxDuration` | number | `30` | Maximum recording duration in seconds for automatic mode. |
| `mcu-ai-debug.mcpManualRecordingMaxDuration` | number | `60` | Maximum recording duration in seconds for manual mode. |
| File | Purpose |
|---|---|
| `src/frontend/mcp-server.ts` | MCP tool definitions and request handlers |
| `src/frontend/extension.ts` | MCP config generation and doc generation |
| `src/frontend/views/live-watch.ts` | Live Watch tree provider, leaf-variable gathering, MCP listener hooks |
| `src/frontend/views/live-watch-grapher.ts` | Graph panel lifecycle management |
| `src/frontend/views/live-watch-logger.ts` | Recording, sanitization, and save logic |
| `resources/live-watch-graph.js` | Canvas 2D rendering engine |
| `resources/live-watch-graph.html` | Graph panel HTML layout with slider controls |
| `support/mcp-bridge.js` | External MCP client TCP bridge |
As the upstream project mcu-debug/mcu-debug is not yet available on the VS Code Marketplace and is still iterating rapidly, several low-level modules (such as remote proxy support and multi-core orchestration) remain in an unfinished, prototype state.
The core philosophy of this fork (MCU-AI-Debug) is:
- Lean & Stable: To allow developers to immediately benefit from AI (MCP) integration and UI enhancements (real-time graphing, local recording), we have proactively removed unfinished or unverified upstream components.
- Production Ready: We focus exclusively on mature, verified features to ensure that this distributed version is a stable "product" ready for daily engineering workflows.
- Active Tracking: This is not a detached fork. Once an upstream module (such as remote probe support or Capstone-based disassembly) matures and passes rigorous testing, we will merge and adapt it for AI/UI integration.
The upstream project uses a multi-component license model:
- VS Code Extension (TypeScript / Debug Adapter layer): MIT License. See `LICENSE-MIT` (and component-level license notes in `LICENSE`).
- MCU-Debug Helper (Rust symbol/disassembly server): Apache License 2.0. See `LICENSE-APACHE` (and `packages/mcu-debug-helper/LICENSE`).
Where individual source files include their own license headers, those headers apply. Otherwise, the component-level license applies as described above.
This is an unofficial fork and is not affiliated with or endorsed by the upstream maintainers.


