
# ipython-kernel-mcp

An MCP server that exposes a persistent IPython kernel to Claude Code (and any MCP client). Claude can execute Python interactively, inspect variables, switch conda environments on the fly, and export the session as a Jupyter notebook.

Inspired by GPT-Auto-Data-Analytics: its tool-calling loop, ported from OpenAI function-calling to the MCP standard so it works natively with Claude Code and any MCP-compatible agent.

## Why

| Pain point | How this helps |
|---|---|
| Variables lost between tool calls | Kernel state persists for the whole session |
| Plots not visible to the agent | Images returned as native MCP image content |
| Hard to share results | `kernel_save_notebook` exports `.ipynb` / HTML / PDF |
| Locked to one Python env | `kernel_use_env('myenv')` switches conda envs on the fly |
| Session lost when Claude exits | Attach to a running JupyterLab kernel for persistence |

## How it works

```
Claude Code
  └── MCP client
        └── ipython-kernel-mcp (this server)
              ├── mode: in-process  →  IPython InteractiveShell (default, fresh each session)
              └── mode: attached    →  Jupyter kernel via jupyter_client
                    ├── kernel_use_env('myenv')  →  launches kernel in any conda env
                    └── kernel_attach(id)        →  connects to running JupyterLab kernel
```
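The in-process mode boils down to one idea: a namespace that outlives individual tool calls. A minimal stdlib stand-in of that idea (not the server's actual implementation — it uses IPython's `InteractiveShell`, which adds rich output and plot capture on top):

```python
# Illustrative stand-in for in-process mode: one namespace shared
# across exec() calls, so variables survive between "tool calls".
import contextlib
import io

class MiniKernel:
    def __init__(self):
        self.ns = {}  # persists for the kernel's lifetime

    def exec(self, code: str) -> str:
        """Run code in the persistent namespace, capturing stdout."""
        buf = io.StringIO()
        with contextlib.redirect_stdout(buf):
            exec(code, self.ns)
        return buf.getvalue()

k = MiniKernel()
k.exec("x = 21")
print(k.exec("print(x * 2)"))  # state persists across calls: prints 42
```

`kernel_restart` then amounts to replacing that namespace with a fresh one.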

## Tools

| Tool | Description |
|---|---|
| `kernel_exec(code)` | Run Python — stdout, rich text, and plots returned as images |
| `kernel_inspect(var_name)` | `repr()` of any variable in the kernel namespace |
| `kernel_list_vars()` | Show all user-defined variables and their types |
| `kernel_status()` | Current mode, active env, and execution count |
| `kernel_restart()` | Clear all variables and history |
| `kernel_add_markdown(text)` | Insert a markdown cell into the exported notebook |
| `kernel_save_notebook(path, save_html, save_pdf)` | Export the session as `.ipynb` (optionally HTML / PDF) |
| `kernel_list_envs()` | List all conda/mamba environments on this machine |
| `kernel_use_env(env_name)` | Start a kernel in a named conda env and auto-attach |
| `kernel_list_running()` | List running Jupyter kernels (for JupyterLab attach) |
| `kernel_attach(kernel_id)` | Attach to an existing running Jupyter kernel |
| `kernel_detach()` | Disconnect and return to in-process mode |

## Installation

```shell
git clone https://github.com/Animadversio/ipython-kernel-mcp.git
cd ipython-kernel-mcp
pip install -e .
```

To install inside a specific conda environment (recommended — the server then runs in that env by default):

```shell
conda activate myenv
pip install -e /path/to/ipython-kernel-mcp
```

## Connecting to Claude Code

Quickest way — use the `claude` CLI to register globally (user scope = all projects):

```shell
claude mcp add --scope user ipython-kernel /path/to/env/bin/ipython-kernel-mcp
```

Manual config — add to `~/.claude.json` or a project `.mcp.json`:

```json
{
  "mcpServers": {
    "ipython-kernel": {
      "command": "/path/to/your/env/bin/ipython-kernel-mcp"
    }
  }
}
```

Restart Claude Code. Tools appear as `mcp__ipython-kernel__kernel_exec`, etc.

## Using a specific conda environment as the default

Install the package into that env and point the MCP command at its binary:

```shell
# Install into your preferred env
conda activate torch
pip install -e /path/to/ipython-kernel-mcp

# Register
claude mcp add --scope user ipython-kernel \
  ~/miniforge3/envs/torch/bin/ipython-kernel-mcp
```

The server (and the in-process kernel) then run entirely inside that environment.

## Switching envs without restarting Claude Code

Once registered, the agent can switch environments mid-session using `kernel_use_env`:

```
User:   analyze this dataset — it needs torch and sklearn
Claude: [calls kernel_list_envs()]        → sees: base, torch, ml-env
Claude: [calls kernel_use_env('torch')]   → starts kernel in torch env
Claude: [calls kernel_exec('import torch; print(torch.__version__)')]  → 2.x.x
Claude: [calls kernel_exec('..analysis code..')]
Claude: [calls kernel_save_notebook('results.ipynb')]
```

The target env needs `ipykernel` installed (`conda install ipykernel`).
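Conda can report its environments as JSON via `conda env list --json`, which returns an object whose `envs` key lists environment prefixes. A hedged sketch of how a `kernel_list_envs`-style tool might derive names from that output — how this server actually enumerates envs is an assumption, and `parse_env_names` is a hypothetical helper:

```python
# Derive env names from `conda env list --json` output. Named envs live
# at <base>/envs/<name>; the base prefix itself has no "envs" parent.
import json
import os

def parse_env_names(raw_json: str) -> list[str]:
    names = []
    for prefix in json.loads(raw_json)["envs"]:
        if os.path.basename(os.path.dirname(prefix)) == "envs":
            names.append(os.path.basename(prefix))  # e.g. .../envs/torch
        else:
            names.append("base")  # the root conda prefix
    return names

sample = '{"envs": ["/opt/miniforge3", "/opt/miniforge3/envs/torch"]}'
print(parse_env_names(sample))  # ['base', 'torch']
```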

## Attach to a running JupyterLab kernel

This lets kernel state persist across Claude Code sessions:

1. Start JupyterLab in your env and run some cells to set up your data
2. In Claude Code: `kernel_list_running()` → see the kernel ID
3. `kernel_attach('<kernel_id>')` → Claude now operates in your live kernel
4. Close and reopen Claude Code — re-attach to pick up where you left off
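Attaching is possible because Jupyter writes one connection file per running kernel, named `kernel-<id>.json`, into its runtime directory (`jupyter --runtime-dir`). A sketch of the discovery step behind a `kernel_list_running`-style tool, using a temporary directory in place of the real runtime dir (the actual server presumably goes through `jupyter_client` instead):

```python
# Enumerate kernel IDs from kernel-<id>.json connection files.
import glob
import json
import os
import tempfile

def list_kernel_ids(runtime_dir: str) -> list[str]:
    paths = glob.glob(os.path.join(runtime_dir, "kernel-*.json"))
    # strip the "kernel-" prefix and ".json" suffix to recover the ID
    return [os.path.basename(p)[len("kernel-"):-len(".json")] for p in paths]

with tempfile.TemporaryDirectory() as d:
    # fake connection file, as a running JupyterLab kernel would leave behind
    with open(os.path.join(d, "kernel-abc123.json"), "w") as f:
        json.dump({"shell_port": 51234, "transport": "tcp"}, f)
    print(list_kernel_ids(d))  # ['abc123']
```

The connection file holds the ports and HMAC key needed to talk to the kernel, which is why attaching requires no cooperation from JupyterLab itself.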

## Example session

```
You:    Load ~/data/experiment.csv and give me a statistical summary with plots

Claude: [kernel_exec] import pandas as pd; df = pd.read_csv('~/data/experiment.csv')
Claude: [kernel_exec] df.describe()                    → table output
Claude: [kernel_exec] df.hist(figsize=(12,8))          → 📊 image returned
Claude: [kernel_add_markdown] "## Distribution Analysis"
Claude: [kernel_exec] df.corr()                        → correlation matrix
Claude: [kernel_save_notebook] 'experiment_analysis.ipynb', save_html=True
        → opens in browser, all code + outputs + plots included
```
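What `kernel_save_notebook` ultimately emits is nbformat-4 JSON. A minimal notebook built by hand with the stdlib, just to show the shape — the real server uses `nbformat`/`nbconvert` (listed under Requirements), which also validate the document and handle HTML/PDF conversion; `make_notebook` here is purely illustrative:

```python
# Hand-rolled nbformat-4 skeleton: markdown cells carry only source;
# code cells additionally carry outputs and an execution count.
import json

def make_notebook(cells):
    return {
        "nbformat": 4,
        "nbformat_minor": 4,
        "metadata": {},
        "cells": [
            {"cell_type": kind, "metadata": {}, "source": src,
             **({"outputs": [], "execution_count": None}
                if kind == "code" else {})}
            for kind, src in cells
        ],
    }

nb = make_notebook([
    ("markdown", "## Distribution Analysis"),
    ("code", "df.hist(figsize=(12, 8))"),
])
print(json.dumps(nb, indent=1)[:80])
```

Because each `kernel_exec` call appends a cell like this (with its captured outputs), the exported notebook replays the whole session in order.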

## Requirements

- Python ≥ 3.10
- `mcp[cli]` ≥ 1.0, `ipython` ≥ 8.0, `nbformat`, `nbconvert`, `pillow`
- Target conda envs need `ipykernel` for `kernel_use_env`

## Supported platforms

Tested on macOS. Should work on Linux. Windows support untested (paths use `/`).


## Inspiration & lineage

This project is inspired by and ported from GPT-Auto-Data-Analytics.

That project built a local code-interpreter loop using OpenAI function-calling: an LLM agent writes Python, executes it in a live IPython kernel, reads the output, and iterates — with vision API support for interpreting generated plots.

This repo takes the same core idea and ports it to the MCP (Model Context Protocol) standard so it works natively with Claude Code and any MCP-compatible agent, without a custom tool loop. Claude's own agent loop handles the iteration; the kernel is just exposed as tools.
