CellMate

Install CellMate in Development mode

# make sure CellMate package is installed in development mode
$ pip install -e .

# make changes under nbs/ directory
# ...

# compile to have changes apply to CellMate
$ nbdev_prepare

Usage

Installation

Install latest from the GitHub repository:

$ pip install git+https://github.com/SBrewer15/CellMate.git

Future work:

or from conda

$ conda install -c SBrewer15 CellMate

or from pypi

$ pip install CellMate

Documentation

Documentation is hosted on this GitHub repository’s pages. Package-manager-specific guidelines can be found on conda and PyPI respectively.

Quirks & Known Limitations

Just use Solveit. Seriously. CellMate is a hack for working locally when you absolutely need to — for example, when dealing with proprietary, confidential, or otherwise sensitive data. If that’s not your situation, you’ll have a much better experience with a purpose-built tool.

That said, here’s what to watch out for:

Notebook saving can hang

save_nb() relies on JupyterLab’s asynchronous save. If it seems stuck, hit Ctrl+S to manually trigger a save and unstick it.

Conversation history

Currently, the entire notebook above the calling cell is sent as context on every call. There’s no persistent back-and-forth chat. A future version may split the notebook into a conversation history using CellMate prompt cells as delimiters.
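The "whole notebook as context" behaviour can be sketched as a walk over the notebook's cell list. This is an illustrative helper (`build_context` is hypothetical, not CellMate's actual code), assuming cells follow the notebook JSON shape with `cell_type` and `source` keys:

```python
def build_context(cells, calling_index):
    """Concatenate the source of every cell above the calling cell.

    Hypothetical sketch of CellMate's context assembly, assuming
    notebook-JSON-style cells with 'cell_type' and 'source' keys.
    """
    parts = []
    for cell in cells[:calling_index]:
        tag = "markdown" if cell["cell_type"] == "markdown" else "code"
        parts.append(f"[{tag}]\n{cell['source']}")
    return "\n\n".join(parts)
```

Because everything above the calling cell is resent on every call, long notebooks mean long prompts; trimming with @ai-ignore (described below) is the main mitigation.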

Images only from code outputs

Images are collected from code cell outputs only. Images embedded in markdown cells (e.g. ![](image.png)) are not currently included in the context.

No audio support

Audio outputs are not yet handled.

Page reload on response

When CellMate inserts its response, it triggers a full page reload (window.location.reload()) to display the new cell. This is a hack — the response doesn’t appear dynamically; the whole notebook reloads.

Response cell detection has edge cases

CellMate locates the calling cell by finding the last unexecuted code cell containing the function name. The response is inserted directly below it. This means:

- If you have multiple 🧠 response cells and re-run an earlier prompt, the response may end up at the wrong position.
- It can inadvertently overwrite the most recent LLM response cell.
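The heuristic described above can be sketched as follows (hypothetical helper, not CellMate's actual code), assuming the notebook-JSON convention that unexecuted code cells have an `execution_count` of `None`:

```python
def find_calling_cell(cells, func_name):
    """Return the index of the last unexecuted code cell mentioning func_name.

    Illustrative sketch of the heuristic described in the README,
    assuming unexecuted code cells have execution_count None.
    """
    for i in range(len(cells) - 1, -1, -1):
        cell = cells[i]
        if (cell["cell_type"] == "code"
                and cell.get("execution_count") is None
                and func_name in cell["source"]):
            return i
    return None
```

The edge cases above follow directly: any later unexecuted cell that happens to mention the instance name wins, so re-running an earlier prompt can misplace the response.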

Only tested in Chrome

No guarantees on other browsers.

Quick Start

from cellmate import CellMate

cm = CellMate()

CellMate auto-detects the variable name you assign it to, so it can find the calling cell. If no model is specified, it selects the smallest available non-embedding model.

Ask a question

In a new cell:

cm("What is a Fourier transform?")


The response appears as a markdown cell directly below.

Choose a model

cm = CellMate(model='qwen2.5:14b')


Or switch later:

cm.change_model('llama3:8b')

Switch modes

CellMate defaults to tutor mode (Socratic, step-by-step), and also includes a pithy mode (terse, code-first, expert-to-expert).

cm.set_mode('pithy')
cm.set_mode('tutor', custom="Focus on statistics concepts")

Add tools

For models that support function calling, pass Python functions as tools:

def search(query: str) -> str:
    """Search the web for a query."""
    ...

cm.add_tools(search)
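Function-calling APIs in the Ollama style describe each tool with a JSON-like schema derived from the function's signature and docstring. A hedged sketch of such a conversion (`describe_tool` is hypothetical; CellMate may derive schemas differently):

```python
import inspect

def describe_tool(fn):
    """Build a minimal JSON-style tool description from a function.

    Illustrative only: maps Python annotations to JSON-schema type
    names and uses the docstring as the tool description.
    """
    py_to_json = {str: "string", int: "integer", float: "number", bool: "boolean"}
    sig = inspect.signature(fn)
    params = {
        name: {"type": py_to_json.get(p.annotation, "string")}
        for name, p in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": {"type": "object", "properties": params},
    }
```

This is why the type hints and docstring on `search` above matter: without them, the model has no description of what the tool does or what arguments it takes.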

Exclude cells from context

Add @ai-ignore anywhere in a cell’s source to exclude it from the context sent to the model:

# @ai-ignore
API_KEY = "sk-…"
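The filtering this implies is a simple substring check over cell sources. A minimal sketch (`visible_cells` is a hypothetical helper, not CellMate's code):

```python
def visible_cells(cells):
    """Drop cells whose source contains the @ai-ignore marker.

    Sketch of the documented behaviour: the marker may appear
    anywhere in the cell's source, e.g. inside a comment.
    """
    return [c for c in cells if "@ai-ignore" not in c["source"]]
```

Note that because the marker can appear anywhere in the source, a cell that merely mentions @ai-ignore in a string or comment is also excluded.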

Vision models

When using a vision-capable model, CellMate automatically includes image outputs (plots, displayed images) from prior cells in the context.

cm = CellMate(model='llava:13b')
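Since images come from code-cell outputs only (see Quirks above), the collection step can be sketched as a walk over the notebook JSON, where rich outputs live under `outputs[i]["data"]["image/png"]`. An illustrative helper, not CellMate's actual code:

```python
def collect_images(cells):
    """Gather base64-encoded PNG payloads from code-cell outputs.

    Sketch based on the notebook JSON format; markdown-embedded
    images are skipped, matching the documented limitation.
    """
    images = []
    for cell in cells:
        if cell.get("cell_type") != "code":
            continue
        for out in cell.get("outputs", []):
            data = out.get("data", {})
            if "image/png" in data:
                images.append(data["image/png"])
    return images
```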

API Reference

Method Description
CellMate(model, func_name, tools, mode) Create an instance. All args optional.
cm("query") Send a query with full notebook context.
cm.set_mode(mode, custom) Switch system prompt ('tutor' or 'pithy'). Optional custom string appended.
cm.change_model(name) Switch to a different Ollama model.
cm.add_tools(tools) Add tool function(s) for function-calling models.
cm.save_nb() Manually trigger a notebook save.

Built On

About

Embed LLM interactions directly in Jupyter Notebooks. Sends notebook context to a local Ollama model and inserts responses as inline markdown cells.
