Loghead is a smart log aggregation tool and MCP server. It collects logs from various sources, like your terminal, Docker containers, or your browser, stores them in a database, and makes them searchable for AI assistants (such as Claude, Cursor, or Windsurf).
Think of it as a "long-term memory" for your development logs that your AI coding agent can read.
Before you start, make sure you have:
- Node.js (v18 or higher).
- Ollama installed (download from https://ollama.com).
  - Ensure it is running (`ollama serve`) and accessible at `http://localhost:11434`.
  - Pull the embedding model: `ollama pull qwen3-embedding:0.6b` (or similar).
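To confirm Ollama is reachable before continuing, you can query its local API (this uses Ollama's standard `/api/tags` endpoint, nothing Loghead-specific):

```bash
# Lists the models Ollama has pulled; any JSON response means the server is up.
curl http://localhost:11434/api/tags
```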
The core server handles the database and API.
```bash
npx @loghead/core
# OR
npx @loghead/core start
```

This command will:

- Initialize the local SQLite database (`loggerhead.db`).
- Start the API server on port `4567`.
- Print an MCP Server Token.
- Launch the Terminal UI for viewing logs.
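To sanity-check that the server is up, you can hit one of the REST endpoints documented later in this README (this assumes the default port and that listing projects does not require a token):

```bash
# Should return JSON (an empty project list on a fresh install).
curl http://localhost:4567/api/projects
```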
You need to configure your AI assistant to talk to Loghead using the Model Context Protocol (MCP). Use the token printed in the previous step.
Edit your `claude_desktop_config.json` (usually in `~/Library/Application Support/Claude/` on macOS):
```json
{
  "mcpServers": {
    "loghead": {
      "command": "npx",
      "args": ["-y", "@loghead/mcp"],
      "env": {
        "LOGHEAD_API_URL": "http://localhost:4567",
        "LOGHEAD_TOKEN": "<YOUR_MCP_TOKEN>"
      }
    }
  }
}
```

Add the MCP server in your Windsurf configuration:
```json
{
  "mcpServers": {
    "loghead": {
      "command": "npx",
      "args": ["-y", "@loghead/mcp"],
      "env": {
        "LOGHEAD_API_URL": "http://localhost:4567",
        "LOGHEAD_TOKEN": "<YOUR_MCP_TOKEN>"
      }
    }
  }
}
```

Go to Settings > MCP and add a new server:
- Name: `loghead`
- Type: `stdio`
- Command: `npx -y @loghead/mcp`
- Environment Variables:
  - `LOGHEAD_API_URL`: `http://localhost:4567`
  - `LOGHEAD_TOKEN`: `<YOUR_MCP_TOKEN>`
You can manage projects via the CLI (in a separate terminal):
```bash
npx @loghead/core projects add "My Awesome App"
# Copy the Project ID returned
```

Create a stream to pipe logs into.
For Terminal Output:
```bash
npx @loghead/core streams add terminal --project <PROJECT_ID> --name "Build Logs"
# Copy the Stream Token returned
```

For Docker Containers:

```bash
npx @loghead/core streams add docker --project <PROJECT_ID> --name "Backend API" --container my-api-container
# Copy the Stream Token returned
```

Now, feed logs into the stream using the ingestor tools.
Terminal Pipe:
```bash
# Pipe any command into loghead-terminal
npm run build | npx @loghead/terminal --token <STREAM_TOKEN>
```

Docker Logs:

```bash
# Attach to a running container
npx @loghead/docker --token <STREAM_TOKEN> --container my-api-container
```

Once connected, you can ask your AI assistant questions about your logs:
- "What errors appeared in the build logs recently?"
- "Find any database connection timeouts in the backend logs."
- "Why did the application crash?"
The `@loghead/core` package provides several commands to manage your log infrastructure.
- Start Server & UI: `npx @loghead/core`
- List Projects: `npx @loghead/core projects list`
- Add Project: `npx @loghead/core projects add "My Project Name"`
- Delete Project: `npx @loghead/core projects delete <PROJECT_ID>`
- List Streams: `npx @loghead/core streams list --project <PROJECT_ID>`
- Add Stream:

  ```bash
  # Basic
  npx @loghead/core streams add <TYPE> <NAME> --project <PROJECT_ID>

  # Examples
  npx @loghead/core streams add terminal "My Terminal" --project <PROJECT_ID>
  npx @loghead/core streams add docker "My Container" --project <PROJECT_ID> --container <CONTAINER_NAME>
  ```

- Get Stream Token: `npx @loghead/core streams token <STREAM_ID>`
- Delete Stream: `npx @loghead/core streams delete <STREAM_ID>`
We provide a unified Calculator App in `sample_apps/calculator_app` that combines a Backend API, Frontend UI, and CLI capabilities to help you test all of Loghead's features in one place.
This app runs an Express.js server that performs calculations and logs them. It includes a web interface and can be containerized with Docker.
- Create a Stream:

  ```bash
  npx @loghead/core projects add "Calculator Project"
  # Copy Project ID
  npx @loghead/core streams add terminal --project <PROJECT_ID> --name "Terminal Logs"
  # Copy Stream Token
  ```

- Run & Pipe Logs: Run the server locally and pipe its output to Loghead.

  ```bash
  cd sample_apps/calculator_app
  npm install
  npm start | npx @loghead/terminal --token <STREAM_TOKEN>
  ```

- Generate Traffic: Open `http://localhost:3000` and perform calculations. The logs in your terminal will be sent to Loghead.
- Ask AI: "What calculations were performed recently?"
- Create a Stream:

  ```bash
  npx @loghead/core streams add docker --project <PROJECT_ID> --name "Docker Container" --container loghead-calc
  # Copy Stream Token
  ```

- Run in Docker: Build and run the app as a container named `loghead-calc`.

  ```bash
  cd sample_apps/calculator_app
  docker build -t loghead-calc .
  docker run --name loghead-calc -p 3000:3000 -d loghead-calc
  ```

- Attach Loghead:

  ```bash
  npx @loghead/docker --token <STREAM_TOKEN> --container loghead-calc
  ```

- Generate Traffic & Ask AI: Perform actions in the browser. Ask: "Did any errors occur in the docker container?" (Try dividing by zero or simulating a crash.)
- Create a Stream:

  ```bash
  npx @loghead/core streams add browser --project <PROJECT_ID> --name "Frontend Logs"
  # Copy Stream Token
  ```

- Configure Extension: Install the Loghead Chrome Extension (if available) and set the Stream Token.
- Use the App: Open `http://localhost:3000` (or the Docker version). The app logs actions to `console.log`, which the extension will capture.
- Ask AI: "What interactions did the user have in the browser?"
To build from source:
- Clone the repo.
- Install dependencies: `npm install`
- Build all packages: `npm run build`
The `@loghead/core` server exposes a REST API on port `4567` by default.
- List Projects: `GET /api/projects`
- Create Project: `POST /api/projects`
  - Body: `{ "name": "string" }`
- Delete Project: `DELETE /api/projects/:id`
- List Streams: `GET /api/streams?projectId=<PROJECT_ID>`
- Create Stream: `POST /api/streams/create`
  - Body: `{ "projectId": "string", "type": "string", "name": "string", "config": {} }`
- Delete Stream: `DELETE /api/streams/:id`
- Get Logs: `GET /api/logs`
  - Query Params:
    - `streamId`: (Required) The Stream ID.
    - `q`: (Optional) Semantic search query.
    - `page`: (Optional) Page number (default 1).
    - `pageSize`: (Optional) Logs per page (default 100, max 1000).
- Ingest Logs: `POST /api/ingest`
  - Headers: `Authorization: Bearer <STREAM_TOKEN>`
  - Body:

    ```json
    {
      "streamId": "string",
      "logs": [
        { "content": "log message", "metadata": { "level": "info" } }
      ]
    }
    ```

  - Note: `logs` can also be a single string or object.