agent-notch

Live AI coding sessions, in your MacBook Pro dynamic notch.

Always-on HUD over the macOS dynamic notch + menu-bar surface. Shows which sessions are awaiting your input, which are stuck on a tool call, and your top open todos — without flipping windows.



What it shows

When you hover the notch, you get an expanded HUD with:

  • Sessions sidebar (right peek) — every live AI coding agent session your machine is running. Each row shows:
    • Outer ring colour = provider (Claude orange-amber, Aider green, Cursor blue, ChatGPT teal-green, custom = grey)
    • Inner dot colour = refined status (awaiting_user_input orange, tool_pending blue, crashed red, working green, idle grey)
    • Tooltip = provider display name (e.g. "Claude Code")
    • Click → opens the orchestrator dashboard tab focused on that pid
  • Todo strip (thin row, top or bottom) — top-3 open todos from your Apple Reminders list. Tap = mark complete. Long-press = reassign to a different session.
  • Aura + Orb visualisations — voice activity feedback (driven by SSE events).
  • Live transcription bubble — local on-device Apple speech recognition during voice input.
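The two-ring colour encoding above can be sketched as a pair of small enums. These types and colour strings simply mirror the README's description — they are illustrative, not the app's real API:

```swift
// Illustrative sketch of the sidebar's two-ring encoding.
// Enum names and colour strings are assumptions, not shipped types.
enum Provider: String {
    case claude, aider, cursor, chatgpt, custom

    /// Outer ring colour, keyed by provider.
    var ringColor: String {
        switch self {
        case .claude:  return "orange"      // orange-amber
        case .aider:   return "green"
        case .cursor:  return "blue"
        case .chatgpt: return "teal-green"
        case .custom:  return "grey"
        }
    }
}

enum SessionStatus: String {
    case awaitingUserInput = "awaiting_user_input"
    case toolPending       = "tool_pending"
    case crashed, working, idle

    /// Inner dot colour, keyed by refined status.
    var dotColor: String {
        switch self {
        case .awaitingUserInput: return "orange"
        case .toolPending:       return "blue"
        case .crashed:           return "red"
        case .working:           return "green"
        case .idle:              return "grey"
        }
    }
}
```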

The notch widget is built on DynamicNotchKit, the de facto Swift API for the MacBook Pro dynamic notch.

Why a standalone app

It started as Phase 2 of an internal multi-channel router (Jarvis). It was extracted because the HUD is genuinely reusable — anyone running multiple AI coding agents in parallel benefits from a unified notch surface, regardless of which backend orchestrates the sessions. Three consumers, one current and two planned:

  • Jarvis Router — current default backend. HTTP at localhost:3340.
  • Topics App — future consumer. Will run its own backend.
  • Standalone with agent-conductor — planned for v0.2. The CLI in agent-conductor will gain a watch subcommand that streams JSON-Lines on stdout; this app will spawn it as a subprocess. Zero HTTP, zero ports, zero config.
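The planned JSON-Lines handshake is simple to consume: one JSON object per stdout line. A hedged sketch, assuming a `pid`/`status` payload shape that agent-conductor has not yet committed to:

```swift
import Foundation

// Sketch of consuming a `agent-conductor watch --format json-lines` stream.
// Field names ("pid", "status") are assumptions; the real schema ships with v0.2.
struct SessionEvent: Decodable, Equatable {
    let pid: Int
    let status: String
}

/// Decodes one JSON object per line, silently skipping malformed lines.
func decodeJSONLines(_ stdout: String) -> [SessionEvent] {
    stdout.split(separator: "\n").compactMap { line in
        try? JSONDecoder().decode(SessionEvent.self, from: Data(line.utf8))
    }
}
```

In the app this would be fed from the subprocess's stdout pipe rather than a whole string, but the per-line decode step is the same.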

Install

For now, build from source:

git clone https://github.com/zorahrel/agent-notch.git
cd agent-notch
swift build -c release
cp -r .build/release/agent-notch /Applications/agent-notch

A signed .app bundle + Homebrew Cask are on the v0.2 roadmap.

Configuration

The app reads its backend URL from an environment variable.

# Default (matches Jarvis Router setup):
AGENT_NOTCH_BACKEND_URL=http://localhost:3340

# Topics App on a different port:
AGENT_NOTCH_BACKEND_URL=http://localhost:4200

# Remote backend (over an SSH tunnel, e.g.):
AGENT_NOTCH_BACKEND_URL=http://127.0.0.1:9999
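Resolving that variable might look like the following sketch — the variable name comes from this README, the fallback matches the Jarvis Router default, and the function name is hypothetical:

```swift
import Foundation

// Hypothetical sketch: resolve the backend URL from the environment,
// falling back to the Jarvis Router default when unset or malformed.
func backendURL(env: [String: String] = ProcessInfo.processInfo.environment) -> URL {
    let fallback = "http://localhost:3340"
    let raw = env["AGENT_NOTCH_BACKEND_URL"] ?? fallback
    return URL(string: raw) ?? URL(string: fallback)!
}
```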

The backend must expose:

Endpoint                  Method            Purpose
/api/notch/stream         GET (SSE)         Push channel for all real-time events (sessions, todos, voice)
/api/notch/send           POST              Submit a user-typed message to the chat backend
/api/notch/prefs          GET/POST          Read & persist user preferences (UI density, theme)
/api/notch/voice          POST (multipart)  Upload an audio blob, returns transcription
/api/notch/abort          POST              Cancel the in-flight LLM call
/api/local-sessions       GET               List live AI coding sessions (drives the sidebar)
/api/todos/<id>           GET/PATCH         Read / reassign a todo
/api/todos/<id>/complete  POST              Mark a todo done

Reference implementations live in jarvis-claudecode (router/dashboard) and the examples/ folder of agent-conductor.
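The /api/notch/stream endpoint uses standard SSE framing: `event:`/`data:` lines, with a blank line terminating each event. A minimal parser sketch assuming that framing (event names and payloads are defined by whichever backend you point the app at):

```swift
import Foundation

// Minimal sketch of standard SSE framing; real payloads are backend-defined.
struct SSEEvent: Equatable {
    var event: String = "message"   // SSE default event name
    var data: String = ""
}

/// Splits a raw SSE chunk into events (blank line = event boundary).
func parseSSE(_ chunk: String) -> [SSEEvent] {
    chunk.components(separatedBy: "\n\n").compactMap { block in
        var ev = SSEEvent()
        var sawData = false
        for line in block.split(separator: "\n") {
            if line.hasPrefix("event:") {
                ev.event = String(line.dropFirst(6)).trimmingCharacters(in: .whitespaces)
            } else if line.hasPrefix("data:") {
                ev.data += String(line.dropFirst(5)).trimmingCharacters(in: .whitespaces)
                sawData = true
            }
        }
        return sawData ? ev : nil
    }
}
```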

Roadmap

v0.2 — Subprocess backend (planned)

  • OrchestratorBackend Swift class that spawns agent-conductor watch --format json-lines and pipes stdout into the event bus
  • NotchBackend protocol so users can pick HTTP vs subprocess in the Preferences pane
  • Drop the AGENT_NOTCH_BACKEND_URL env var as the only config — add a ~/Library/Application Support/agent-notch/config.json
  • Pre-built signed .app bundle via GitHub Releases
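The HTTP-vs-subprocess split might look like this sketch — only the NotchBackend name is on the roadmap; the method signatures and the stub are guesses:

```swift
// Speculative shape of the planned NotchBackend protocol.
// Method names are assumptions, not the shipped API.
protocol NotchBackend {
    /// Starts streaming; each raw event line is handed to the callback.
    func start(onEvent: @escaping (String) -> Void)
    func stop()
}

// The HTTP variant would wrap the existing SSE client; the subprocess
// variant would spawn `agent-conductor watch --format json-lines` and
// forward its stdout. A trivial stub stands in for either here.
struct SubprocessBackendStub: NotchBackend {
    func start(onEvent: @escaping (String) -> Void) {
        onEvent("{\"status\": \"working\"}")  // stand-in for a real stdout line
    }
    func stop() {}
}
```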

v0.3 — Multi-provider chat surface

  • ChatProvider protocol: ClaudeChatProvider, OpenAIChatProvider
  • Subscription-based auth (Claude.ai web session, ChatGPT web session) — no per-token API keys required for users with a subscription
  • Per-provider chat tab in the expanded notch view
  • Tool exposure: agent-conductor primitives (snapshot, inject, todos) become callable tools that any chat provider can invoke through function calling / MCP

v0.4 — Polish

  • Homebrew Cask: brew install --cask agent-notch
  • Auto-update via Sparkle
  • Localisation (it, fr, de, ja, zh)
  • Accessibility: VoiceOver labels for every HUD element

Maybe-someday

  • iOS companion (notch on iPhone shows the same view via Bonjour)
  • System extensions for non-notch Macs (M1 / Intel / external displays) → menu-bar window equivalent
  • Plug-in API: third-party views render as widgets inside the expanded notch

Architecture

┌─────────────────────────────────────────────────┐
│  agent-notch.app  (this repo)                   │
│                                                 │
│   ┌──────────────────┐    ┌──────────────────┐  │
│   │  NotchController │ ←→ │   NotchEventBus  │  │
│   └────────┬─────────┘    └────────┬─────────┘  │
│            │                       │            │
│            ▼                       ▼            │
│   ┌──────────────────┐    ┌──────────────────┐  │
│   │ DynamicNotchKit  │    │  HTTP / SSE      │  │
│   │   panels +       │    │  client          │  │
│   │   SwiftUI views  │    │ (Wave 2:         │  │
│   └──────────────────┘    │  subprocess too) │  │
│                           └──────────────────┘  │
└─────────────────────────────────────────────────┘
            │                       ▲
            │                       │
            ▼                       │
  ┌─────────────────┐   HTTP / SSE  │
  │  Your backend   │ ──────────────┘
  │                 │
  │  jarvis-router  │  ← default consumer today
  │  topics-app     │  ← future consumer
  │  agent-conductor│  ← future, via subprocess (v0.2)
  └─────────────────┘

Development

git clone https://github.com/zorahrel/agent-notch.git
cd agent-notch
swift build
swift test
swift run agent-notch              # foreground run with logs

The notch will appear as soon as you hover the top centre of your screen. Quit via the menu-bar host or pkill agent-notch.

Privacy & data

  • No telemetry. No analytics. No network calls except to the backend URL you configure.
  • Voice transcription is on-device (Apple's SFSpeechRecognizer framework).
  • Audio data is only uploaded to your backend if you explicitly trigger a voice command via hover-record.
  • No PII in any committed fixture or test.

Contributing

See CONTRIBUTING.md. Bug reports and feature requests via GitHub issues.

License

MIT
