# Algopeeps Council

A real-time AI coding assistant that watches your Neovim buffer and provides live feedback from multiple specialized agents. Think of it as pair programming with a council of AI experts reviewing your code as you type.
## Architecture

Algopeeps Council consists of three components working together:
- OpenCode Server - Manages AI agent sessions and handles prompts
- TUI Dashboard - Beautiful terminal UI showing agent feedback in real-time
- Neovim Plugin - Captures buffer events and sends them to the dashboard
```
┌─────────────────┐          ┌──────────────────┐         ┌─────────────────┐
│                 │   TCP    │                  │   SSE   │                 │
│  Neovim Plugin  │─────────▶│  TUI Dashboard   │◀────────│    OpenCode     │
│  (Lua client)   │  :9999   │  (Bubble Tea)    │         │     Server      │
│                 │          │                  │         │   (AI Agents)   │
└─────────────────┘          └──────────────────┘         └─────────────────┘
        │                            │                            │
     Watches                     Displays                     Analyzes
     buffer                      feedback                       code
```
Data Flow:
- Neovim plugin detects buffer changes (debounced 5s)
- Sends buffer content + metadata via TCP to TUI
- TUI forwards to OpenCode server
- OpenCode runs prompts through configured agents
- Agent responses stream back via SSE
- TUI displays feedback in agent cards
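The flow above can be sketched end to end in Go. This is a minimal illustration, not the project's actual wire code: it assumes newline-delimited JSON framing between the plugin and the TUI (an assumption; the real framing lives in `internal/protocol/`), and it stands up a throwaway loopback listener in place of the dashboard.

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"net"
)

// BufferEvent mirrors the JSON the plugin sends (see the protocol
// section below); field shapes here are illustrative.
type BufferEvent struct {
	Event  string `json:"event"`
	Buffer struct {
		Name     string `json:"name"`
		Filetype string `json:"filetype"`
		Content  string `json:"content"`
	} `json:"buffer"`
}

// roundtrip sends ev as one newline-delimited JSON message to a
// throwaway loopback listener and returns what the listener decoded.
func roundtrip(ev BufferEvent) (BufferEvent, error) {
	ln, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		return BufferEvent{}, err
	}
	defer ln.Close()

	done := make(chan BufferEvent, 1)
	go func() {
		conn, err := ln.Accept()
		if err != nil {
			return
		}
		defer conn.Close()
		line, _ := bufio.NewReader(conn).ReadBytes('\n')
		var got BufferEvent
		json.Unmarshal(line, &got)
		done <- got
	}()

	conn, err := net.Dial("tcp", ln.Addr().String())
	if err != nil {
		return BufferEvent{}, err
	}
	payload, _ := json.Marshal(ev)
	conn.Write(append(payload, '\n'))
	conn.Close()
	return <-done, nil
}

func main() {
	ev := BufferEvent{Event: "buffer_changed"}
	ev.Buffer.Name = "/tmp/example.go"
	ev.Buffer.Filetype = "go"
	ev.Buffer.Content = "package main\n"
	got, err := roundtrip(ev)
	if err != nil {
		panic(err)
	}
	fmt.Println(got.Event, got.Buffer.Filetype) // buffer_changed go
}
```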
## Prerequisites

- Go 1.24.4+
- Neovim 0.9+
- OpenCode CLI installed and configured
- Anthropic API key (for Claude models)
## Installation

```sh
# Clone the repository
git clone https://github.com/abhirupda/algopeeps.git
cd algopeeps

# Build the binary
make build

# Or use go directly
go build -o algopeeps ./cmd/algopeeps
```

## Configuration

Create `opencode.json` in the project root (or copy the provided example):
```json
{
  "provider": "anthropic",
  "model": "claude-sonnet-4-20250514",
  "agents": {
    "code-reviewer": {
      "name": "code-reviewer",
      "description": "Reviews code quality, style, and best practices",
      "prompt": "You are a code review assistant..."
    },
    "bug-spotter": {
      "name": "bug-spotter",
      "description": "Identifies potential bugs and edge cases",
      "prompt": "You are a bug detection assistant..."
    }
  }
}
```

Start the OpenCode server with this config:

```sh
opencode serve --config opencode.json
```

## Neovim Setup

Add to your Neovim config (lazy.nvim example):
```lua
{
  dir = "~/path/to/algopeeps/nvim",
  name = "algopeeps",
  config = function()
    require("algopeeps").setup({
      host = "127.0.0.1",
      port = 9999,
      debounce_ms = 5000, -- Wait 5s after typing before sending an update
    })
  end,
}
```

For other plugin managers, ensure the `nvim/` directory is on your runtimepath.
## Running

Terminal 1 - OpenCode Server:

```sh
opencode serve --config opencode.json
```

Terminal 2 - Algopeeps TUI:

```sh
./algopeeps
# Or: make run
```

Terminal 3 - Neovim:

```sh
nvim somefile.go
```

Inside Neovim:

```vim
:AlgopeepsConnect
```

## Usage

- Open a file in Neovim and run `:AlgopeepsConnect`
- Edit code, move the cursor, or save files
- After 5 seconds of inactivity, buffer content is sent to TUI
- TUI forwards to OpenCode agents
- Agents analyze code and stream responses
- Feedback appears in the dashboard cards:
  - Code Reviewer - Quality, style, readability
  - Bug Spotter - Potential bugs, edge cases, errors
## Commands

- `:AlgopeepsConnect` - Connect to the TUI server
- `:AlgopeepsDisconnect` - Disconnect and stop sending updates
- `q` or `Ctrl+C` - Quit the dashboard
## Agent Configuration

```jsonc
{
  "provider": "anthropic",        // AI provider (anthropic, openai, etc.)
  "model": "claude-sonnet-4-...", // Model to use
  "agents": {
    "agent-id": {
      "name": "agent-id",         // Agent identifier
      "description": "...",       // What the agent does
      "prompt": "..."             // System prompt for the agent
    }
  }
}
```

Adding Custom Agents:

- Add a new entry to the `agents` object
- Update `internal/tui/app.go` to handle the agent ID
- Add a new card in the `View()` function
- Restart the OpenCode server and TUI
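The second step above boils down to mapping the new agent ID onto a dashboard card. The following is only a hypothetical sketch of that mapping: `cardTitle` and the `perf-tuner` agent are invented for illustration, and the real code in `internal/tui/app.go` is organized differently.

```go
package main

import "fmt"

// cardTitle is a hypothetical helper illustrating the kind of agent-ID
// routing internal/tui/app.go needs when a new agent is added. Keep the
// case labels in sync with the keys of "agents" in opencode.json.
func cardTitle(agentID string) string {
	switch agentID {
	case "code-reviewer":
		return "Code Reviewer"
	case "bug-spotter":
		return "Bug Spotter"
	case "perf-tuner": // example of a newly added custom agent
		return "Perf Tuner"
	default:
		return "Unknown Agent"
	}
}

func main() {
	fmt.Println(cardTitle("bug-spotter")) // Bug Spotter
	fmt.Println(cardTitle("mystery"))     // Unknown Agent
}
```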
## Neovim Plugin Options

```lua
require("algopeeps").setup({
  host = "127.0.0.1",   -- TUI server host
  port = 9999,          -- TUI server port
  debounce_ms = 5000,   -- Debounce delay (milliseconds)
})
```

Debounce Behavior:

- `TextChanged`, `TextChangedI`, `CursorMoved` → Debounced (5s default)
- `BufWritePost`, `BufEnter`, `BufLeave` → Immediate
## Protocol

Buffer events are sent as JSON over TCP:

```json
{
  "event": "buffer_changed",
  "buffer": {
    "name": "/path/to/file.go",
    "filetype": "go",
    "content": "package main\n...",
    "cursor": { "line": 42, "col": 10 },
    "line_count": 100
  }
}
```
## Troubleshooting

### OpenCode server issues

Check:
- Is `opencode serve` running?
- Is the config file valid JSON?
- Check OpenCode logs for errors
Fix:

```sh
# Test OpenCode is working
opencode --version

# Check if server is running
ps aux | grep "opencode serve"

# Restart with verbose logging
opencode serve --config opencode.json --verbose
```
### Neovim plugin won't connect

Check:
- Did you run `:AlgopeepsConnect`?
- Is the TUI running on port 9999?
- Check for port conflicts
Fix:

```sh
# Check if port is in use
lsof -i :9999
```

```vim
" In Neovim
:messages            " Check for Lua errors
:AlgopeepsDisconnect
:AlgopeepsConnect
```
### Agents return no feedback

Check:
- Do the agent IDs in `opencode.json` match the IDs in the code?
- Check OpenCode server logs
- Verify the API key is set
Fix:

```sh
# Check agent IDs match
cat opencode.json | jq '.agents | keys'
# Should include: "code-reviewer", "bug-spotter"

# Test API key
export ANTHROPIC_API_KEY="your-key"
opencode serve --config opencode.json
```
### Feedback is slow or missing

Check:
- Is the debounce delay too long?
- Are you in insert mode? (Try exiting to normal mode)
- Does the TUI status bar show Neovim as connected?
Fix:

```lua
-- Reduce debounce in Neovim config
require("algopeeps").setup({
  debounce_ms = 2000, -- Reduce to 2 seconds
})
```

Tips:
- Increase debounce delay to reduce API calls
- Use smaller/cheaper models (Claude Haiku instead of Sonnet)
- Set buffer size limits in `internal/tui/app.go` (currently 100KB with 50-line context)
## Project Structure

```
algopeeps/
├── cmd/algopeeps/          # Main entry point
│   └── main.go             # Starts TUI and TCP server
├── internal/
│   ├── config/             # Configuration management
│   ├── opencode/           # OpenCode SDK client
│   ├── protocol/           # TCP protocol types
│   ├── server/             # TCP server for Neovim
│   └── tui/                # Bubble Tea TUI
│       ├── app.go          # Main TUI model
│       ├── components/     # UI components (cards, status bar)
│       ├── messages.go     # Bubble Tea messages
│       └── styles.go       # Lipgloss styles
├── nvim/                   # Neovim plugin
│   └── lua/algopeeps/
│       ├── init.lua        # Plugin entry point
│       ├── client.lua      # TCP client
│       └── debounce.lua    # Debounce logic
├── opencode.json           # OpenCode agent configuration
├── go.mod                  # Go dependencies
├── Makefile                # Build commands
└── README.md               # This file
```
## Roadmap

- Agent session persistence (survive restarts)
- Multiple file context (send related files, not just current buffer)
- Interactive agent commands (ask questions, request refactors)
- Custom agent triggers (e.g., only run on save, not on every change)
- Agent response history/timeline
- Configurable UI themes
- Agent performance metrics (response time, token usage)
- Code diff mode (show what changed, agents comment on diffs)
- Integration with LSP (send diagnostics to agents)
- Multi-cursor support
- Project-level context (send project structure, dependencies)
- Agent voting/consensus (agents agree/disagree on feedback)
- Voice output (TTS for agent feedback)
- Web dashboard (view feedback in browser)
## Contributing

Contributions welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
## License

MIT License - see the LICENSE file for details.
## Acknowledgments

- Built with the Bubble Tea TUI framework
- Powered by OpenCode SDK
- Inspired by the need for real-time AI pair programming