A terminal-based dashboard for monitoring the status of code assistant services in real-time.
- ✅ Real-time status monitoring of code assistant services
- ✅ Automatic refresh every 10 minutes (see the status-check sketch below)
- ✅ Interactive keyboard navigation
- ✅ One-click access to status pages
- ✅ Beautiful terminal UI with emoji status indicators
- ✅ Self-contained executable (no Python installation needed)
- GitHub Copilot
- Cursor
- Claude Code
- Gemini Code Assist (GCP)
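Each service is checked by polling its public status page on the refresh interval noted above. As a rough illustration, a single check could look like the sketch below, assuming the service exposes a Statuspage-style JSON endpoint; the endpoints and parsing actually used in `status.py` may differ.

```python
# Illustrative only: fetch one service's status from a Statuspage-style endpoint.
# The real endpoints and parsing logic live in status.py and may differ.
import requests

def check_status(summary_url):
    """Return a coarse status string for a Statuspage-style status API."""
    resp = requests.get(summary_url, timeout=10)
    resp.raise_for_status()
    # Statuspage payloads carry an overall indicator: none / minor / major / critical.
    indicator = resp.json().get("status", {}).get("indicator", "unknown")
    return {
        "none": "operational",
        "minor": "degraded",
        "major": "outage",
        "critical": "outage",
    }.get(indicator, "unknown")

# Example: GitHub publishes a Statuspage endpoint at /api/v2/status.json.
print(check_status("https://www.githubstatus.com/api/v2/status.json"))
```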
```bash
brew tap jbarson/coding-llm-monitor
brew install coding-llm-monitor
```

Then run:

```bash
coding-llm-monitor
```

- Download the `coding-llm-monitor` executable from the latest release
- Make it executable: `chmod +x coding-llm-monitor-macos` (or `coding-llm-monitor-linux`)
- Run: `./coding-llm-monitor-macos` (macOS) or `./coding-llm-monitor-linux` (Linux)
```bash
# 1. Create virtual environment
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# 2. Install dependencies
pip install -r requirements.txt

# 3. Run the application
python status.py
```

- ↑/↓ Arrow Keys: Navigate between services
- Enter: Open selected service's status page in browser
- Q: Quit the application
- Ctrl+C: Force quit (if needed)
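These bindings amount to a small terminal event loop. The sketch below shows one way they could be wired up with Python's built-in `curses` module; this is an assumption for illustration, not necessarily the TUI library `status.py` actually uses, and the service list here is a placeholder.

```python
# Illustrative key handling with curses; status.py may use a different TUI library.
import curses
import webbrowser

SERVICES = [  # (name, status page URL) - placeholder entries for the example
    ("GitHub Copilot", "https://www.githubstatus.com/"),
    ("Claude Code", "https://status.anthropic.com/"),
]

def run(stdscr):
    selected = 0
    while True:
        stdscr.erase()
        for i, (name, _url) in enumerate(SERVICES):
            marker = "> " if i == selected else "  "
            stdscr.addstr(i, 0, marker + name)
        stdscr.refresh()

        key = stdscr.getch()
        if key == curses.KEY_UP:
            selected = (selected - 1) % len(SERVICES)
        elif key == curses.KEY_DOWN:
            selected = (selected + 1) % len(SERVICES)
        elif key in (curses.KEY_ENTER, 10, 13):   # Enter: open the status page
            webbrowser.open(SERVICES[selected][1])
        elif key in (ord("q"), ord("Q")):         # Q: quit
            break

if __name__ == "__main__":
    curses.wrapper(run)
```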
To create a standalone executable for distribution:
```bash
# macOS/Linux
./build.sh

# Windows
build.bat
```

See DISTRIBUTION.md for detailed build instructions.
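The build scripts produce the self-contained executable from the Python source. Assuming the bundler is PyInstaller (an assumption, not confirmed by this README; check `build.sh` and `build.bat` for the actual tool and flags), a roughly equivalent manual invocation would be:

```bash
# Hypothetical manual build, assuming PyInstaller is the bundler behind build.sh/build.bat
pip install pyinstaller
pyinstaller --onefile --name coding-llm-monitor status.py
# PyInstaller writes the resulting executable to dist/
```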
- ✅ Green: Operational/Available
- ⚠️ Yellow: Degraded Performance
- ❌ Red: Major Outage/Error
- 🔧 Cyan: Maintenance
- ❓ Grey: Unknown
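Internally, these indicators boil down to a small lookup from a normalized status string to the symbol and color drawn in the terminal. Something along these lines, purely illustrative; the exact keys and styling are defined in `status.py`:

```python
# Illustrative mapping from a normalized status to the emoji/color shown in the dashboard.
# The actual keys and styling used by status.py may differ.
STATUS_STYLES = {
    "operational": ("✅", "green"),
    "degraded":    ("⚠️", "yellow"),
    "outage":      ("❌", "red"),
    "maintenance": ("🔧", "cyan"),
    "unknown":     ("❓", "grey"),
}

current_status = "degraded"  # example value
emoji, color = STATUS_STYLES.get(current_status, STATUS_STYLES["unknown"])
print(emoji, color)  # -> ⚠️ yellow
```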
This project includes a comprehensive test suite with 59 tests covering all major functionality.
```bash
# Install test dependencies
pip install -r requirements-test.txt

# Run all tests
pytest

# Run with coverage report
pytest --cov=status --cov-report=html
```

See TESTING.md for detailed testing documentation.
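If you add tests of your own, a common pattern for this kind of tool is to mock the HTTP call and assert on the parsed result. The example below is self-contained and hypothetical: `check_status` here is a stand-in sketch, not the actual API of `status.py`.

```python
# test_example.py - illustrative test; the helper below is a stand-in, not status.py's API.
from unittest import mock

def check_status(url):
    # Minimal stand-in for a status-fetching helper.
    import requests
    data = requests.get(url, timeout=10).json()
    return data.get("status", {}).get("indicator", "unknown")

def test_check_status_reports_minor_incident():
    fake_response = mock.Mock()
    fake_response.json.return_value = {"status": {"indicator": "minor"}}
    with mock.patch("requests.get", return_value=fake_response):
        assert check_status("https://example.com/api/v2/status.json") == "minor"
```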
- `QUICK_START.md` - Quick start guide for users
- `DISTRIBUTION.md` - Guide for building and distributing executables
- `TESTING.md` - Testing guide and documentation
- Python 3.8+ (if running from source)
- Internet connection (for status checks)
This project is licensed under the MIT License - see the LICENSE file for details.
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.