Spark is a powerful, multi-provider LLM interface for conversational AI with integrated tool support. It supports AWS Bedrock, Anthropic Direct API, and Ollama local models through both CLI and Web interfaces.
- Multi-Provider Support - AWS Bedrock, Anthropic Direct API, and Ollama local models
- Dual Interface - Rich CLI terminal UI and modern Web browser interface
- MCP Tool Integration - Connect external tools via Model Context Protocol
- Intelligent Context Management - Automatic conversation compaction with model-aware limits
- Security Features - Prompt inspection, tool permissions, and audit logging
- Multiple Database Backends - SQLite, MySQL, PostgreSQL, and Microsoft SQL Server
```shell
pip install dtSpark
```

Run the interactive setup wizard to configure Spark:
```shell
spark --setup
```

This guides you through:
- LLM provider selection and configuration
- Database setup
- Interface preferences
- Security settings
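The wizard records these choices in `config.yaml`. A minimal sketch of what such a file might contain is below; the key names here are illustrative assumptions, not the actual schema, so consult the Configuration Reference for the real options:

```yaml
# Hypothetical config.yaml fragment — key names are illustrative only.
provider: bedrock          # one of: bedrock | anthropic | ollama
database:
  backend: sqlite          # one of: sqlite | mysql | postgresql | mssql
  path: spark.db
interface: cli             # cli | web
```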
```shell
# Start with CLI interface
spark

# Or use the alternative command
dtSpark
```

Comprehensive documentation is available in the docs folder:
- Installation Guide - Detailed installation instructions
- Configuration Reference - Complete config.yaml documentation
- Features Guide - Detailed feature documentation
- CLI Reference - Command-line options and chat commands
- Web Interface - Web UI guide
- MCP Integration - Tool integration documentation
- Security - Security features and best practices
```mermaid
graph LR
    subgraph Interfaces
        CLI[CLI]
        WEB[Web]
    end
    subgraph Core
        CM[Conversation<br/>Manager]
    end
    subgraph Providers
        BEDROCK[AWS Bedrock]
        ANTHROPIC[Anthropic]
        OLLAMA[Ollama]
    end
    subgraph Tools
        MCP[MCP Servers]
        BUILTIN[Built-in Tools]
    end
    CLI --> CM
    WEB --> CM
    CM --> BEDROCK
    CM --> ANTHROPIC
    CM --> OLLAMA
    CM --> MCP
    CM --> BUILTIN
```
- Python 3.10 or higher
- AWS credentials (for Bedrock)
- Anthropic API key (for direct API)
- Ollama server (for local models)
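A quick pre-flight check for these prerequisites might look like the following. The environment variable names are the providers' standard ones, and port 11434 is Ollama's default; Spark's own requirements are covered in the Installation Guide:

```shell
# Confirm Python is 3.10 or higher
python3 -c 'import sys; print("Python OK" if sys.version_info >= (3, 10) else "Need Python 3.10+")'

# Provider credentials are read from the usual locations (assumption):
export ANTHROPIC_API_KEY="sk-..."   # Anthropic Direct API
export AWS_PROFILE="default"        # AWS Bedrock (or AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY)

# For local models, check that an Ollama server is reachable on its default port
curl -sf http://localhost:11434/api/tags >/dev/null && echo "Ollama reachable" || echo "Ollama not running"
```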
MIT Licence - see LICENSE for details.
Matthew Westwood-Hill matthew@digital-thought.org
- Documentation: docs/
- Issues: GitHub Issues