AIDEN (AI Documentation Engine for AcreetionOS) is a web-based AI assistant that answers questions about AcreetionOS using Retrieval-Augmented Generation (RAG). It provides an intuitive chat interface for users to query documentation with source citations.
- Chat Interface: Natural language Q&A about AcreetionOS
- Local RAG Pipeline: No external API calls required for AI responses
- Qdrant Vector Database: Fast similarity search over documentation
- Ollama Integration: Local LLM inference with privacy-preserving design
- Automatic Indexing: Background documentation processing
- Source Citations: Every response links to relevant documentation
- Modern Web UI: Dark theme, responsive design, real-time status
- WebSocket Streaming: Smooth streaming responses
- Progress Tracking: Visual indexing progress indicators
```
┌─────────────────────────────────────────────────────────────────────┐
│                        Web Client (Browser)                         │
│                       src/frontend/index.html                       │
│  - Chat interface, status indicators                                │
│  - WebSocket for streaming responses                                │
└────────────────────────────────┬────────────────────────────────────┘
                                 │ HTTP/WebSocket
                                 ▼
┌─────────────────────────────────────────────────────────────────────┐
│                     Axum Web Server (Port 8081)                     │
│                             Rust + Tokio                            │
│  ┌───────────────────────────────────────────────────────────────┐  │
│  │                       AppState (RwLock)                       │  │
│  │  - Config (ollama, search, indexing settings)                 │  │
│  │  - Messages (conversation history)                            │  │
│  │  - IndexingStatus (background job progress)                   │  │
│  └───────────────────────────────────────────────────────────────┘  │
└────────────────────────────────┬────────────────────────────────────┘
                                 │
         ┌───────────────────────┼───────────────────────┐
         ▼                       ▼                       ▼
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│  OllamaService  │     │  QdrantService  │     │   RAGService    │
│                 │     │                 │     │                 │
│ - chat()        │     │ - init_         │     │ - query()       │
│ - embeddings()  │     │   collection()  │     │                 │
│ - health_check()│     │ - search()      │     │                 │
│                 │     │ - upsert()      │     │                 │
└────────┬────────┘     └────────┬────────┘     └────────┬────────┘
         │                       │                       │
         ▼                       ▼                       ▼
┌─────────────────┐     ┌─────────────────────────────────────────────┐
│   Ollama API    │     │                   Qdrant                    │
│  (LLM + Embed)  │     │             (Vector Database)               │
│   Port 11434    │     │                  Port 6334                  │
└─────────────────┘     └─────────────────────────────────────────────┘
```
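The query path through these services can be sketched end to end. The following is a runnable toy, not AIDEN's actual code: Ollama's embedding call is replaced by a word-count vector and Qdrant's similarity search by an in-memory cosine scan, with the default `search.threshold` (0.7) and `search.max_results` (5) from the configuration:

```rust
// Toy embedding standing in for OllamaService::embeddings():
// counts occurrences of each vocabulary word in the text.
fn embed(text: &str, vocab: &[&str]) -> Vec<f32> {
    vocab
        .iter()
        .map(|&w| {
            text.split_whitespace()
                .filter(|&t| t.eq_ignore_ascii_case(w))
                .count() as f32
        })
        .collect()
}

// Cosine similarity, standing in for QdrantService::search() scoring.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Sketch of RAGService::query(): embed the question, rank chunks by
/// similarity, keep those above `threshold`, and assemble the LLM prompt.
fn rag_prompt(
    question: &str,
    docs: &[&str],
    vocab: &[&str],
    threshold: f32,
    max_results: usize,
) -> String {
    let q = embed(question, vocab);
    let mut scored: Vec<(f32, &str)> = docs
        .iter()
        .map(|&d| (cosine(&q, &embed(d, vocab)), d))
        .filter(|(s, _)| *s >= threshold)
        .collect();
    scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap());
    let context: Vec<&str> = scored.iter().take(max_results).map(|(_, d)| *d).collect();
    format!("Context:\n{}\n\nQuestion: {}", context.join("\n"), question)
}

fn main() {
    let vocab = ["install", "ollama", "qdrant", "update"];
    let docs = [
        "to install ollama run the install script",
        "qdrant stores the document vectors",
    ];
    // Only the relevant chunk clears the 0.7 threshold and enters the prompt.
    let prompt = rag_prompt("how do I install ollama", &docs, &vocab, 0.7, 5);
    assert!(prompt.contains("install ollama run"));
    println!("{}", prompt);
}
```

In the real pipeline the final prompt is sent to `OllamaService::chat()`, and the retrieved chunks become the source citations shown in the UI.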
- Ollama - for local LLM inference:

  ```sh
  curl -fsSL https://ollama.com/install.sh | sh
  ```

- Qdrant - for the vector database:

  ```sh
  docker pull qdrant/qdrant
  docker run -p 6333:6333 -p 6334:6334 -v qdrant_data:/qdrant/storage qdrant/qdrant
  ```

- Rust - for building:

  ```sh
  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  ```
```sh
# Clone the repository
git clone https://gitlab.acreetionos.org/natalie/aiden.git
cd aiden

# Install Ollama models
ollama pull llama3.2:1b
ollama pull nomic-embed-text

# Build
cargo build --release

# Run
./target/release/aiden
```

Open http://localhost:8081 in your browser.
Configuration is defined in src/state.rs. Key options:
| Option | Default | Description |
|---|---|---|
| `ollama.host` | `"localhost:11434"` | Ollama API server |
| `ollama.chat_model` | `"llama3.2:1b"` | Chat LLM model |
| `ollama.embed_model` | `"nomic-embed-text"` | Embedding model |
| `search.threshold` | `0.7` | Minimum similarity score |
| `search.max_results` | `5` | Max documents to retrieve |
| `indexing.docs_path` | `"./docs"` | Documentation directory |
| `indexing.chunk_size` | `512` | Words per chunk |
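Note that `indexing.chunk_size` counts words, not tokens or bytes. A minimal sketch of that word-based chunking (illustrative only; AIDEN's actual indexer may differ, e.g. by adding overlap between chunks):

```rust
/// Split text into chunks of at most `chunk_size` whitespace-separated words.
/// Illustrates the semantics of `indexing.chunk_size`; not AIDEN's own code.
fn chunk_by_words(text: &str, chunk_size: usize) -> Vec<String> {
    let words: Vec<&str> = text.split_whitespace().collect();
    words.chunks(chunk_size).map(|c| c.join(" ")).collect()
}

fn main() {
    // With the default chunk_size of 512, a 1,100-word document would yield
    // three chunks (512 + 512 + 76 words). Shown here with a tiny size.
    let chunks = chunk_by_words("one two three four five six seven", 3);
    assert_eq!(chunks, vec!["one two three", "four five six", "seven"]);
}
```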
| Endpoint | Method | Description |
|---|---|---|
| `/` | GET | Web interface |
| `/api/chat` | POST | Non-streaming chat |
| `/api/chat/stream` | POST (WS) | Streaming chat |
| `/api/index` | POST | Trigger indexing |
| `/api/index/status` | GET | Indexing status |
| `/api/health` | GET | Health check |
See DEPLOYMENT.md for detailed deployment instructions including:
- Docker deployment
- Systemd service setup (start on boot)
- Reverse proxy configuration
- Production optimizations
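As a rough orientation for the systemd step, a unit file could look like the sketch below. The service name, user, and paths are assumptions here; DEPLOYMENT.md remains the authoritative reference:

```ini
# /etc/systemd/system/aiden.service — illustrative sketch only
[Unit]
Description=AIDEN documentation assistant
After=network-online.target

[Service]
# Assumed install location and user; adjust to your deployment.
User=aiden
WorkingDirectory=/opt/aiden
ExecStart=/opt/aiden/target/release/aiden
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After installing the unit, `systemctl enable --now aiden.service` starts it and enables it at boot.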
- Architecture - System design and data flow
- API Reference - REST API documentation
- Configuration - All configuration options
- Troubleshooting - Common issues and solutions
See CONTRIBUTING.md for contribution guidelines.
See CHANGELOG.md for version history.
See SECURITY.md for security policy.
MIT License. See LICENSE.
- Natalie Spiva - natalie@acreetionos.org
- GitLab: https://gitlab.acreetionos.org/natalie/aiden
- GitHub: https://github.com/AcreetionOS-Code/aiden
- Live Instance: https://aiden.acreetionos.org