A desktop AI assistant with persistent memory and a floating Clippy-style interface.
- Floating Assistant UI - Draggable chat window with modern design
- Persistent Memory - Remembers user identity, preferences, and conversation context
- Semantic Memory Retrieval - Embedding-based search understands meaning, not just keywords
- Privacy Controls - User consent system for memory storage (full/limited/session-only)
- Memory Viewer - See, filter, and delete stored memories
- Intent Routing - Extensible tool system for time, system info, and more
- LLM Integration - Google Gemini API support
- Control Center - Qt-based GUI for configuration and management
sentri/
├── brain/ # Memory and logic core
│ ├── controller.py # Main orchestrator
│ ├── memory_store.py # Persistent storage
│ └── memory_retriever.py # Smart retrieval
├── clippy/ # Floating window UI
├── sentri_ui/ # Control center GUI
├── llm_adapters/ # LLM provider integrations
├── tools/ # Extensible tool system
└── sentri_core/ # Intent routing and registry
- Python 3.10+
- Linux (tested on Ubuntu/Debian)
- Clone the repository:

  ```bash
  git clone <repo-url>
  cd Sentri
  ```

- Create a virtual environment:

  ```bash
  python3 -m venv venv
  source venv/bin/activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

  Or manually:

  ```bash
  pip install PySide6 google-genai python-dotenv sentence-transformers chromadb cryptography
  ```

- Configure the API key:

  ```bash
  echo "GEMINI_API_KEY=your_api_key_here" > .env
  ```

Launch Sentri:

```bash
python main.py
```

From the control center:
- Go to LLM Settings tab
- Enter your Gemini API key
- Click Save Configuration
- Go to Clippy tab
- Click Launch Assistant
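Sentri reads the key from `.env` via python-dotenv (already in the dependency list). As a rough stdlib-only sketch of the lookup idea — the helper name and fallback parse below are hypothetical, not the project's actual code:

```python
import os

def get_gemini_key(env_path: str = ".env") -> str:
    """Return GEMINI_API_KEY from the environment, falling back to a
    minimal .env parse. python-dotenv handles quoting and edge cases
    properly; this is only an illustration of the lookup order."""
    key = os.environ.get("GEMINI_API_KEY")
    if key:
        return key
    if os.path.exists(env_path):
        with open(env_path) as f:
            for line in f:
                line = line.strip()
                if line.startswith("GEMINI_API_KEY="):
                    return line.split("=", 1)[1]
    raise RuntimeError("GEMINI_API_KEY not configured")
```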
Sentri automatically stores:
- Identity - Name, age, personal info
- Preferences - User likes/dislikes
- Relationships - People and connections
- Skills - What you know/can do
- Goals - Long-term objectives
- Habits - Patterns and routines
- Facts - General knowledge about the user
- Tasks - Reminders and to-dos
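A stored memory in each of these categories can be pictured as a small record. The shape below is illustrative only — the real schema lives in `brain/memory_store.py`:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MemoryRecord:
    """Hypothetical record shape; field names are assumptions."""
    category: str      # e.g. "identity", "preference", "goal", "task"
    content: str       # e.g. "User enjoys hiking"
    confidence: float  # 0.0-1.0, how sure the extractor was
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

note = MemoryRecord("preference", "User enjoys hiking", 0.9)
```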
Semantic Search: Uses sentence-transformers (runs locally) to understand meaning:
- "What do I like?" finds "I enjoy hiking" and "My favorite is pizza"
- No exact keyword matching required
- Retrieval is scored by a weighted blend: semantic similarity (60%), confidence (20%), recency (20%)
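The 60/20/20 blend can be sketched as follows. The weights come from the list above; the exponential decay and 30-day half-life used to turn age into a recency score are illustrative assumptions, not the project's actual formula:

```python
import math

def retrieval_score(similarity: float, confidence: float, age_days: float,
                    half_life_days: float = 30.0) -> float:
    # Recency decays exponentially: a memory half_life_days old scores 0.5.
    recency = math.exp(-math.log(2) * age_days / half_life_days)
    return 0.6 * similarity + 0.2 * confidence + 0.2 * recency
```

Under these assumptions a perfect match stored just now scores 1.0, and the same match a month old scores 0.9, so fresher memories win ties.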
Privacy Controls: First-run consent dialog with three levels:
- Full - Remember everything
- Limited - Only preferences, tasks, goals
- Session Only - No persistence
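The three levels amount to a gate in front of the memory store. A hypothetical sketch — the real check lives in the brain layer, and the lowercase category names below are an assumption:

```python
ALLOWED = {
    "full": None,                               # None -> persist everything
    "limited": {"preference", "task", "goal"},  # only these categories
    "session": set(),                           # nothing written to disk
}

def should_persist(category: str, consent_level: str) -> bool:
    allowed = ALLOWED[consent_level]
    return allowed is None or category in allowed
```

For example, `should_persist("identity", "limited")` is `False`, so under the Limited level identity details stay session-only.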
Memory Viewer: Access from Control Center to view, filter, and delete stored memories.
Create a new tool in tools/:

```python
from tools.base_tool import BaseTool

class MyTool(BaseTool):
    name = "mytool"

    def execute(self, user_input: str) -> dict:
        return {"result": "tool output"}
```

Register it in sentri_core/intent_router.py:

```python
if "keyword" in text:
    return "mytool"
```

Config is stored in data/config.json:
```json
{
  "provider": "gemini",
  "api_key": "your_key"
}
```

Working:
- ✅ Memory storage and retrieval
- ✅ Semantic search with embeddings
- ✅ Privacy consent system
- ✅ Memory viewer UI
- ✅ Gemini LLM integration
- ✅ Floating window UI
- ✅ Intent routing
- ✅ Tool system foundation
In Progress:
- 🚧 Chat history UI
- 🚧 Settings panel
- 🚧 Animations
- 🚧 Local LLM support
- API key stored in plain text (use .env for now)
- No error handling for network failures
- Some UI components incomplete
If you're upgrading from an older version, see UPGRADE_GUIDE.md for migration instructions.
- Semantic search with embeddings
- Memory management UI
- Privacy consent system
- Secure credential storage
- Multi-LLM support (OpenAI, local models)
- Export/import conversations
- Plugin system
- Voice input/output
- Memory decay and consolidation
[Add your license here]
[Add contribution guidelines]