An intelligent conversational learning companion that integrates with your Obsidian knowledge base to provide contextual responses and automatically generate session summaries.
The assistant now uses a tool-based workflow, where the LLM suggests tool calls and the Python backend executes them. This enables:
- Modular tool functions for context search, session saving, and more
- Clear separation between LLM reasoning and backend actions
- Easier extensibility for new tools and features
How it works:
- The LLM returns a tool call (e.g., `chat_with_context_tool` or `save_session_tool`)
- The backend inspects the tool call, executes the corresponding Python function, and returns the results
- The LLM then uses the tool results to generate a final conversational response
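A minimal sketch of that dispatch step, assuming a registry of tool functions keyed by name (the tool names come from this README, but the function signatures and the `dispatch` helper are illustrative):

```python
# Sketch of the backend's tool-dispatch loop. The tool names match this
# README; their bodies and signatures here are placeholder assumptions.

def chat_with_context_tool(query: str) -> str:
    # Placeholder: the real tool would search the vector store for context.
    return f"context for: {query}"

def save_session_tool(summary: str) -> str:
    # Placeholder: the real tool would write a summary note to the vault.
    return "saved"

TOOLS = {
    "chat_with_context_tool": chat_with_context_tool,
    "save_session_tool": save_session_tool,
}

def dispatch(tool_call: dict) -> str:
    """Look up the tool the LLM requested and execute it with its arguments."""
    func = TOOLS[tool_call["name"]]
    return func(**tool_call["input"])

result = dispatch({"name": "chat_with_context_tool", "input": {"query": "gradient descent"}})
print(result)  # context for: gradient descent
```

The returned string would then be fed back to the LLM as a tool result so it can compose the final reply.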
You can configure the assistant using the .env file and Python settings:
- `ANTHROPIC_API_KEY`: Claude API key
- `VOYAGE_API_KEY`: Voyage AI embeddings key
- `OBSIDIAN_VAULT_PATH`: Path to your Obsidian vault
- `LLM_MODEL`: Claude model (e.g., `claude-3-haiku-20240307`)
- `LLM_TEMPERATURE`: LLM response creativity (float, e.g., `0.2`)
- `EMBEDDING_MODEL`: Voyage model (e.g., `voyage-3-lite`)
- `TOP_K`: Number of relevant documents to retrieve (default: 3)
- `CHUNK_SIZE`: Document chunk size for indexing (default: 512)
- `CHUNK_OVERLAP`: Overlap between chunks (default: 50)
These can be set in .env or in the relevant Python config files.
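Reading these settings might look roughly like this (a sketch using `os.getenv` with the defaults listed above; the actual `core/config.py` may structure this differently):

```python
import os

# Sketch of configuration loading with the defaults from this README;
# the real core/config.py may differ in structure and naming.
LLM_MODEL = os.getenv("LLM_MODEL", "claude-3-haiku-20240307")
LLM_TEMPERATURE = float(os.getenv("LLM_TEMPERATURE", "0.2"))
TOP_K = int(os.getenv("TOP_K", "3"))
CHUNK_SIZE = int(os.getenv("CHUNK_SIZE", "512"))
CHUNK_OVERLAP = int(os.getenv("CHUNK_OVERLAP", "50"))
```

Numeric values arrive from the environment as strings, so the sketch converts them explicitly; defaults apply only when a variable is unset.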
The project is organized for clarity and modularity:
LearningAssistant/
├── main.py # Application entry point & tool orchestration
├── requirements.txt # Python dependencies
├── .env # Environment variables
├── services/
│ ├── llm_service.py # LLM integration & prompt management
│ ├── vector_store.py # ChromaDB & Voyage AI embeddings
│ ├── obsidian_service.py # Obsidian vault operations
├── agent/
│ ├── learning_agent.py # Agent logic
│ └── tools/
│ ├── chat.py # Context search tool
│ ├── storage.py # Session save tool
│ └── analysis.py # Analysis tool helper
├── core/
│ ├── config.py # Configuration management
│ ├── conversation.py # Conversation state/history
├── utils/
│ └── prompt_templates.py # LLM prompt templates
├── tests/
│ ├── test_integration_basic.py
│ └── test_integration_complete.py
└── chroma_db/ # Vector database storage
Session summaries are automatically saved to your Obsidian vault in a daily folder structure, with unique filenames and backlinks to referenced files.
File Structure:
<OBSIDIAN_DAILY_NOTES_FOLDER>/
July 25, 2025/
What_is_machine_143022.md
Session Summary Example:
---
created: 2025-07-11T14:30:22
type: learning_session_summary
daily_folder: July 11, 2025
tags: [learning, session, summary]
---
# Learning Session Summary
<session summary content>
## Referenced Files
- [[Some_Note]]
- [[Another_Note]]
Backlinks:
For every referenced file, a backlink is automatically added to the original note under a `## References` section:
## References
- [[Daily Notes/July 11, 2025/What_is_machine_143022]]
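The backlink step could be implemented roughly like this (a stdlib-only sketch; the helper name and exact formatting are assumptions, and the real `obsidian_service.py` may differ):

```python
from pathlib import Path
import tempfile

def add_backlink(note_path: Path, summary_link: str) -> None:
    """Append a backlink to a note, creating a '## References' section if missing.

    Hypothetical helper; the project's actual implementation may differ.
    """
    text = note_path.read_text(encoding="utf-8")
    if "## References" not in text:
        text += "\n## References\n"
    text += f"- [[{summary_link}]]\n"
    note_path.write_text(text, encoding="utf-8")

# Demo on a temporary note:
note = Path(tempfile.mkdtemp()) / "Some_Note.md"
note.write_text("# Some Note\n", encoding="utf-8")
add_backlink(note, "Daily Notes/July 11, 2025/What_is_machine_143022")
print(note.read_text(encoding="utf-8"))
```

Checking for an existing `## References` section keeps repeated sessions from stacking duplicate headings in the same note.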
File Naming:
- Session summaries are named as `<SessionName>_<HHMMSS>.md` for uniqueness and clarity.
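Generating such a filename is straightforward with `datetime` (a sketch; truncating the session name to its first three words is an assumption inferred from the `What_is_machine_143022.md` example above):

```python
import re
from datetime import datetime
from typing import Optional

def summary_filename(question: str, now: Optional[datetime] = None) -> str:
    """Build a unique summary filename like 'What_is_machine_143022.md'.

    Sketch only: keeping the first three words of the question is an
    assumption based on the example filename in this README.
    """
    now = now or datetime.now()
    words = re.sub(r"[^\w\s]", "", question).split()[:3]
    return "_".join(words) + "_" + now.strftime("%H%M%S") + ".md"

print(summary_filename("What is machine learning?", datetime(2025, 7, 11, 14, 30, 22)))
# What_is_machine_143022.md
```

The `%H%M%S` suffix makes collisions within a day unlikely without needing a global counter.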
This structure ensures easy navigation, traceability, and rich interlinking between your learning sessions and your existing notes.
- 🤖 Intelligent Conversations: Chat naturally about any topic with Claude AI
- 🔍 Knowledge Base Integration: Searches your Obsidian vault for relevant context
- 📝 Automatic Note Generation: Creates comprehensive session summaries
- 🔄 Seamless Integration: Saves learning sessions back to your Obsidian vault
- 💡 Context-Aware Responses: Maintains conversation flow and references your existing knowledge
- ⚡ Real-time Search: Vector-based search through your personal knowledge base
- Vector Store: ChromaDB with Voyage AI embeddings for semantic search
- LLM Service: Claude Sonnet 4 for advanced, intelligent responses
- Obsidian Integration: Direct file system integration with markdown generation
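Conceptually, the semantic search ranks chunk embeddings by cosine similarity to the query embedding. A stdlib-only sketch of that ranking step (in the actual project the embeddings come from Voyage AI and the search runs inside ChromaDB):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def top_k(query_vec, chunks, k=3):
    """Return the k chunk texts most similar to the query vector.

    'chunks' is a list of (text, embedding) pairs; this illustrates the
    ranking ChromaDB performs internally, not the project's actual code.
    """
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return [text for text, _vec in ranked[:k]]

chunks = [
    ("note on calculus", [1.0, 0.0]),
    ("note on cooking", [0.0, 1.0]),
    ("note on algebra", [0.9, 0.1]),
]
print(top_k([1.0, 0.1], chunks, k=2))  # ['note on algebra', 'note on calculus']
```

Real embeddings have hundreds of dimensions and are stored in an index rather than scanned linearly, but the similarity criterion is the same.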
The Learning Assistant now includes a modern, chat-like graphical user interface (GUI) built with PySide6. This GUI provides a seamless and interactive experience for managing conversations, viewing context, and saving notes.
- Chat Display: Messages are shown in a visually appealing, chat-style format with Markdown rendering (including code blocks).
- Message Alignment: User and assistant messages are aligned for clarity, with consistent font sizing and spacing.
- Scroll & Focus: The chat window automatically scrolls to the latest message and keeps the input field focused for fast interaction.
- Session State: The GUI maintains session state and displays confirmation messages when notes are saved.
- Backend Integration: The backend runs in a separate process for stability and responsiveness.
- Tool Calls: Tool calls and results are handled transparently, with only relevant conversational output shown to the user.
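Hiding tool traffic can be as simple as filtering the message list before rendering. A sketch of that filter (the message schema with `role`/`type` keys is an assumption, not the project's actual format):

```python
# Sketch of hiding tool calls/results from the chat view; the message
# schema here is an illustrative assumption.
def visible_messages(messages):
    """Keep only plain user/assistant text, dropping tool calls and tool results."""
    return [m for m in messages if m.get("type") == "text" and m["role"] in ("user", "assistant")]

history = [
    {"role": "user", "type": "text", "content": "What is backprop?"},
    {"role": "assistant", "type": "tool_call", "content": "chat_with_context_tool"},
    {"role": "tool", "type": "tool_result", "content": "...retrieved context..."},
    {"role": "assistant", "type": "text", "content": "Backprop computes gradients..."},
]
print([m["content"] for m in visible_messages(history)])
# ['What is backprop?', 'Backprop computes gradients...']
```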
- Start the GUI by running:
python gui.py
- Interact with the assistant in the chat window. All features available in the CLI are supported, plus:
- Rich Markdown rendering
- Automatic scroll-to-bottom
- Confirmation and introductory messages
- Note saving and session management
The GUI is designed to closely resemble modern messaging apps, making your learning experience more intuitive and enjoyable.
- Python 3.8+
- Obsidian vault with markdown files
- API keys for Anthropic (Claude) and Voyage AI
# Clone the repository
git clone <repository-url>
cd LearningAssistant
# Install dependencies
pip install -r requirements.txt

Create a .env file in the root directory:
ANTHROPIC_API_KEY=your_anthropic_key_here
VOYAGE_API_KEY=your_voyage_key_here
OBSIDIAN_VAULT_PATH=C:\path\to\your\obsidian\vault

# Build the vector index
python services/build_index.py

# Launch the GUI
python gui.py

- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
Ready to enhance your learning journey? Start exploring with the Learning Assistant today! 🚀