A local-first AI coding tutor that quizzes developers on their own code to ensure understanding before commits.
Socratic IDE uses a "Trinity Architecture" with three main components:
- THE BODY (Rust/Tauri) - Desktop application, file watching, state management
- THE MIND (Python/FastAPI) - LLM orchestration, quiz generation, chat explanations
- THE ENGINE (Ollama) - Local LLM inference for privacy and zero API costs
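THE MIND talks to THE ENGINE over Ollama's local HTTP API. A minimal sketch of how a quiz-generation request might be assembled and sent (the prompt wording and helper names are illustrative; `/api/generate`, the `stream` flag, and port 11434 are Ollama's defaults):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default generate endpoint

def build_quiz_request(source_code: str, model: str = "llama3") -> dict:
    """Assemble a non-streaming Ollama request asking for one quiz question."""
    prompt = (
        "You are a Socratic coding tutor. Ask one probing question "
        "about the following code, without revealing the answer:\n\n"
        f"{source_code}"
    )
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(source_code: str) -> str:
    """POST the request to a locally running Ollama server and return its reply."""
    payload = json.dumps(build_quiz_request(source_code)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because inference is local, the request never leaves the machine.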
- Code Quizzing - Generate quizzes about your codebase to test understanding
- Code Explanations - Chat with an AI about any file or function
- Syntax Highlighting - View code with language-aware highlighting
- Function Extraction - Automatically detect and list functions (Rust, Python, TypeScript, JavaScript)
- Local-First - All processing happens locally via Ollama
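Function extraction of the kind listed above can be approximated with per-language regexes; a simplified, illustrative sketch (the actual implementation may use real parsers, and these patterns only catch common definition forms):

```python
import re

# One pattern per language; capture group 1 is the function name.
PATTERNS = {
    "python": re.compile(r"^\s*(?:async\s+)?def\s+(\w+)\s*\(", re.MULTILINE),
    "javascript": re.compile(
        r"^\s*(?:export\s+)?(?:async\s+)?function\s+(\w+)\s*\(", re.MULTILINE
    ),
}

def extract_functions(source: str, language: str) -> list[str]:
    """Return the names of function definitions found in `source`."""
    pattern = PATTERNS.get(language)
    return pattern.findall(source) if pattern else []
```

A quiz generator can then pick a function name from this list and ask the LLM to quiz the developer on that specific function.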
# Clone the repository
git clone https://github.com/yourusername/socratic-ide.git
cd socratic-ide
# Install dependencies
npm run setup
# Pull an Ollama model (if you haven't already)
ollama pull llama3

# Terminal 1: Start Ollama
ollama serve
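Before launching the app, you can confirm Ollama is reachable; a small check against Ollama's `/api/tags` endpoint (11434 is Ollama's default port — adjust `base_url` if you changed it):

```python
import json
import urllib.request

def ollama_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama server answers on its /api/tags endpoint."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            json.load(resp)  # a valid JSON body means the server answered properly
            return True
    except (OSError, ValueError):
        return False

if __name__ == "__main__":
    print("Ollama is running" if ollama_running() else "Start Ollama first: ollama serve")
```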
# Terminal 2: Start the application
npm run dev:all

Or let Tauri manage everything:
# Ollama must be running first
npm run tauri dev

# Run all tests
npm run test # Frontend (Vitest)
npm run test:rust # Rust (cargo test)
npm run test:python # Python (pytest)
# Lint code
npm run lint # ESLint
npm run lint:rust # Clippy
npm run lint:python # Ruff

socratic-ide/
├── desktop/
│ ├── core/ # Rust/Tauri backend
│ └── interface/ # React frontend
├── agent/ # Python FastAPI server
├── .ai_context/ # AI assistant documentation
└── scripts/ # Development utilities
For detailed documentation, see the .ai_context/ directory:
- ARCHITECTURE.md - System design and data flow
- ROADMAP.md - Development phases and goals
- SESSION_LOG.md - Development progress tracker
- Phase 1: Infrastructure (Complete)
- Phase 2: LLM Integration (In Progress)
See ROADMAP.md for the full development plan.
MIT