A comprehensive microservices-based platform for generating educational materials using LangGraph workflows, featuring handwritten note recognition, material synthesis, and interactive question generation.
This project implements a sophisticated AI-powered educational platform built on a microservices architecture with the following core components:
- Core AI Service - Main LangGraph workflow orchestrator
- Article Service - File storage and export management
- Prompt Studio Service - Dynamic prompt generation and user customization
- PostgreSQL - Primary data storage
The platform is built around a sophisticated LangGraph workflow that processes educational materials through multiple AI-powered nodes:
graph TD
A[START] --> B[Input Processing]
B --> C[Content Generation]
C --> D{Images Present?}
D -->|Yes| E[Handwritten Recognition]
D -->|No| F[Question Generation]
E --> G[Material Synthesis]
G --> H[Material Editing]
H --> F[Question Generation]
F --> I[Answer Generation]
I --> J[END]
- Purpose: Analyzes user input and determines workflow path
- Functionality:
- Processes text input and image uploads
- Validates content using security guards
- Determines next workflow step based on input type
- Purpose: Generates educational material based on exam questions
- Functionality:
- Creates comprehensive study materials
- Uses personalized prompts from Prompt Strudio Service
- Determines workflow path (recognition vs. direct question generation)
- Purpose: Processes handwritten notes from uploaded images
- Functionality:
- OCR processing of handwritten content
- Text extraction and formatting
- Integration with image processing services
- Purpose: Combines generated content with recognized notes
- Functionality:
- Merges AI-generated material with handwritten notes
- Creates comprehensive study materials
- Ensures content coherence and completeness
- Purpose: Interactive material editing with human-in-the-loop (HITL)
- Functionality:
- Iterative material refinement
- User feedback integration
- Content improvement based on user input
- Purpose: Creates assessment questions with HITL feedback
- Functionality:
- Generates relevant exam questions
- Implements feedback loop for question refinement
- Ensures question quality and relevance
- Purpose: Generates comprehensive answers to questions
- Functionality:
- Creates detailed answer explanations
- Provides educational value
- Finalizes the learning material package
The workflow uses a comprehensive state model (GeneralState) that tracks:
class GeneralState(BaseModel):
# Input data
input_content: str
display_name: Optional[str]
image_paths: List[str]
# Processing results
recognized_notes: str
generated_material: str
synthesized_material: str
# Questions and answers
questions: List[str]
questions_and_answers: List[str]
# HITL interaction
feedback_messages: List[Any]
edit_count: int
needs_user_input: bool
agent_message: Optional[str]- Security Guard: Content validation and sanitization
- Input Validation: Multi-layer content checking
- Fuzzy Matching: Advanced threat detection
- Graceful Degradation: System continues operation even with security failures
The platform supports multiple LLM providers:
- OpenAI (GPT-4, GPT-3.5)
- OpenRouter (Various models)
- Fireworks AI
- DeepSeek
Each node can be configured with different models based on requirements.
- Framework: FastAPI + LangGraph
- Functionality:
- Workflow orchestration
- State management
- HITL interaction handling
- Artifact management
- Framework: FastAPI
- Functionality:
- File storage and management
- Export capabilities (PDF, Markdown, ZIP)
- Thread-based organization
- Web3 authentication support
- Framework: FastAPI
- Functionality:
- Dynamic prompt generation
- User profile management
- Template customization
- Prompt versioning
- Type: PostgreSQL 16
- Functionality:
- Workflow state persistence
- User data storage
- Artifact metadata
- Authentication data
# Core Configuration
OPENAI_API_KEY=your_openai_key
DATABASE_URL=postgresql://postgres:postgres@postgres:5432/core
# Optional LLM Providers
DEEPSEEK_API_KEY=your_deepseek_key
# Security
SECURITY_ENABLED=true
SECURITY_FUZZY_THRESHOLD=0.85
SECURITY_MIN_CONTENT_LENGTH=10
# Artifacts
ARTIFACTS_BASE_PATH=/app/data/artifacts
ARTIFACTS_MAX_FILE_SIZE=10485760configs/graph.yaml- LangGraph node configurationsconfigs/prompts.yaml- Prompt templatesconfigs/providers.yaml- LLM provider settings
# Start all services
docker-compose up -d
- Core:
http://localhost:8000/health - Article:
http://localhost:8001/health - Prompt Strudio:
http://localhost:8002/health
- Centralized logging in
/logsdirectory - Structured logging with different levels
- Service-specific log files
The platform implements sophisticated HITL patterns:
- Generation: AI creates initial content
- Feedback: User provides input/feedback
- Editing: AI refines content based on feedback
- Completion: Process continues until satisfaction
- PostgreSQL checkpoints for workflow state
- Resume capability for interrupted workflows
- Thread-based session management
- AI-powered content creation
- Handwritten note recognition
- Material synthesis and editing
- Interactive question generation
- Markdown export for documentation
- Blockchain authentication
- NFT content verification
- Decentralized ownership tracking
- Content hash validation
- Content sanitization
- Injection attack prevention
- Fuzzy threat detection
- Graceful security degradation
- Clone the repository
- Set up environment variables
- Run database migrations
- Start services with Docker Compose
- Access the API endpoints
POST /process- Start new workflowGET /status/{thread_id}- Check workflow statusPOST /continue/{thread_id}- Continue HITL interaction
GET /threads- List all threadsPOST /export/pdf/{thread_id}/{session_id}- Export to PDFPOST /export/markdown/{thread_id}/{session_id}- Export to Markdown
GET /prompts/{user_id}/{node_name}- Get personalized promptsPOST /profiles- Create user profilesPUT /profiles/{profile_id}- Update user profiles
- Advanced AI model integration
- Enhanced Web3 features
- Real-time collaboration
- Advanced analytics and insights
- Multi-language support
- Mobile application
This platform represents a cutting-edge approach to AI-powered educational content generation, combining the power of LangGraph workflows with modern microservices architecture and Web3 technologies.