A comprehensive AI assistant that uses the Model Context Protocol (MCP) to integrate multiple data sources, enabling complex queries and context-aware answers.
- MCP Integration: Seamless integration with multiple data sources (files, databases, APIs)
- AI-Powered Processing: LLM-based summarization and content generation
- Interactive Dashboard: React.js + Plotly/Dash visualization interface
- Context-Aware Queries: Intelligent query processing with multi-source context
- Real-time Analytics: User interaction metrics and query performance tracking
├── backend/              # FastAPI backend
│   ├── api/              # API endpoints
│   ├── core/             # Core business logic
│   ├── models/           # Data models
│   └── services/         # Service layer
├── frontend/             # React.js dashboard
│   ├── src/
│   │   ├── components/   # React components
│   │   ├── pages/        # Dashboard pages
│   │   └── utils/        # Utility functions
├── mcp_server/           # MCP server implementation
│   ├── tools/            # MCP tools
│   └── integrations/     # Data source integrations
├── data/                 # Sample data and configurations
└── tests/                # Test files
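
To give a sense of how a data source becomes queryable, here is a minimal sketch of a tool that could live under `mcp_server/tools/`. It assumes the official MCP Python SDK (the `mcp` package and its `FastMCP` helper); the tool name, parameters, and placeholder logic are illustrative, not the project's actual code.

```python
# mcp_server/tools/document_search.py -- illustrative sketch only
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("multi-source-assistant")

@mcp.tool()
def search_documents(query: str, limit: int = 5) -> list[str]:
    """Search a configured document source and return matching snippets."""
    # Placeholder: a real tool would delegate to a file/database/API integration
    # defined under mcp_server/integrations/.
    return [f"result {i} for '{query}'" for i in range(limit)]

if __name__ == "__main__":
    # Serve the tool over stdio so an MCP client (e.g. the backend) can call it.
    mcp.run()
```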
- Install dependencies:

      pip install -r requirements.txt

- Set up environment variables:

      cp .env.example .env
      # Edit .env with your API keys and configurations

- Start the MCP server:

      python mcp_server/main.py

- Start the backend API (a minimal sketch of this entry point follows these steps):

      python backend/main.py

- Start the frontend dashboard:

      cd frontend && npm start
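
As a rough orientation, the backend entry point started above might look something like the sketch below, assuming FastAPI plus uvicorn. The `/query` route, request fields, and placeholder response are hypothetical; the project's real routes live under `backend/api/` and its logic under `backend/services/`.

```python
# backend/main.py -- minimal illustrative sketch, not the actual implementation
from fastapi import FastAPI
from pydantic import BaseModel
import uvicorn

app = FastAPI(title="MCP Assistant API")

class QueryRequest(BaseModel):
    question: str
    sources: list[str] = []  # e.g. ["files", "database", "api"]

@app.post("/query")
async def run_query(request: QueryRequest) -> dict:
    # Placeholder: a real handler would gather context through the MCP server
    # and pass it to the LLM for summarization.
    return {"answer": f"Echo: {request.question}", "sources_used": request.sources}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```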
- OpenAI API key for LLM functionality
- Database connections for data sources
- MCP server configurations
- Dashboard customization options
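
For illustration only, a filled-in `.env` might look like the following; the variable names here are assumptions, so use the names defined in `.env.example` rather than this sketch.

```
# Illustrative values -- variable names are hypothetical
OPENAI_API_KEY=sk-...
DATABASE_URL=postgresql://user:password@localhost:5432/assistant
MCP_SERVER_COMMAND=python mcp_server/main.py
DASHBOARD_THEME=light
```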
Once running, visit http://localhost:8000/docs
for interactive API documentation.
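
As a quick smoke test, you can also call the API directly from Python; this assumes the hypothetical `/query` endpoint from the backend sketch above rather than a documented route.

```python
# Minimal client call against the hypothetical /query endpoint
import requests

response = requests.post(
    "http://localhost:8000/query",
    json={"question": "Summarize recent activity across all sources",
          "sources": ["files", "database"]},
)
print(response.json())
```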