The MCP Server is an AI-powered integration hub that automatically manages project items across multiple platforms (GitHub, Jira, Linear) by monitoring terminal coding tools and developer conversations.
It bridges the gap between your development workflow and your project management systems: it listens to coding activity, analyzes it using AI, and automatically creates or updates the corresponding issues and tickets in your project management tools.
```
┌─────────────────┐      ┌─────────────────┐      ┌─────────────────┐
│     GitHub      │      │      JIRA       │      │     Linear      │
│   Issues/PRs    │      │     Tickets     │      │     Issues      │
└─────────┬───────┘      └─────────┬───────┘      └─────────┬───────┘
          │                        │                        │
          └────────────────────────┼────────────────────────┘
                                   │
                     ┌─────────────────────────────┐
                     │         MCP Server          │
                     │      (Integration Hub)      │
                     │                             │
                     │ ┌─────────────────────────┐ │
                     │ │    AI Context Engine    │ │
                     │ │  (LLM Analysis & NLP)   │ │
                     │ └─────────────────────────┘ │
                     │ ┌─────────────────────────┐ │
                     │ │  Data Ingestion Layer   │ │
                     │ │  (Multiple formats &    │ │
                     │ │   sources)              │ │
                     │ └─────────────────────────┘ │
                     │ ┌─────────────────────────┐ │
                     │ │  Conversation Manager   │ │
                     │ │ (Terminal, Files, etc)  │ │
                     │ └─────────────────────────┘ │
                     │ ┌─────────────────────────┐ │
                     │ │  Monitoring & Observ.   │ │
                     │ │   (Langfuse/LiteLLM)    │ │
                     │ └─────────────────────────┘ │
                     └─────────────┬───────────────┘
                                   │
               ┌───────────────────┼───────────────────┐
               │                   │                   │
     ┌─────────▼────────┐  ┌───────▼───────┐  ┌───────▼────────┐
     │  Terminal Tools  │  │  Data Sources │  │  AI/LLM Models │
     │ (Cursor, VSCode, │  │  (JSON, MD,   │  │    (Context    │
     │    Vim, etc)     │  │ CSV, XML, etc)│  │   Processing)  │
     └──────────────────┘  └───────────────┘  └────────────────┘
```
- Linear: Issues, Projects, Cycles, Pulses, Reviews, Views, Initiatives, Customers, Teams
- GitHub: Issues, Pull Requests, Projects
- Jira: Issues, Projects, Epics, Stories
- OpenAI: GPT-4, GPT-3.5-turbo and other models
- Anthropic: Claude 3 models
- OpenRouter: Access to multiple models through one API
- Ollama: Local models for privacy-conscious users
- Multiple format support: JSON, YAML, Markdown, CSV, XML, TXT, SQL
- File-based ingestion from terminal tools
- Stream-based ingestion for real-time processing
- Multi-source consolidation
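The multi-format support above can be sketched as a dispatch on file extension. The function name and the raw-text pass-through below are illustrative assumptions, not the server's actual ingestion API:

```python
import csv
import json
from pathlib import Path

def ingest_file(path: str):
    """Parse a file into Python data based on its extension (sketch)."""
    p = Path(path)
    text = p.read_text(encoding="utf-8")
    if p.suffix == ".json":
        return json.loads(text)
    if p.suffix == ".csv":
        return list(csv.DictReader(text.splitlines()))
    # Markdown, TXT, SQL, XML, etc. pass through as raw text here;
    # a real ingestor would parse each format properly.
    return text
```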
- Langfuse Integration: Full observability for AI calls and operations
- Performance tracking: Latency, token usage, success rates
- Error tracking and analysis
- Conversation analysis metrics
- Clone the repository:

  ```bash
  git clone https://github.com/yourusername/mcp-server-project.git
  cd mcp-server-project
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  # For extended features including monitoring:
  pip install -r requirements_extended.txt
  ```

- Set up environment variables:

  ```bash
  cp .env.example .env
  # Edit .env with your API keys and configuration
  ```
The server requires several environment variables to function:

- `OPENAI_API_KEY`: For OpenAI integration
- `ANTHROPIC_API_KEY`: For Anthropic integration
- `LINEAR_API_KEY`: For Linear integration
- `LANGFUSE_PUBLIC_KEY` & `LANGFUSE_SECRET_KEY`: For monitoring (optional)
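A minimal `.env` might look like the following; the key values are placeholders, and you only need the entries for the providers you actually use:

```shell
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
LINEAR_API_KEY=your-linear-key
# Optional, for monitoring:
LANGFUSE_PUBLIC_KEY=your-langfuse-public-key
LANGFUSE_SECRET_KEY=your-langfuse-secret-key
```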
Configuration is managed through `config/config.py`, which provides:
- Server settings (host, port, etc.)
- AI provider configuration
- Integration settings
- Data source settings
- Monitoring configuration
- Feature flags
```bash
cd mcp_server
python server_ai_enabled.py
```

The server exposes the following endpoints:

- `POST /event`: Process events from terminal tools
- `POST /ingest`: Ingest data from various sources
- `POST /analyze`: Analyze data using AI
- `POST /monitor_directory`: Monitor directories for conversation files
- `GET /health`: Check server health
- `GET /configured_platforms`: List configured platforms
- `GET /available_ai_providers`: List available AI providers
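As a sketch, an event can be posted to the server with any HTTP client. The port and the payload's field names below are illustrative assumptions, not a documented schema:

```python
import json
import urllib.request

# Hypothetical event payload; the field names are assumptions for illustration.
event = {
    "source": "cursor",
    "type": "conversation",
    "content": "Fixed the null-check bug in the auth middleware",
}

req = urllib.request.Request(
    "http://localhost:8000/event",  # assumed default host/port
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment once the server is running:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```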
The server can monitor your terminal coding environment for:
- File changes that might indicate bugs to fix
- Git operations that correspond to issues
- Conversation logs from AI coding assistants
- Commit messages that close issues
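Detecting issue-closing commit messages can be sketched with a small regex over GitHub-style closing keywords. This is a simplified stand-in for illustration, not the server's actual implementation:

```python
import re

# Matches GitHub-style closing keywords followed by an issue number,
# e.g. "fixes #42", "Closes #7", "resolved #13".
CLOSES = re.compile(
    r"\b(?:close[sd]?|fix(?:e[sd])?|resolve[sd]?)\s+#(\d+)",
    re.IGNORECASE,
)

def closed_issue_ids(commit_message: str) -> list[int]:
    """Return the issue numbers a commit message claims to close."""
    return [int(n) for n in CLOSES.findall(commit_message)]
```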
The server includes comprehensive monitoring via Langfuse:
- AI Call Tracking: Every AI API call is tracked with:
  - Input prompts and output responses
  - Latency measurements
  - Token usage
  - Success/failure status
- Conversation Analysis Tracking:
  - What conversations were analyzed
  - What actions were suggested
  - Which provider was used
- Linear Operations Tracking:
  - Issues created/updated/closed
  - Projects created
  - Success metrics
To enable Langfuse:
- Sign up at https://langfuse.com
- Get your public and secret keys
- Set the `LANGFUSE_PUBLIC_KEY` and `LANGFUSE_SECRET_KEY` environment variables
- Set `MCP_MONITORING_ENABLED=true` and `MCP_MONITORING_PROVIDER=langfuse`
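With the variable names above, enabling Langfuse from a shell looks like this (the key values are placeholders):

```shell
export LANGFUSE_PUBLIC_KEY=your-public-key
export LANGFUSE_SECRET_KEY=your-secret-key
export MCP_MONITORING_ENABLED=true
export MCP_MONITORING_PROVIDER=langfuse
```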
The monitoring dashboard will show you:
- AI usage patterns
- Performance metrics
- Error rates
- Token consumption
- Conversation analysis effectiveness
Alternatively, you can use a LiteLLM proxy for monitoring:

- Set `MCP_MONITORING_PROVIDER=litellm`
- Configure `LITELLM_PROXY_HOST` and `LITELLM_API_KEY`
- API keys are stored securely and never logged
- All communication is encrypted
- Rate limiting prevents abuse
- Input sanitization prevents injection attacks
The server is designed to be extensible:
- New AI Providers: Implement the `AIProvider` interface
- New Integrations: Add new platform managers following the Linear manager pattern
- New Data Sources: Extend the `DataIngestor` class
- New Monitoring Providers: Add new monitoring integrations
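A new AI provider might look like the following. The `AIProvider` base class shown here is a hypothetical sketch of the interface; the real abstract class lives in the server codebase and may differ:

```python
from abc import ABC, abstractmethod

class AIProvider(ABC):
    """Hypothetical sketch of the provider interface (method names assumed)."""

    @abstractmethod
    def analyze(self, text: str) -> dict:
        """Return suggested project-management actions for the given text."""

class KeywordProvider(AIProvider):
    """Toy provider that flags bug-like text; illustrates the pattern only."""

    def analyze(self, text: str) -> dict:
        is_bug = any(word in text.lower() for word in ("bug", "error", "crash"))
        return {"suggest_issue": is_bug, "summary": text[:80]}
```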
The server supports multiple environments:
- Development: `ENVIRONMENT=development`
- Staging: `ENVIRONMENT=staging`
- Production: `ENVIRONMENT=production`
- API Key Issues: Verify your environment variables are set correctly
- Monitoring Not Working: Check Langfuse API keys and network connectivity
- AI Provider Issues: Verify the provider is enabled in config and has valid credentials
- Terminal Monitoring Not Working: Check file paths and permissions
Mutation: `issueCreate`
- `title: String!`
- `description: String`
- `projectId: String`
- `teamId: String!`
- `priority: Int`
- `labels: [String!]`

Mutation: `projectCreate`
- `name: String!`
- `description: String`
- `teamId: String`
- `startDate: Date`
- `targetDate: Date`

Query: `issues`
- filter parameters for existing issues
- status tracking
- relationship mapping
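Against Linear's public GraphQL endpoint, the `issueCreate` mutation above can be invoked roughly as follows. The team ID is a placeholder, error handling is omitted, and this is a sketch rather than the server's own Linear manager:

```python
import json
import os
import urllib.request

MUTATION = """
mutation IssueCreate($input: IssueCreateInput!) {
  issueCreate(input: $input) {
    success
    issue { id title }
  }
}
"""

payload = {
    "query": MUTATION,
    "variables": {"input": {"teamId": "YOUR_TEAM_ID", "title": "Example issue"}},
}

req = urllib.request.Request(
    "https://api.linear.app/graphql",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Personal API keys go directly in the Authorization header.
        "Authorization": os.environ.get("LINEAR_API_KEY", ""),
    },
    method="POST",
)
# Uncomment with a valid LINEAR_API_KEY set:
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```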
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Make your changes
- Add tests if applicable
- Run tests (`pytest`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.