An Open-Source Project by Kamil DoXToR-G.
A next-generation, AI-powered API documentation service built on the Model Context Protocol (MCP). It automatically fetches and maintains API documentation from providers such as Atlassian (Jira), Datadog, and Kubernetes, and provides intelligent search and AI assistance on top of it.
This is an open-source project - contributions are welcome!
The project is a work in progress: it is functional but needs further development, so feel free to build on it.
Kamil DoXToR-G. - Full-stack developer passionate about creating useful tools for the developer community.
- Project Lead - Architecture, backend development, and system design
- Tech Stack - FastAPI, PostgreSQL, Elasticsearch, Docker
- Open Source - Committed to building and sharing quality software
- Contact - Open issues on GitHub for questions and contributions
- MCP (Model Context Protocol) Integration - Modern AI agent framework
- AI-Powered Semantic Search - Vector-based search with ChromaDB
- Intelligent Chat Interface - Real-time AI assistance via WebSocket
- Automatic Documentation Fetching - Scheduled updates from multiple providers
- Modern Web Interface - Responsive design with dark/light themes
- Scheduled Updates - Keep documentation current with background tasks
- Smart Categorization - Automatic tagging and organization of API endpoints
- Advanced Analytics - AI-driven insights and usage patterns
- Context-Aware Responses - Intelligent understanding of user intent
- MCP Layer: Model Context Protocol server with tool definitions
- AI Agent: Intelligent query processing and context management
- Vector Store: ChromaDB for semantic search and embeddings
- Backend: FastAPI with WebSocket support for real-time communication
- Database: PostgreSQL with SQLAlchemy ORM for structured data
- Search Engine: Elasticsearch + ChromaDB for hybrid search
- Task Queue: Celery with Redis for background processing
- Containerization: Docker & Docker Compose for easy deployment
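To make the MCP layer concrete, here is a minimal sketch of how a documentation-search tool can be registered with the official MCP Python SDK. It is illustrative only: the project's real server lives in `backend/app/mcp/`, and the tool name, signature, and in-memory index below are assumptions.

```python
# Illustrative MCP server sketch (pip install mcp); not the project's actual code.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("api-docs")

# Stand-in corpus; the real service would query ChromaDB / Elasticsearch instead.
FAKE_INDEX = {
    "atlassian": ["POST /rest/api/3/issue - create an issue"],
    "kubernetes": ["GET /api/v1/namespaces/{namespace}/pods - list pods"],
}

@mcp.tool()
def search_api_docs(query: str, provider: str = "atlassian") -> str:
    """Return indexed documentation entries for a provider matching the query."""
    entries = FAKE_INDEX.get(provider.lower(), [])
    hits = [e for e in entries if query.lower() in e.lower()] or entries
    return "\n".join(hits) if hits else "No matching endpoints found."

if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio to MCP-compatible clients
```

An MCP-aware client (or the AI agent) can then invoke `search_api_docs` as a tool call during a conversation.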
- FastAPI (Python 3.11+) - High-performance web framework with WebSocket support
- MCP (Model Context Protocol) - Modern AI agent framework
- SQLAlchemy + Alembic - Database ORM & migrations
- PostgreSQL - Reliable data storage
- ChromaDB - Vector database for semantic search
- Elasticsearch - Hybrid search engine
- Celery + Redis - Background task processing
- Pydantic - Data validation and serialization
- Next.js 14 - React framework with App Router
- React 18 - Modern React with hooks and server components
- TypeScript - Type-safe development
- Tailwind CSS - Utility-first CSS framework
- Lucide Icons - Beautiful, customizable icons
- Axios - HTTP client for API requests
- Docker & Docker Compose - Containerized development environment
- Health Checks - Service monitoring and dependency management
- Environment Configuration - Flexible configuration management
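To illustrate the Celery + Redis background processing listed above, a scheduled documentation refresh could be wired up roughly as follows. The task name, module, and schedule are hypothetical; the project's actual tasks may differ.

```python
# Hypothetical sketch of a scheduled documentation refresh (Celery + Redis).
from celery import Celery

celery_app = Celery("api_docs", broker="redis://localhost:6379/0")

@celery_app.task(name="docs.refresh_provider")
def refresh_provider(provider: str) -> str:
    """Fetch the latest documentation for one provider and re-index it."""
    # The real task would call the provider fetcher, then update PostgreSQL,
    # Elasticsearch, and the ChromaDB vector store.
    return f"refreshed {provider}"

# Run a refresh once a day via Celery beat.
celery_app.conf.beat_schedule = {
    "refresh-atlassian-daily": {
        "task": "docs.refresh_provider",
        "schedule": 60 * 60 * 24,  # seconds
        "args": ("atlassian",),
    },
}
```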
- Docker and Docker Compose
- Python 3.11+ (for local development)
- Git
- Clone the repository:

  ```bash
  git clone <your-github-repo-url>
  cd Latest_api_project
  ```

- Configure environment variables:

  ```bash
  # Copy the example environment file
  cp .env.example backend/.env
  # Edit backend/.env and add your API keys (optional for basic functionality)
  # IMPORTANT: Change SECRET_KEY for production use!
  ```

- Start the development environment:

  ```bash
  docker-compose up -d
  ```

- Access the application:
  - Frontend: http://localhost:3000
  - Backend API: http://localhost:8000
  - API Documentation: http://localhost:8000/docs
  - Admin Dashboard: http://localhost:3000/admin (default credentials: see Security section)
  - Health Check: http://localhost:8000/health
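Once the containers are up, a quick smoke test from Python confirms the backend is reachable (default ports assumed):

```python
# Minimal health check against the local stack.
import httpx

response = httpx.get("http://localhost:8000/health", timeout=5.0)
response.raise_for_status()
print(response.json())
```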
```
Latest_api_project/
├── backend/                      # FastAPI backend
│   ├── app/
│   │   ├── api/                  # API route handlers
│   │   │   └── v1/               # API v1 endpoints (providers, search, AI settings)
│   │   ├── core/                 # Core functionality & config
│   │   ├── db/                   # Database models & setup
│   │   ├── fetchers/             # API documentation fetchers
│   │   ├── mcp/                  # Model Context Protocol server
│   │   ├── services/             # Business logic (AI agent, OpenAI MCP client)
│   │   ├── vector_store/         # ChromaDB integration
│   │   └── main.py               # Main FastAPI application
│   ├── Dockerfile                # Backend container config
│   └── requirements.txt          # Python dependencies
├── frontend/                     # Next.js 14 frontend
│   ├── app/                      # Next.js App Router pages
│   │   ├── admin/                # Admin login & dashboard
│   │   ├── page.tsx              # Main landing page
│   │   └── layout.tsx            # Root layout
│   ├── components/               # React components
│   │   ├── AIConfigPanel.tsx     # OpenAI settings panel
│   │   ├── ChatInterface.tsx     # AI chat interface
│   │   ├── GameOfLife.tsx        # Background animation
│   │   └── ThemeToggle.tsx       # Dark/light mode toggle
│   ├── Dockerfile                # Frontend container config
│   └── package.json              # Node.js dependencies
├── docker-compose.yml            # Development environment
├── .env.example                  # Environment variables template
└── README.md                     # This file
```
The application uses environment variables for configuration. A template file .env.example is provided in the root directory.
- Copy the example file:

  ```bash
  cp .env.example backend/.env
  ```

- Edit `backend/.env` with your configuration:

  ```bash
  # Database
  DATABASE_URL=postgresql://api_user:password@localhost:5432/api_docs_db

  # Security - CHANGE THESE IN PRODUCTION!
  SECRET_KEY=your-secret-key-here

  # AI API Keys (Optional - for AI-powered features)
  OPENAI_API_KEY=sk-your-openai-api-key-here
  ANTHROPIC_API_KEY=sk-ant-your-anthropic-api-key-here

  # API Provider Keys (Optional - for documentation fetching)
  ATLASSIAN_API_TOKEN=your-atlassian-token
  DATADOG_API_KEY=your-datadog-key
  DATADOG_APP_KEY=your-datadog-app-key
  ```

- See `.env.example` for all available configuration options.
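For reference, values like these are typically loaded into typed settings at startup. The sketch below shows one way to do that with pydantic-settings; it is an assumption, not the project's actual config module (which lives under `backend/app/core/`).

```python
# Illustrative settings loader using pydantic-settings; names are assumptions.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    database_url: str = "postgresql://api_user:password@localhost:5432/api_docs_db"
    secret_key: str = "dev-secret-key-change-in-production"
    openai_api_key: str | None = None
    anthropic_api_key: str | None = None

settings = Settings()  # reads backend/.env when run from the backend directory
```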
- Atlassian (Jira Cloud) - Issue tracking and project management
- Datadog - Monitoring and observability platform
- Kubernetes - Container orchestration platform
- Prometheus - Metrics monitoring system
- Grafana - Data visualization and alerting
- Kibana - Elasticsearch data visualization
- `GET /` - Application information
- `GET /health` - Health check
- `GET /docs` - Interactive API documentation (Swagger UI)

- `GET /api/v1/providers` - Manage API providers
- `GET /api/v1/documentation` - Browse API documentation
- `POST /api/v1/search` - Search across documentation
- `GET /api/v1/analytics` - Usage analytics
- `POST /api/v1/agent` - AI agent interactions
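As an example, the search endpoint can be called from a script like this. The request and response field names are assumptions; check the Swagger UI at `/docs` for the authoritative schema.

```python
# Hypothetical search request; verify the real schema at http://localhost:8000/docs.
import httpx

payload = {"query": "create a Jira issue", "limit": 5}  # field names assumed
response = httpx.post("http://localhost:8000/api/v1/search", json=payload, timeout=10.0)
response.raise_for_status()
for hit in response.json().get("results", []):  # response key assumed
    print(hit)
```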
Run the test suite:
```bash
cd backend
pytest
```
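A minimal test can exercise the health endpoint with FastAPI's test client. This is a sketch only, not the project's actual suite; the `app.main` import path follows the project structure above.

```python
# backend/tests/test_health.py (illustrative)
from fastapi.testclient import TestClient

from app.main import app  # import path assumed from the project layout

client = TestClient(app)

def test_health_returns_ok():
    response = client.get("/health")
    assert response.status_code == 200
```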
Before deploying to production, YOU MUST change the following default credentials:

- Database password in `docker-compose.yml`:

  ```yaml
  POSTGRES_PASSWORD: password  # ⚠️ CHANGE THIS!
  ```

- Secret key in `docker-compose.yml`:

  ```yaml
  SECRET_KEY: dev-secret-key-change-in-production  # ⚠️ CHANGE THIS!
  ```

- Database connection string:

  ```yaml
  DATABASE_URL: postgresql://api_user:password@...  # ⚠️ CHANGE PASSWORD!
  ```
The project includes default credentials for LOCAL DEVELOPMENT ONLY:
- PostgreSQL: `api_user` / `password`
- Secret key: `dev-secret-key-change-in-production`
- Frontend: OpenAI API keys entered in the admin dashboard are stored in browser localStorage only
- Backend: API keys should be set via environment variables in the `.env` file
- Never commit: `.env` files are gitignored and should never be committed to version control
- Use strong, randomly generated passwords (32+ characters)
- Store secrets in a secure secret management system (e.g., AWS Secrets Manager, HashiCorp Vault)
- Enable HTTPS/TLS for all connections
- Use environment-specific configuration files
- Implement proper access controls and firewall rules
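For example, a strong random value for `SECRET_KEY` can be generated with Python's standard library:

```python
# Prints a URL-safe secret of roughly 64 characters.
import secrets

print(secrets.token_urlsafe(48))
```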
- ⚠️ CRITICAL: Change all default passwords and secret keys (see Security section above)
- Set up production environment variables in `backend/.env`
- Configure production database and Redis with strong credentials
- Set up Elasticsearch cluster
- Enable HTTPS/TLS
- Deploy using Docker or your preferred method
```bash
# Ensure you've updated credentials first!
docker-compose -f docker-compose.prod.yml up -d
```

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Kamil DoXToR-G. - Project creator and lead developer
- FastAPI community for the excellent web framework
- Elasticsearch for powerful search capabilities
- Tailwind CSS for the beautiful design system
- All contributors and supporters
- README.md - This file, project overview and quick start
- AI_SYSTEM_STATUS.md - Current AI system status and configuration
- MCP_OPENAI_IMPLEMENTATION_SUMMARY.md - Complete MCP + OpenAI integration guide
- CHAT_INTERFACE_IMPROVEMENTS.md - Chat UI features and improvements
The system includes:
- ✅ True MCP Protocol - Resources, Tools, and Prompts implementation
- ✅ OpenAI Integration - GPT-4o-mini for intelligent responses
- ✅ Smart Chat Interface - Session persistence, fixed positioning, real-time AI assistance
- ✅ PostgreSQL Database - 1,660+ API endpoints indexed (Atlassian, Kubernetes)
- ✅ Admin Dashboard - Configure OpenAI settings, sync providers, monitor status
- ✅ Vector Search Ready - ChromaDB for semantic search
- ✅ Modern UI - Next.js 14 with dark/light themes, Conway's Game of Life background
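For programmatic access to the real-time chat, a WebSocket client can talk to the backend directly. The endpoint path and message format below are assumptions made for illustration; inspect the backend code (or the browser's network tab) for the real ones.

```python
# Hypothetical WebSocket chat client; path and payload shape are assumptions.
import asyncio
import json

import websockets  # pip install websockets

async def ask(question: str) -> None:
    async with websockets.connect("ws://localhost:8000/ws/chat") as ws:  # path assumed
        await ws.send(json.dumps({"message": question}))
        print(await ws.recv())

asyncio.run(ask("How do I create a Jira issue through the Atlassian API?"))
```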
To use AI features:
- Get an OpenAI API key from https://platform.openai.com/api-keys
- Go to http://localhost:3000/admin/dashboard
- Click "Settings" and enter your API key
- Click "Validate" then "Save"
- Start chatting with the AI assistant!
If you have any questions or need help:
- Open an issue on GitHub
- Check the documentation at `/docs`
- Review the health endpoint at `/health`
- Read the guides above for specific topics
Made with ❤️ for the developer community