Open-CITE (Cataloging Intelligent Tools in the Enterprise), pronounced like "Open-Sight", is a Python library, service, and application designed to facilitate the discovery and management of AI/ML Assets (including tools, models, and infrastructure) across multiple cloud platforms and protocols.
Open-CITE provides a unified interface for discovering and cataloging AI/ML resources across different platforms. Whether you're managing models in Databricks, tracking AI agent usage via OpenTelemetry, discovering resources in Azure AI Foundry, or working with Google Cloud's Vertex AI, Open-CITE brings everything together under one roof.
- Multi-Platform Discovery: Automatic discovery of AI/ML resources across Databricks, AWS, Azure, Google Cloud, and more
- Protocol Support: Native support for OpenTelemetry, MCP (Model Context Protocol), and major cloud APIs
- Trace Analysis: Collect and analyze traces from AI agents, tools, and model invocations
- Model Cataloging: Track models, endpoints, deployments, and usage patterns
- Infrastructure Discovery: Find MCP servers, compute instances, and AI services
- Unified Schema: Export discoveries in a standardized JSON format for downstream processing
- Library or Service: Run Open-CITE with or without the GUI, deploy it in a Docker container or on Kubernetes as a headless AI asset discovery service, or use it as a library in your own Python application
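To illustrate the unified-schema idea, a discovered asset from any platform might be normalized into one JSON shape before export. The field names below are illustrative only, not the actual schema (see docs/SCHEMA_DOCUMENTATION.md for the authoritative export format):

```python
import json

# Hypothetical normalized record for a discovered AI/ML asset.
# Field names are invented for illustration; the real export schema
# is documented in docs/SCHEMA_DOCUMENTATION.md.
record = {
    "asset_type": "model_endpoint",
    "name": "llama-3-70b-serving",
    "platform": "databricks",
    "discovered_at": "2024-01-01T00:00:00Z",
    "metadata": {"region": "us-east-1"},
}

# A unified shape like this lets downstream tooling process assets
# from Databricks, Azure, AWS, etc. without per-platform branching.
print(json.dumps(record, indent=2))
```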
Open-CITE is provided to the community by the team at LangGuard.AI, home of the AI Control Plane for enterprise AI governance and monitoring. LangGuard itself uses Open-CITE for internal AI asset discovery.
```bash
# Create and activate a virtual environment
python3 -m venv venv
source venv/bin/activate

# Install in editable mode
pip install -e .

# Start the GUI
opencite gui
# Access at http://localhost:5000

# Or start the headless API
opencite api
# Access at http://0.0.0.0:8080
```

Full documentation is available in the docs/ folder:
- docs/README.md -- Full usage guide, Python API examples, and project structure
- docs/DEPLOYMENT.md -- Docker and Kubernetes deployment
- docs/REST_API.md -- REST API reference
- docs/SENDING_TRACES.md -- Configure Cloudflare AI Gateway, OpenRouter, and other sources to send traces
- docs/PLUGINS.md -- Plugin authoring guide
- docs/DEVELOPMENT.md -- Development setup and debugging
- docs/SCHEMA_DOCUMENTATION.md -- JSON export schema
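As a quick sketch of consuming the headless API from Python, the snippet below fetches a JSON response over HTTP using only the standard library. The `/assets` path is a placeholder, not a documented route; the real routes are in docs/REST_API.md:

```python
import json
import urllib.request

API_BASE = "http://localhost:8080"   # headless API started with `opencite api`
ENDPOINT = f"{API_BASE}/assets"      # placeholder path; see docs/REST_API.md

def fetch_assets(url: str = ENDPOINT) -> list:
    """Fetch and decode a JSON payload of discovered assets."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))
```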
- OpenTelemetry Plugin
- Azure AI Foundry Plugin
- AWS Plugins (Bedrock & SageMaker)
- Google Cloud Plugin
- Microsoft Fabric Plugin
Contributions are welcome! The plugin architecture makes it easy to add support for new platforms. See docs/PLUGINS.md for the full plugin authoring guide.
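To give a feel for what a plugin might involve, here is a hypothetical sketch of a new platform integration. The base class, method names, and record shape are all invented for illustration; the actual plugin interface is defined in docs/PLUGINS.md:

```python
from abc import ABC, abstractmethod

class DiscoveryPlugin(ABC):
    """Illustrative base class only -- the real interface is in docs/PLUGINS.md."""

    @abstractmethod
    def discover(self) -> list[dict]:
        """Return a list of normalized asset records."""

class MyPlatformPlugin(DiscoveryPlugin):
    """Hypothetical plugin for a fictional 'MyPlatform' service."""

    def discover(self) -> list[dict]:
        # A real plugin would call the platform's API here and
        # normalize the results into the unified export schema.
        return [
            {"asset_type": "model", "name": "example-model", "platform": "myplatform"},
        ]
```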