A high-throughput and memory-efficient inference and serving engine for LLMs
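As a rough illustration of what such an engine looks like in use, here is a minimal offline-inference sketch with vLLM; the model name is an assumption, so substitute any Hugging Face model you have access to.

```python
# Minimal vLLM offline-inference sketch; "facebook/opt-125m" is just a small
# placeholder model to keep the example lightweight.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")
params = SamplingParams(temperature=0.7, max_tokens=64)

# Batch generation: pass a list of prompts, get one result per prompt.
outputs = llm.generate(["Summarize what MLOps means in one sentence."], params)
print(outputs[0].outputs[0].text)
```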
Learn how to design, develop, deploy and iterate on production-grade ML applications.
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
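A minimal sketch of how a workflow is authored as a DAG, assuming Airflow 2.4+ (for the `schedule` argument); the DAG id and the training callable are illustrative placeholders.

```python
# Tiny Airflow DAG sketch: one daily task defined in code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def train_model():
    # Placeholder for the actual training step.
    print("training step goes here")


with DAG(
    dag_id="ml_training_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="train_model", python_callable=train_model)
```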
Qdrant - High-performance, massive-scale Vector Database and Vector Search Engine for the next generation of AI. Also available in the cloud https://cloud.qdrant.io/
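A quick sketch of the vector-search workflow using the Qdrant Python client in in-memory mode; the collection name, vectors, and payloads are made up for illustration.

```python
# In-memory Qdrant example: create a collection, insert points, run a search.
from qdrant_client import QdrantClient, models

client = QdrantClient(":memory:")  # local, no server needed

client.create_collection(
    collection_name="docs",
    vectors_config=models.VectorParams(size=4, distance=models.Distance.COSINE),
)
client.upsert(
    collection_name="docs",
    points=[
        models.PointStruct(id=1, vector=[0.1, 0.2, 0.3, 0.4], payload={"tag": "mlops"}),
        models.PointStruct(id=2, vector=[0.9, 0.1, 0.1, 0.2], payload={"tag": "llm"}),
    ],
)

# Nearest-neighbor search against the stored vectors.
hits = client.search(collection_name="docs", query_vector=[0.1, 0.2, 0.3, 0.4], limit=1)
print(hits[0].payload)
```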
Label Studio is a multi-type data labeling and annotation tool with standardized output format
The open source developer platform to build AI/LLM applications and models with confidence. Enhance your AI applications with end-to-end tracking, observability, and evaluations, all in one integrated platform.
☁️ Build multimodal AI applications with a cloud-native stack
A curated list of awesome open source libraries to deploy, monitor, version and scale your machine learning
Turns data and AI algorithms into production-ready web applications in no time.
Workflow Engine for Kubernetes
Machine Learning Engineering Open Book
Weaviate is an open-source vector database that stores both objects and vectors, letting you combine vector search with structured filtering while providing the fault tolerance and scalability of a cloud-native database.
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression and hyper-parameter tuning.
An orchestration platform for the development, production, and observation of data assets.
A curated list of references for MLOps
Free MLOps course from DataTalks.Club
Run any open-source LLMs, such as DeepSeek and Llama, as OpenAI-compatible API endpoints in the cloud.
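Because the endpoint is OpenAI-compatible, the standard OpenAI client can be pointed at it; in this sketch the base_url, api_key, and model name are deployment-specific assumptions.

```python
# Calling a self-hosted, OpenAI-compatible endpoint with the standard OpenAI client.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:3000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="llama-3.1-8b-instruct",  # replace with whatever model the server exposes
    messages=[{"role": "user", "content": "What does MLOps cover?"}],
)
print(resp.choices[0].message.content)
```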
End-to-end, code-first tutorials covering every layer of production-grade GenAI agents, from first prototype to scaled deployment, with proven patterns and reusable blueprints for real-world launches.
Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker.
Always know what to expect from your data.