Workflow Engine for Kubernetes
Updated May 10, 2024 - Go
Weaviate is an open-source vector database that stores both objects and vectors, combining vector search with structured filtering and the fault tolerance and scalability of a cloud-native database.
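The combination that blurb describes, filtering candidates on structured metadata before ranking them by vector similarity, can be sketched in a few lines. This is a hypothetical in-memory illustration of the general idea, not Weaviate's actual API; all names and structures here are invented for the example.

```python
import numpy as np

def filtered_vector_search(vectors, metadata, query, predicate, k=2):
    """Return indices of the k nearest vectors whose metadata passes the filter.

    Toy sketch of combining structured filtering with vector search;
    not Weaviate's API, just the underlying idea.
    """
    # Structured filter first: keep only candidates matching the predicate.
    candidates = [i for i, m in enumerate(metadata) if predicate(m)]
    # Vector search: rank the remaining candidates by cosine similarity.
    q = query / np.linalg.norm(query)
    sims = {i: float(vectors[i] @ q / np.linalg.norm(vectors[i]))
            for i in candidates}
    return sorted(sims, key=sims.get, reverse=True)[:k]

vectors = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]])
metadata = [{"lang": "en"}, {"lang": "de"}, {"lang": "en"}]
hits = filtered_vector_search(vectors, metadata, np.array([1.0, 0.0]),
                              predicate=lambda m: m["lang"] == "en")
```

A production vector database applies the same pattern over an approximate-nearest-neighbor index rather than a brute-force scan.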
Scalable and flexible workflow orchestration platform that seamlessly unifies data, ML, and analytics stacks.
Determined is an open-source machine learning platform that simplifies distributed training, hyperparameter tuning, experiment tracking, and resource management. It works with PyTorch and TensorFlow.
🏕️ Reproducible development environment
Distributed ML Training and Fine-Tuning on Kubernetes
Automated Machine Learning on Kubernetes
The open source, end-to-end computer vision platform. Label, build, train, tune, deploy and automate in a unified platform that runs on any cloud and on-premises.
Aqueduct is no longer maintained. It allows you to run LLM and ML workloads on any cloud infrastructure.
☁️ Terraform plugin for machine learning workloads: spot instance recovery & auto-termination | AWS, GCP, Azure, Kubernetes
One-click machine learning deployment (LLM, text-to-image, and so on) at scale on any cluster (GCP, AWS, Lambda Labs, your home lab, or even a single machine).
Tools for easing the handoff between AI/ML and App/SRE teams.
A lightweight CLI tool for versioning data alongside source code and building data pipelines.
Fine-tune LLMs on K8s by using Runbooks.
Kubernetes-friendly ML model management, deployment, and serving.
Transform your Pythonic research into an artifact that engineers can deploy easily.
A lightweight tool to get an AI infrastructure stack up in minutes, not days. K3ai takes care of setting up K8s for you, deploys the AI tool of your choice, and can even run your code on it.
Experiment tracking server focused on speed and scalability.
Machine Learning Operator & Controller for Kubernetes