Run Generative AI models with a simple C++/Python API using OpenVINO Runtime
Unified runtime-adapter image for the sidecar containers that run in ModelMesh pods
OpenVINO operator for OpenShift and Kubernetes
OpenVINO™ is an open source toolkit for optimizing and deploying AI inference
Pre-trained Deep Learning models and demos (high quality and extremely fast)
A scalable inference server for models optimized with OpenVINO™
OpenVINO on Ubuntu Containers demo running a Coloriser app on MicroK8s
Inference Model Manager for Kubernetes