KServe is a standard model inference platform on Kubernetes, built for highly scalable use cases. It provides a performant, standardized inference protocol across ML frameworks, including TensorFlow, PyTorch, scikit-learn, XGBoost, and more.
- Type: Index
- Position: Consuming
- Access: 3rd-Party
- Tags: Machine Learning, Kubernetes, Model Serving, Inference, MLOps
- Created: 2025-01-01
- Modified: 2026-03-16
KServe's standardized model inference protocol for serving predictions across multiple ML frameworks on Kubernetes.
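The standardized protocol referenced here is KServe's Open Inference Protocol (v2), in which every model exposes the same `/v2/models/{model}/infer` endpoint regardless of framework. A minimal sketch of a v2 prediction request payload, assuming a hypothetical host and a model named `sklearn-iris` (both illustrative, not from the source):

```python
import json

# Hypothetical serving host and model name, for illustration only.
host = "http://kserve.example.com"
model = "sklearn-iris"
infer_url = f"{host}/v2/models/{model}/infer"

# Open Inference Protocol (v2) request body: each input declares a
# name, a shape, a datatype, and a flat list of data values.
payload = {
    "inputs": [
        {
            "name": "input-0",
            "shape": [1, 4],
            "datatype": "FP64",
            "data": [6.8, 2.8, 4.8, 1.4],
        }
    ]
}

# Serialize to JSON; this body would be POSTed to infer_url.
body = json.dumps(payload)
print(infer_url)
print(body)
```

Because the request shape is identical for a TensorFlow, PyTorch, or XGBoost model, clients need no framework-specific code, only the model name in the URL.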
Human URL: https://kserve.github.io/website/
- Tags: Inference, Model Serving
FN: Kin Lane
Email: kin@apievangelist.com