The Inference Gateway ADK, enabling seamless creation of A2A-compatible agents
Updated Jun 20, 2025 - Go
An SDK written in Rust for the Inference Gateway
The UI for the inference-gateway, providing a user-friendly interface to interact with the gateway, visualize inference results, and manage models
A2A-compatible agent enabling Google Calendar scheduling, retrieval, and automation
This project provides a Kubernetes Operator for managing the lifecycle of the inference-gateway and its related components. It simplifies deployment, configuration, and scaling of the gateway within Kubernetes clusters, enabling seamless integration of inference workflows.
Extensive documentation of the inference-gateway
An SDK written in TypeScript for the Inference Gateway
An SDK written in Python for the Inference Gateway
An open-source, high-performance gateway unifying multiple LLM providers, from local solutions like Ollama to major cloud providers such as OpenAI, Groq, Cohere, Anthropic, Cloudflare, and DeepSeek.
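A unifying gateway like this typically means the client code stays the same regardless of which provider serves the request. The sketch below illustrates that idea in Go; it is a minimal sketch, assuming the gateway exposes an OpenAI-compatible chat-completions endpoint at `/v1/chat/completions` on `localhost:8080` — the URL, path, field names, and model identifiers are all assumptions for illustration, not the project's documented API.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatRequest is a hypothetical request shape, assuming an
// OpenAI-compatible chat-completions payload.
type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// buildRequest constructs an HTTP request against an assumed
// gateway endpoint; it does not send it.
func buildRequest(gatewayURL, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(chatRequest{
		Model:    model,
		Messages: []message{{Role: "user", Content: prompt}},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest(http.MethodPost,
		gatewayURL+"/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	// Switching providers is just a change of model identifier;
	// the identifiers below are placeholders.
	for _, model := range []string{"ollama/llama3", "openai/gpt-4o"} {
		req, err := buildRequest("http://localhost:8080", model, "Hello")
		if err != nil {
			panic(err)
		}
		fmt.Println(req.Method, req.URL.Path, "model:", model)
	}
}
```

The point of the sketch is the loop in `main`: the request-building code is provider-agnostic, and only the model string changes between a local Ollama deployment and a cloud provider.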