# SteadyHelm

A Kubernetes MCP (Model Context Protocol) server that bridges AI agents to live cluster state. Lets Gemini, Claude, or GPT-4 reason over your namespaces, pods, events, and HPA metrics — in real time — without `kubectl proxy` hacks.

Part of the Warble Cloud open-source ecosystem. Powers the ShrikeOps AI agent bridge inside the Reflexion Engine.


## Features

- **Structured cluster context for LLMs** — all resources serialised as clean text the model can reason over
- **Read-only & read-write modes** — write tools (`scale_deployment`, `delete_pod`) are only registered when `--mode=readwrite`
- **RBAC gates** — runs under your existing kubeconfig or a scoped ServiceAccount inside the cluster
- **stdio + HTTP transports** — drop into Claude Desktop / Cursor via stdio, or expose as an HTTP sidecar
- **Metrics-server aware** — live CPU/memory per pod and node (gracefully degrades if metrics-server is absent)
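The mode gating above can be sketched as a simple registration step: write tools are only appended to the tool set when the server starts in read-write mode. This is an illustrative, self-contained sketch — the `Tool` type, the abbreviated tool list, and `registerTools` are assumptions, not SteadyHelm's actual internals.

```go
package main

import "fmt"

// Tool pairs a tool name with whether it mutates cluster state.
// (Hypothetical type for illustration; the real server presumably
// wires tools into its MCP SDK instead.)
type Tool struct {
	Name  string
	Write bool
}

// registerTools builds the tool set for a mode. The read tool list is
// abbreviated here; write tools are only included under "readwrite".
func registerTools(mode string) []Tool {
	tools := []Tool{
		{Name: "list_pods"},
		{Name: "cluster_summary"},
	}
	if mode == "readwrite" {
		tools = append(tools,
			Tool{Name: "scale_deployment", Write: true},
			Tool{Name: "delete_pod", Write: true},
		)
	}
	return tools
}

func main() {
	// In read-only mode, only the read tools are ever exposed to the model.
	for _, t := range registerTools("readonly") {
		fmt.Println(t.Name)
	}
}
```

The point of registering (rather than merely rejecting) write tools is that in read-only mode the model never even sees `scale_deployment` or `delete_pod` in its tool list, so it cannot attempt them.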

## Available Tools

| Tool | Mode | Description |
|------|------|-------------|
| `list_namespaces` | read | All namespaces and their phase |
| `list_pods` | read | Pods with ready/total, phase, restart count |
| `get_pod` | read | Full pod status, conditions, container states |
| `get_pod_logs` | read | Tail N lines from a container |
| `list_nodes` | read | Node status, roles, kernel version |
| `list_events` | read | Events with field-selector support |
| `list_deployments` | read | Deployments with replica counts |
| `list_hpas` | read | HPA min/max/current/desired replicas |
| `list_pod_metrics` | read | Live CPU + memory per pod |
| `list_node_metrics` | read | Live CPU + memory per node |
| `list_services` | read | Services with type and port mappings |
| `get_configmap` | read | ConfigMap key/value data |
| `cluster_summary` | read | Health snapshot: nodes, pod phases, warnings, HPA saturation |
| `scale_deployment` | write | Scale a Deployment to N replicas |
| `delete_pod` | write | Delete a pod (triggers restart) |
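Because this is an MCP server, each tool in the table is invoked via a JSON-RPC 2.0 `tools/call` request over the chosen transport. A sketch of what a `list_pods` call might look like on the wire (the `namespace` argument name is illustrative — check the tool's actual schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "list_pods",
    "arguments": { "namespace": "production" }
  }
}
```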

## Quick Start

Install with `go install`:

```sh
go install github.com/warblecloud/steadyhelm@latest
```

Run against the current kubeconfig (read-only, stdio):

```sh
steadyhelm --mode=readonly --transport=stdio
```

Run as an HTTP server (e.g. as a sidecar):

```sh
steadyhelm --mode=readwrite --transport=http --addr=:8811
```

Restrict to a single namespace:

```sh
steadyhelm --namespace=production --transport=stdio
```

## Claude Desktop / Cursor Configuration

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "steadyhelm": {
      "command": "steadyhelm",
      "args": ["--mode=readonly", "--transport=stdio"],
      "env": {
        "KUBECONFIG": "/Users/you/.kube/config"
      }
    }
  }
}
```

## In-Cluster Deployment (minimal RBAC)

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: steadyhelm
  namespace: steadyhelm
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: steadyhelm-readonly
rules:
  - apiGroups: [""]
    resources: ["namespaces", "pods", "pods/log", "nodes", "events", "services", "configmaps"]
    verbs: ["get", "list"]
  - apiGroups: ["apps"]
    resources: ["deployments", "deployments/scale"]
    verbs: ["get", "list"]
  - apiGroups: ["autoscaling"]
    resources: ["horizontalpodautoscalers"]
    verbs: ["get", "list"]
  - apiGroups: ["metrics.k8s.io"]
    resources: ["pods", "nodes"]
    verbs: ["get", "list"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: steadyhelm-readonly
subjects:
  - kind: ServiceAccount
    name: steadyhelm
    namespace: steadyhelm
roleRef:
  kind: ClusterRole
  name: steadyhelm-readonly
  apiGroup: rbac.authorization.k8s.io
```
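The ClusterRole above is scoped for read-only mode. If you run with `--mode=readwrite`, the two write tools need additional verbs — a sketch of the extra rules you would append (standard Kubernetes RBAC; adjust to taste):

```yaml
# Additional rules for --mode=readwrite
- apiGroups: ["apps"]
  resources: ["deployments/scale"]
  verbs: ["update", "patch"]   # needed by scale_deployment
- apiGroups: [""]
  resources: ["pods"]
  verbs: ["delete"]            # needed by delete_pod
```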

## Building

```sh
go build -o steadyhelm .
```

Docker:

```sh
docker build -t steadyhelm:latest .
docker run --rm -v ~/.kube:/root/.kube steadyhelm:latest \
  --mode=readonly --transport=http --addr=:8811
```

## Architecture

```
LLM (Gemini / Claude / GPT-4)
        │  tool call (JSON-RPC 2.0)
        ▼
  SteadyHelm MCP Server
  ├── stdio transport   ← Claude Desktop, Cursor
  └── HTTP transport    ← ShrikeOps sidecar, custom agents
        │
        ▼
  k8s client-go
  ├── Core API (pods, nodes, events, services, configmaps)
  ├── Apps API (deployments/scale)
  ├── Autoscaling API (HPAs)
  └── Metrics API (pod + node metrics)
        │
        ▼
  Kubernetes API Server
```

## License

Apache 2.0 — see [LICENSE](LICENSE).

Built with ❤️ by Warble Cloud / ChirpStack LLP.
