We are a Swiss-based engineering studio focused on AI integration and DevOps infrastructure.
Our mission is to bridge the gap between "demo AI" and production infrastructure. We build tools that are privacy-first, self-hosted, and designed for environments where data egress is not an option.
- Local LLM Infrastructure: running inference on standard Kubernetes nodes (CPU/GPU).
- Observability: enhancing the LGTM stack (Loki, Grafana, Tempo, Mimir) with semantic intelligence.
- MCP Servers: building Model Context Protocol interfaces that let AI agents talk to infrastructure.
- LogDistill – The local AI engine for Loki. Summarizes 10,000 log lines into 5 insights without sending data to the cloud.
- More internal tools are being open-sourced soon.
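To make the "10,000 logs into 5 insights" idea concrete, here is a minimal sketch of local log distillation. This is an illustrative toy, not LogDistill's actual algorithm: it masks variable fields (digit runs) so that log lines differing only in IDs, counts, or timestamps collapse into one template, then reports the most frequent templates. Everything runs in-process, so no data leaves the machine.

```python
import re
from collections import Counter

def distill(log_lines, top_n=5):
    """Collapse raw log lines into their top-N recurring templates.

    Digit runs are replaced with a placeholder so lines that differ
    only in variable fields (IDs, ports, latencies) count as one
    pattern. Runs entirely locally; nothing is sent anywhere.
    """
    mask = re.compile(r"\d+")
    counts = Counter(mask.sub("<*>", line) for line in log_lines)
    return counts.most_common(top_n)

logs = [
    "GET /api/user/42 200 12ms",
    "GET /api/user/7 200 9ms",
    "GET /api/user/99 500 31ms",
    "connection reset by peer: 10.0.0.5",
    "connection reset by peer: 10.0.0.8",
]

# Five raw lines collapse to two templates with counts.
insights = distill(logs)
```

A production pipeline would add a real template miner and a local LLM to phrase the insights, but the shape is the same: aggregate first, then summarize, all on-node.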
"Hands-on implementation, not slide decks."

Website • Contact