Stars
An open-source runtime for composable workflows. Great for AI agents and CI/CD.
A highly customizable homepage (or startpage / application dashboard) with Docker and service API integrations.
MCP Hub and Bridge: multi-model workflow and chat interface
🧠 Roo Code Memory Bank: Seamless project context in Roo Code. No more repetition, just continuous development. Includes Architect, Code, Ask, Debug and Test modes!
Command and Conquer: Generals - Zero Hour
Command and Conquer Tiberian Dawn
Completely free, private, UI-based tech documentation MCP server. Designed with coders and software developers in mind. Easily integrates into Cursor, Windsurf, Cline, Roo Code, and the Claude Desktop App.
A desktop application that adds powerful tools to Claude and AI platforms
A privacy-first, self-hosted, fully open-source personal knowledge management software, written in TypeScript and Go.
Enable LLMs to Program Themselves.
Call another MCP client from your MCP client. Offload context windows, delegate tasks, split between models
This extension allows you to interact with DeepSeek models in GitHub Copilot Chat, fully locally and offline. It uses Ollama under the hood to provide a seamless experience.
[CVPR 2025] Magma: A Foundation Model for Multimodal AI Agents
OctoTools: An agentic framework with extensible tools for complex reasoning
The AI runtime that turns your framework functions into OpenAI-compatible endpoints
An MCP server & prompt runner for all of Docker. Simple Markdown. BYO LLM.
Reliable LLM Memory for AI Applications and AI Agents
A Model Context Protocol (MCP) server implementation, built with the Python SDK, that provides persistent note management.