Caplets is a progressive-disclosure gateway for Model Context Protocol (MCP) servers, native OpenAPI endpoints, and native GraphQL endpoints.
Instead of connecting an MCP client to many downstream servers or HTTP APIs and exposing every operation up front, Caplets exposes one top-level tool per configured capability. An agent first chooses a capability domain, then asks Caplets to list, search, inspect, or call that backend's underlying tools or operations.
This keeps the initial MCP tool list small, makes tool selection easier, and avoids flattened tool-name collisions across servers.
Caplets is a mashup of two ideas that each work well on their own but leave a gap between them: agent skills and MCP servers.
Agent skills are great at progressive disclosure. They show an agent a compact capability card first, then let it read deeper instructions only when that skill is relevant. MCP servers are great at live tool execution, but most clients expose their tools as one flat list up front. That means a powerful MCP setup can flood the agent with every tool from every server before it knows which capability area matters.
Caplets borrows the skill-shaped discovery model and applies it to MCP. Each downstream server becomes a skill-like capability card first; its actual MCP tools stay hidden until the agent chooses that server and asks to search, list, inspect, or call them.
- Reads downstream MCP server definitions, native OpenAPI endpoint definitions, and native GraphQL endpoint definitions from `~/.caplets/config.json`.
- Registers one generated MCP tool for each enabled MCP server, OpenAPI endpoint, or GraphQL endpoint.
- Uses the configured server ID as the generated tool name.
- Uses the configured `name` and `description` as the capability card shown to agents.
- Starts downstream MCP servers and loads OpenAPI specs lazily when an operation needs them.
- Supports stdio, Streamable HTTP, and legacy HTTP+SSE downstream servers.
- Lets agents `list_tools`, `search_tools`, `get_tool`, and `call_tool` within one selected Caplet namespace.
- Converts OpenAPI operations into MCP-style tool metadata and executes HTTP calls directly.
- Converts configured GraphQL operations into MCP-style tool metadata, and can auto-generate GraphQL tools from schema root query and mutation fields.
- Preserves downstream tool results instead of rewriting them into a custom format.
- Redacts secrets from structured errors.
- Supports static remote auth and OAuth token storage for remote servers.
Caplets requires Node.js 22 or newer.
```sh
pnpm add -g caplets
```

For local development from this repository:

```sh
pnpm install
pnpm build
```

Create a starter `~/.caplets/config.json`:

```sh
caplets init
```

The generated config includes a disabled example server. Replace it with the MCP servers you want Caplets to expose:

```json
{
  "$schema": "https://raw.githubusercontent.com/spiritledsoftware/caplets/main/schemas/caplets-config.schema.json",
  "version": 1,
  "defaultSearchLimit": 20,
  "maxSearchLimit": 50,
  "mcpServers": {
    "filesystem": {
      "name": "Project Files",
      "description": "Read, search, and edit local project files.",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/you/code"],
      "cwd": "/home/you/code",
      "startupTimeoutMs": 10000,
      "callTimeoutMs": 60000,
      "toolCacheTtlMs": 30000
    },
    "docs": {
      "name": "Hosted Docs",
      "description": "Search hosted product and API documentation.",
      "transport": "http",
      "url": "https://mcp.example.com/mcp",
      "auth": {
        "type": "bearer",
        "token": "$env:DOCS_MCP_TOKEN"
      }
    }
  },
  "openapiEndpoints": {
    "users": {
      "name": "Users API",
      "description": "Manage users through the internal HTTP API.",
      "specPath": "./openapi.json",
      "baseUrl": "https://api.example.com",
      "auth": {
        "type": "bearer",
        "token": "$env:USERS_API_TOKEN"
      }
    }
  },
  "graphqlEndpoints": {
    "catalog": {
      "name": "Catalog GraphQL",
      "description": "Query and update catalog records through GraphQL.",
      "endpointUrl": "https://api.example.com/graphql",
      "introspection": true,
      "auth": {
        "type": "oidc",
        "issuer": "https://login.example.com"
      }
    }
  }
}
```

The default config path can be overridden with `CAPLETS_CONFIG`:

```sh
CAPLETS_CONFIG=/path/to/config.json caplets init
CAPLETS_CONFIG=/path/to/config.json caplets serve
```

Inspect the installed CLI version and resolved config locations:
```sh
caplets --version
caplets config path
caplets config paths
caplets config paths --json
```

Caplets validates this file at startup. Config changes take effect after restarting the Caplets MCP server.
The optional `$schema` field points editors at the generated JSON Schema in `schemas/caplets-config.schema.json`. CI verifies that the committed schema stays in sync with the Zod config validator.
For richer skill-like cards, add Markdown Caplet files beside `config.json`. Every Caplet file must include exactly one executable backend: `mcpServer`, `openapiEndpoint`, or `graphqlEndpoint`; serverless Caplets are intentionally out of scope.
Top-level files derive the Caplet ID from the filename:

```markdown
---
$schema: https://raw.githubusercontent.com/spiritledsoftware/caplets/main/schemas/caplet.schema.json
name: GitHub
description: Interact with GitHub repositories, issues, and pull requests.
tags:
  - code
  - review
mcpServer:
  command: npx
  args: ["-y", "github-mcp-server"]
---

# GitHub

Use this Caplet for repository, issue, pull request, and code review workflows.
```

OpenAPI-backed Caplet files use `openapiEndpoint`:
```markdown
---
name: Users API
description: Manage users through the internal HTTP API.
openapiEndpoint:
  specPath: ./openapi.json
  baseUrl: https://api.example.com
  auth:
    type: bearer
    token: $env:USERS_API_TOKEN
---

# Users API
```

GraphQL-backed Caplet files use `graphqlEndpoint`:
```markdown
---
name: Catalog GraphQL
description: Query and update catalog records through GraphQL.
graphqlEndpoint:
  endpointUrl: https://api.example.com/graphql
  schemaPath: ./schema.graphql
  auth:
    type: oidc
    issuer: https://login.example.com
---

# Catalog GraphQL
```

Directory-style Caplets use `linear/CAPLET.md`, which is exposed as `linear`; sibling files can be referenced with normal Markdown links from `CAPLET.md`.
This repository includes polished working examples under `caplets/`:

- `github`: GitHub's official MCP server container, using `GITHUB_PERSONAL_ACCESS_TOKEN`.
- `linear`: Linear's hosted OAuth MCP endpoint.
- `context7`: Context7 documentation lookup through `@upstash/context7-mcp`.
Install every example from a repo's `caplets/` directory:

```sh
caplets install spiritledsoftware/caplets
```

Install one or more individual Caplets by ID:

```sh
caplets install spiritledsoftware/caplets github
caplets install spiritledsoftware/caplets github linear
```

`caplets install` accepts a GitHub `owner/repo` shorthand, a Git URL, or a local repository path. It installs into your user Caplets root, which is `~/.caplets` by default or the parent directory of `CAPLETS_CONFIG` when that environment variable is set. Existing Caplets are not overwritten unless `--force` is passed.
Caplets always loads user Caplet files from `~/.caplets`. Project `./.caplets/config.json` is still loaded as project config, but project Markdown Caplet files are executable configuration and are ignored unless explicitly trusted:

```sh
CAPLETS_TRUST_PROJECT_CAPLETS=1 caplets serve
```

Later sources override earlier ones in this order: user `config.json`, user Caplet files, project `config.json`, and, only when trusted, project Caplet files.
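That override order amounts to a last-wins merge keyed by Caplet ID. A minimal sketch of the layering (the type and function names here are illustrative, not Caplets' internals, and the real loader also validates each layer):

```typescript
// Sketch of last-wins config layering keyed by Caplet ID.
type CapletDef = { name: string; description: string };
type Layer = Record<string, CapletDef>;

// Later layers override earlier ones: user config.json, user Caplet files,
// project config.json, then (only when trusted) project Caplet files.
function mergeLayers(...layers: Layer[]): Layer {
  const merged: Layer = {};
  for (const layer of layers) {
    for (const [id, def] of Object.entries(layer)) {
      merged[id] = def; // the last definition seen for an ID wins
    }
  }
  return merged;
}
```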
`caplets init` refuses to overwrite an existing config. To intentionally replace the file:

```sh
caplets init --force
```

Each key under `mcpServers`, `openapiEndpoints`, or `graphqlEndpoints` is the stable Caplet ID. It becomes the generated MCP tool name exactly, so keep it short and specific:

```json
{
  "mcpServers": {
    "linear": {
      "name": "Linear",
      "description": "Read and update Linear issues and projects.",
      "command": "npx",
      "args": ["-y", "linear-mcp-server"]
    }
  }
}
```

Caplet IDs must match `^[a-zA-Z0-9_-]{1,64}$` and must be unique across `mcpServers`, `openapiEndpoints`, and `graphqlEndpoints`. Spaces, dots, slashes, colons, and Unicode IDs are rejected.
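The ID rule is just the regex above applied to each key. A quick sketch (the function name is illustrative):

```typescript
// Caplet IDs: 1-64 characters of ASCII letters, digits, underscore, or hyphen.
const CAPLET_ID_PATTERN = /^[a-zA-Z0-9_-]{1,64}$/;

function isValidCapletId(id: string): boolean {
  return CAPLET_ID_PATTERN.test(id);
}
```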
Use `command` for a local stdio MCP server. `args`, `env`, and `cwd` are optional.

```json
{
  "name": "Local Tools",
  "description": "Run local development tools through stdio.",
  "command": "node",
  "args": ["./server.mjs"],
  "env": {
    "API_TOKEN": "${API_TOKEN}"
  },
  "cwd": "/home/you/project"
}
```

Use `transport` and `url` for remote MCP servers.
```json
{
  "name": "Remote Docs",
  "description": "Search documentation from a remote MCP server.",
  "transport": "http",
  "url": "https://mcp.example.com/mcp",
  "auth": {
    "type": "headers",
    "headers": {
      "x-api-key": "$env:REMOTE_DOCS_API_KEY"
    }
  }
}
```

`transport` can be `http` for MCP Streamable HTTP or `sse` for legacy HTTP+SSE. Remote URLs must use `https://`, except loopback development URLs such as `http://localhost`.
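The URL rule can be sketched as a small check. This is illustrative only; treating `127.0.0.1` and `[::1]` as loopback alongside `localhost` is an assumption beyond the documented example:

```typescript
// Remote MCP URLs must be https://, except loopback development hosts.
function isAllowedRemoteUrl(raw: string): boolean {
  try {
    const url = new URL(raw);
    if (url.protocol === "https:") return true;
    // Allow plain http only for loopback development hosts (assumed list).
    return (
      url.protocol === "http:" &&
      ["localhost", "127.0.0.1", "[::1]"].includes(url.hostname)
    );
  } catch {
    return false; // not a parseable URL at all
  }
}
```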
Use `openapiEndpoints` for native HTTP APIs described by OpenAPI 3 specs. Each entry points at one spec through either `specPath` or `specUrl`, and may override the request base URL with `baseUrl`.
```json
{
  "name": "Users API",
  "description": "Manage users through the internal HTTP API.",
  "specPath": "./openapi.json",
  "baseUrl": "https://api.example.com",
  "auth": { "type": "none" }
}
```

OpenAPI auth is explicit and supports:

- `{"type": "none"}`
- `{"type": "bearer", "token": "$env:TOKEN"}`
- `{"type": "headers", "headers": {"x-api-key": "$env:API_KEY"}}`
- `{"type": "oauth2", ...}`
- `{"type": "oidc", ...}`
OpenAPI `call_tool.arguments` uses grouped HTTP inputs:

```json
{
  "operation": "call_tool",
  "tool": "GET /users/{id}",
  "arguments": {
    "path": { "id": "42" },
    "query": { "active": true },
    "body": { "name": "Ada" }
  }
}
```

Every OpenAPI endpoint can set:

- `requestTimeoutMs`: timeout for HTTP calls. Defaults to `60000`.
- `operationCacheTtlMs`: how long OpenAPI operation metadata stays fresh. Defaults to `30000`; `0` refreshes every time.
- `disabled`: omit the endpoint from Caplets discovery. Defaults to `false`.
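The grouped `path` and `query` arguments map onto the request URL in the obvious way. A sketch of that mapping only (the helper name is illustrative; the real executor also handles headers, body serialization, and auth):

```typescript
// Substitute {param} placeholders in the path template and append query
// parameters, producing the final request URL.
function buildRequestUrl(
  baseUrl: string,
  pathTemplate: string, // e.g. "/users/{id}"
  path: Record<string, string> = {},
  query: Record<string, string | number | boolean> = {},
): string {
  const resolvedPath = pathTemplate.replace(/\{(\w+)\}/g, (_, name) => {
    const value = path[name];
    if (value === undefined) throw new Error(`missing path parameter: ${name}`);
    return encodeURIComponent(value);
  });
  const url = new URL(resolvedPath, baseUrl);
  for (const [key, value] of Object.entries(query)) {
    url.searchParams.set(key, String(value));
  }
  return url.toString();
}
```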
Use `graphqlEndpoints` for native GraphQL APIs. Each entry points at a GraphQL HTTP endpoint and exactly one schema source: `schemaPath`, `schemaUrl`, or `introspection: true`.
```json
{
  "name": "Catalog GraphQL",
  "description": "Query and update catalog records through GraphQL.",
  "endpointUrl": "https://api.example.com/graphql",
  "schemaPath": "./schema.graphql",
  "auth": { "type": "oidc", "issuer": "https://login.example.com" },
  "operations": {
    "product": {
      "document": "query Product($id: ID!) { product(id: $id) { id name } }",
      "operationName": "Product",
      "description": "Fetch a product by ID."
    }
  }
}
```

When `operations` is omitted or empty, Caplets auto-generates tools from schema root fields: `query_<field>` and `mutation_<field>`. Generated tools use bounded scalar selection sets and pass `call_tool.arguments` directly as GraphQL variables/root-field arguments.
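Calling a configured operation amounts to a standard GraphQL-over-HTTP POST, with `call_tool.arguments` passed straight through as the `variables` object. A sketch of the request body for the `product` operation above (helper name illustrative):

```typescript
// Build the JSON body for a GraphQL HTTP POST request.
function buildGraphqlBody(
  document: string,
  variables: Record<string, unknown>,
  operationName?: string,
): string {
  // JSON.stringify drops operationName when it is undefined.
  return JSON.stringify({ query: document, variables, operationName });
}
```

The body would then be sent as `POST <endpointUrl>` with a `Content-Type: application/json` header plus whatever auth the endpoint configures.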
Every GraphQL endpoint can set:

- `requestTimeoutMs`: timeout for HTTP calls. Defaults to `60000`.
- `operationCacheTtlMs`: how long GraphQL operation metadata stays fresh. Defaults to `30000`; `0` refreshes every time.
- `selectionDepth`: maximum depth for generated selection sets. Defaults to `2`; maximum `5`.
- `disabled`: omit the endpoint from Caplets discovery. Defaults to `false`.
Remote servers can use:

- `{"type": "none"}`
- `{"type": "bearer", "token": "$env:TOKEN"}`
- `{"type": "headers", "headers": {"x-api-key": "$env:API_KEY"}}`
- `{"type": "oauth2", ...}`
- `{"type": "oidc", ...}`
For OAuth/OIDC-backed MCP, OpenAPI, and GraphQL Caplets, authenticate once with:

```sh
caplets auth login <server>
```

For headless terminals:

```sh
caplets auth login <server> --no-open
```

OAuth/OIDC tokens are stored under `~/.caplets/auth/<server>.json` with owner-only file permissions where the platform supports them. Caplets supports well-known OAuth/OIDC discovery and dynamic client registration when advertised. When a token expires, run `caplets auth login <server>` again.
To inspect or remove stored OAuth credentials:

```sh
caplets auth list
caplets auth logout <server>
```

To list configured Caplets without starting downstream backends:

```sh
caplets list
caplets list --all
caplets list --json
```

Every server can set:

- `startupTimeoutMs`: timeout for starting or checking the downstream server. Defaults to `10000`.
- `callTimeoutMs`: timeout for downstream tool calls. Defaults to `60000`.
- `toolCacheTtlMs`: how long downstream tool metadata stays fresh. Defaults to `30000`; `0` refreshes every time.
- `disabled`: omit the server from Caplets discovery. Defaults to `false`.
Configure your MCP client to run Caplets as a stdio server:

```json
{
  "mcpServers": {
    "caplets": {
      "command": "caplets",
      "args": ["serve"]
    }
  }
}
```

If your client starts the configured command directly, `caplets` without arguments also starts the MCP server. `serve` is explicit and recommended for clarity.
Caplets initially exposes one MCP tool per enabled Caplet. If the config has `filesystem`, `docs`, and `users`, the client sees three top-level tools: `filesystem`, `docs`, and `users`.
Each generated Caplet tool accepts an `operation`:

```json
{
  "operation": "list_tools"
}
```

Search within a selected server:

```json
{
  "operation": "search_tools",
  "query": "read file",
  "limit": 10
}
```

Inspect one exact downstream tool:

```json
{
  "operation": "get_tool",
  "tool": "read_file"
}
```

Call one exact downstream tool:

```json
{
  "operation": "call_tool",
  "tool": "read_file",
  "arguments": {
    "path": "/home/you/code/project/README.md"
  }
}
```

Available operations:

- `get_caplet`: return the configured capability card without starting the downstream server.
- `check_backend`: verify the selected backend, whether MCP, OpenAPI, or GraphQL.
- `check_mcp_server`: start or connect to an MCP server and verify its tool list.
- `list_tools`: return compact downstream tool metadata.
- `search_tools`: search downstream tool names and descriptions within this Caplet.
- `get_tool`: return full metadata for one exact downstream tool.
- `call_tool`: invoke one exact downstream tool with JSON object arguments.
Requests are strict: operation-specific extra fields are rejected, and `call_tool` requires `arguments` to be a JSON object.
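That strictness can be sketched as a plain check, simplified from whatever the real validator does (the type and function names are illustrative):

```typescript
// Reject unknown fields and require call_tool arguments to be a JSON object
// (not an array, null, or a primitive).
type CallToolRequest = {
  operation: "call_tool";
  tool: string;
  arguments: Record<string, unknown>;
};

function parseCallTool(input: Record<string, unknown>): CallToolRequest {
  const allowed = new Set(["operation", "tool", "arguments"]);
  for (const key of Object.keys(input)) {
    if (!allowed.has(key)) throw new Error(`unexpected field: ${key}`);
  }
  if (input.operation !== "call_tool") throw new Error("wrong operation");
  if (typeof input.tool !== "string") throw new Error("tool must be a string");
  const args = input.arguments;
  if (typeof args !== "object" || args === null || Array.isArray(args)) {
    throw new Error("arguments must be a JSON object");
  }
  return {
    operation: "call_tool",
    tool: input.tool,
    arguments: args as Record<string, unknown>,
  };
}
```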
```sh
pnpm install
pnpm dev
```

Useful commands:

```sh
pnpm build
pnpm test
pnpm typecheck
pnpm lint
pnpm format:check
pnpm schema:generate
pnpm schema:check
pnpm verify
```

`pnpm dev` rebuilds on source changes and restarts the local stdio MCP server from `dist/index.js`. Use it for local development, not as the command configured in an MCP client, because build logs are written to stdout.
The product requirements document lives at `docs/product/caplets-progressive-mcp-disclosure-prd.md`.
It describes the progressive MCP disclosure model, configuration rules, MVP tool surface,
security expectations, and non-goals.
Caplets intentionally does not provide a hosted service, GUI, cross-server flattened tool search, automatic MCP client config import, or namespaced flattened tool IDs such as `server.tool`.
Progressive disclosure is context management, not a security boundary. Caplets reduces the tool surface shown to the agent up front, but downstream MCP servers remain responsible for their own tool behavior and any client-side confirmations.
User-facing changes should include a changeset:

```sh
pnpm changeset
```

Merging changesets to `main` lets the release workflow open a version PR. Merging that version PR publishes the package to npm through trusted publishing.
MIT