fcvr1010/cardea

Cardea

Proxy to allow AI agents to safely access services, without the need for credentials.

To do useful work, our companion AI agents often need access to external services, such as Gmail, GitHub, Telegram, ...

There are two important questions to consider from a security standpoint:

  1. Is it safe to grant them that access at all?
  2. Is there a way to give the access without explicitly sharing credentials?

Point 1 is a personal decision. A safe approach is typically to create dedicated accounts for the bots rather than giving them access to our own personal accounts. This clearly comes with trade-offs, and everybody should think it through given their own specific situation.

Cardea helps with point 2. If you give a Large Language Model (LLM) your credentials, nothing guarantees that they won't end up in the provider's logs and/or leaked on the web. Despite the frontier labs' remarkable efforts to improve LLM security, the prompt-injection risk will always remain, as it is intrinsic to the computational model of LLMs, where "code" and data are not separated.

So the only safe way is to not share credentials at all. But then how can the agent access the service? Through a separate local proxy that handles the credentials. That's Cardea.

Cardea exposes local endpoints for, e.g., sending an email via Gmail. The agent just calls the endpoint with no auth; Cardea then injects the credentials and submits the actual request to the Gmail API.

Architecture

Request flow

Client (AI agent)
  │
  ▼
Cardea (FastAPI)
  ├─ Match route by prefix
  ├─ Load secret from /run/secrets/ or env var
  ├─ Inject credentials into upstream request
  │
  ▼
Upstream service (Gmail, GitHub, Telegram, …)

Every request flows through the same pattern: the agent calls a local Cardea endpoint with no authentication, Cardea looks up the appropriate secret and injects it (as a Bearer token, Basic auth header, query parameter, etc.), then streams the upstream response back to the caller.
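In code, that pattern amounts to a prefix match followed by header injection. A minimal sketch, assuming a bearer-token service (the route table and secret names here are illustrative, not Cardea's actual internals):

```python
# Illustrative route table: URL prefix -> (upstream base URL, secret name).
ROUTES = {
    "/github": ("https://api.github.com", "cardea_github_token"),
    "/gmail": ("https://gmail.googleapis.com", "cardea_gmail_token"),
}

def match_route(path: str):
    """Match an incoming path against configured prefixes (longest first)."""
    for prefix, (upstream, secret_name) in sorted(
        ROUTES.items(), key=lambda kv: len(kv[0]), reverse=True
    ):
        if path.startswith(prefix):
            return upstream + path[len(prefix):], secret_name
    return None

def inject_bearer(headers: dict, token: str) -> dict:
    """Return a copy of the headers with the credential injected."""
    return {**headers, "Authorization": f"Bearer {token}"}
```

The real proxy additionally streams the upstream response back to the caller and supports more auth schemes than a bearer token.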

Two kinds of services

Config-driven generic services are declared entirely in config.toml under [services.*] sections. Each section specifies a URL prefix, an upstream base URL, and an auth type. The generic proxy engine (cardea.proxies.generic) builds a catch-all route for each service at startup — no Python code required. This covers most REST API integrations.

Custom modules handle cases that need special logic (OAuth2 token refresh, protocol translation, multi-step flows). Each module is a Python file in src/cardea/proxies/ that exports:

| Export | Required | Description |
|--------|----------|-------------|
| router | Yes | A FastAPI APIRouter with the endpoints |
| PREFIX | No | URL prefix (defaults to /<module_name>) |
| TAG | No | OpenAPI tag (defaults to module name) |

Module auto-discovery

At startup, app.py iterates over all Python files in cardea.proxies using pkgutil.iter_modules. A module is loaded only if it is enabled in the [modules] table of config.toml (e.g. email = true). The browser credential manager is a special case — it loads automatically when a [browser] config section exists, without needing a [modules] entry.
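The discovery loop is roughly the following sketch. Here it runs against the stdlib's email package instead of cardea.proxies, and the enabled dict stands in for the [modules] table:

```python
import importlib
import pkgutil

import email as proxies_pkg  # stand-in for cardea.proxies in this sketch

enabled = {"mime": True}  # stand-in for the [modules] table, e.g. mime = true

loaded = []
for info in pkgutil.iter_modules(proxies_pkg.__path__):
    if enabled.get(info.name):  # skip modules not enabled in the config
        loaded.append(importlib.import_module(f"{proxies_pkg.__name__}.{info.name}"))
```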

Secrets

Secrets are resolved lazily on each request via cardea.secrets.get_secret: first as a file in /run/secrets/<name> (for container secret mounts), falling back to an environment variable of the same name.
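A minimal sketch of that lookup order (not the actual implementation; stripping whitespace from the file contents is an assumption here):

```python
import os
from pathlib import Path

SECRETS_DIR = Path("/run/secrets")

def get_secret(name: str) -> str:
    """Resolve a secret: container secret file first, then env var."""
    path = SECRETS_DIR / name
    if path.is_file():
        return path.read_text().strip()  # assumes trailing newlines are unwanted
    value = os.environ.get(name)
    if value is None:
        raise KeyError(f"secret {name!r} not found in {SECRETS_DIR} or the environment")
    return value
```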

Running

Directly with uv

Copy config.toml.example to config.toml and enable the modules you need. Set the required credentials as environment variables (listed in config.toml.example), then:

Tip: Set CARDEA_CONFIG=/path/to/config.toml to override the default config location. When unset, Cardea looks for config.toml in the repository root.

uv run cardea --host 127.0.0.1 --port 8000

As a container with docker / podman

Build the image from the repo root:

podman build -t cardea .

When running in a container, credentials can be provided as files under /run/secrets/ (e.g. via podman secret or docker secret) instead of environment variables. Each module looks for its secret by name (e.g. cardea_github_token) — first as a file in /run/secrets/<name>, then as an env var. See config.toml.example for the full list.

# Example with podman secrets
echo -n "ghp_..." | podman secret create cardea_github_token -
podman run --secret cardea_github_token -v ./config.toml:/app/config.toml:ro -p 8000:8000 cardea

Mount your config.toml into the container at /app/config.toml.

Client library

Cardea ships an optional Python client library that wraps the proxy endpoints into simple function calls. Install it with:

pip install cardea[client]

Quick start

from cardea.client.email import send_email, list_messages
from cardea.client.github import github_api, create_pr
from cardea.client.browser import fill_credentials

# List unread emails
messages = list_messages(query="UNSEEN")

# Send an email
send_email(to="alice@example.com", subject="Hello", body="Hi from Cardea!")

# Create a GitHub pull request
create_pr("owner", "repo", title="My PR", head="feature-branch")

# Fill a login form in a browser
fill_credentials("github.com/login")

By default the client connects to http://localhost:8000. Override this with the CARDEA_URL environment variable or pass base_url explicitly to any function:

list_messages(base_url="http://cardea.local:8000")

Using with AI agents

AI agents can call the client directly from a shell:

python3 -c "
from cardea.client.email import list_messages
import json
print(json.dumps(list_messages(query='UNSEEN'), indent=2))
"

This pattern avoids raw httpx calls and keeps agent tool code minimal.

Contributing

Contributions from coding agents are welcome too. Respecting the architecture is mandatory.

Adding a new service

Config-driven (no code changes)

For simple REST API proxying, add a [services.<name>] section to config.toml:

[services.my-api]
prefix = "/my-api"
upstream = "https://api.example.com"
auth = { type = "bearer", secret = "my_api_token" }

Supported auth types: bearer, basic, header, query, none.

Then create the secret (podman secret create my_api_token /path/to/token) and restart.
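How each auth type could map onto the outgoing request is sketched below; the header and param field names are assumptions about the config schema, not its documented form:

```python
import base64

def apply_auth(auth: dict, secret: str, headers: dict, params: dict) -> None:
    """Mutate headers/params according to the service's auth config."""
    kind = auth["type"]
    if kind == "bearer":
        headers["Authorization"] = f"Bearer {secret}"
    elif kind == "basic":
        # secret assumed to hold "user:password"
        headers["Authorization"] = "Basic " + base64.b64encode(secret.encode()).decode()
    elif kind == "header":
        headers[auth["header"]] = secret  # e.g. header = "X-Api-Key"
    elif kind == "query":
        params[auth["param"]] = secret  # e.g. param = "api_key"
    elif kind != "none":
        raise ValueError(f"unknown auth type: {kind!r}")
```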

Custom module (for complex logic)

For services requiring custom logic (OAuth2 token refresh, non-HTTP protocols, multi-tenant routing), create a Python module in src/cardea/proxies/ that exports a router (and optionally PREFIX and TAG).

Browser credential manager

Cardea includes a CDP-based (Chrome DevTools Protocol) credential manager that can auto-fill login forms in a remote Chromium instance without the AI agent ever seeing the actual credentials. This is useful when an agent drives a browser (e.g. via the agent's browser tool) and needs to log in to a website.

The browser module is loaded automatically when a [browser] section exists in config.toml; it does not need an entry in [modules].

Configuration

The [browser] section sets the CDP connection:

| Key | Description |
|-----|-------------|
| cdp_endpoint | WebSocket URL of the Chromium CDP debugging port (e.g. ws://localhost:9222) |

Each [browser.sites.<name>] section defines a site whose login form Cardea can fill:

| Key | Description |
|-----|-------------|
| url_pattern | Substring matched against the domain/URL passed by the caller |
| secret | Name of the Podman/Docker secret containing credentials as JSON |
| fields | Array of { selector, key } objects (CSS selector + JSON key) |

The secret must be a JSON object whose keys match the key values in fields. For example, {"username": "alice", "password": "s3cret"}.
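That consistency rule can be checked mechanically; the data below is illustrative:

```python
import json

fields = [
    {"selector": "#login_field", "key": "username"},
    {"selector": "#password", "key": "password"},
]
secret = json.loads('{"username": "alice", "password": "s3cret"}')

# Every `key` referenced by `fields` must exist in the secret JSON object.
missing = [f["key"] for f in fields if f["key"] not in secret]
```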

Example

[browser]
cdp_endpoint = "ws://localhost:9222"

[browser.sites.github]
url_pattern = "github.com/login"
secret = "browser_github"
fields = [
  { selector = "#login_field", key = "username" },
  { selector = "#password", key = "password" },
]

Then create the secret:

echo -n '{"username": "alice", "password": "s3cret"}' | podman secret create browser_github -

How it works

  1. The caller sends POST /browser/fill with {"domain": "github.com/login"}.
  2. Cardea matches the domain against the url_pattern of each configured site.
  3. Cardea loads the credential JSON from the named secret.
  4. Cardea connects to Chromium via CDP and fills each form field using Runtime.evaluate, dispatching input and change events.
  5. Cardea returns {"status": "filled", "fields_filled": N}.
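Steps 1-3 can be sketched in plain Python (the CDP interaction of step 4 is omitted; the site config and secrets here are illustrative):

```python
import json

SITES = {
    "github": {
        "url_pattern": "github.com/login",
        "secret": "browser_github",
        "fields": [
            {"selector": "#login_field", "key": "username"},
            {"selector": "#password", "key": "password"},
        ],
    },
}
SECRETS = {"browser_github": '{"username": "alice", "password": "s3cret"}'}

def plan_fill(domain: str):
    """Return (selector, value) pairs for the first site matching `domain`."""
    for site in SITES.values():
        if site["url_pattern"] in domain:
            creds = json.loads(SECRETS[site["secret"]])
            return [(f["selector"], creds[f["key"]]) for f in site["fields"]]
    return None
```

Note that the credentials never travel back to the agent; the (selector, value) pairs are consumed entirely on the Cardea side.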

The agent's browser tool calls this endpoint automatically when it needs to log in to a configured site.

Who's Cardea

In the Roman tradition, Cardea is a deity who protects households from harmful spirits entering through doors. Her symbol is the hinge (the mechanism that allows a door or gate to open and close), in Latin cardo, cardinis.
