The secure secret handoff tool and credential setup wizard for AI agents.
When your AI assistant needs a password, API key, or any sensitive credential, where does it go? Through chat history. Logged. Stored. Exposed.
Confidant solves this. It creates a secure, time-limited channel where humans can submit secrets directly to AI assistants — without the secret ever touching chat logs.
AI assistants like OpenClaw, Nanobot, Picoclaw, Zeroclaw, Claude Code, and others are becoming trusted collaborators. They need credentials to help you:
- Deploy to your servers
- Access your APIs
- Configure your services
- Manage your accounts
But every time you paste a password in chat, it's logged somewhere. Chat history, model training data, audit logs. That's not secure.
Confidant provides a pull-based secret handoff:
1. AI requests a secret → a secure request with a unique URL is created
2. Human receives the URL → opens it in their browser
3. Human submits the secret → direct browser-to-server, bypassing chat
4. AI retrieves the secret → auto-polls until the secret arrives
5. Secret self-destructs → deleted immediately after retrieval
```
┌─────────────┐     1. Request      ┌─────────────┐
│             │ ──────────────────▶ │             │
│     AI      │                     │  Confidant  │
│  Assistant  │ ◀────────────────── │   Server    │
│             │     4. Secret       │             │
└─────────────┘                     └─────────────┘
       │                                   ▲
       │                                   │ 3. Submit
       │ 2. URL via chat                   │ (HTTPS)
       │                            ┌─────────────┐
       └──────────────────────────▶ │    Human    │
                                    │   Browser   │
                                    └─────────────┘
```
The secret never passes through chat. It's a direct handshake between human and AI.
No installation required:
```
npx @aiconnect/confidant serve-request --label "API Key"
```

A secure URL is generated. Open it in your browser, submit the secret, and it's delivered directly to the terminal.
For frequent use, install globally:
```
npm install -g @aiconnect/confidant
```

Or run any command with `npx @aiconnect/confidant <command>` without installing.
If you're running an AI agent like OpenClaw, install the Confidant skill from ClawHub:
```
clawdhub install confidant
```

Once installed, your agent learns how to:
- Request secrets from users without exposing them in chat
- Deliver secrets to users securely
- Exchange credentials with other agents
The primary flow — an AI assistant needs a credential from a human:
```
# Start server + create request in one command
confidant serve-request --label "API Key"

# Or, if the server is already running:
confidant request --label "API Key"
```

The AI shares the URL in chat; the human opens it in their browser and submits the secret. The secret is delivered directly to the AI — never logged in chat.
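What "auto-polls" means in practice can be sketched with a stand-in for the server: the requester loops until the secret shows up. This is an illustration only — a local file plays the role of the poll endpoint, and the names here are not Confidant's actual internals:

```shell
# A local file stands in for the request's poll endpoint.
drop="$(mktemp)"
(sleep 1; echo "sk-demo" > "$drop") &      # the "human" submits after ~1s
until [ -s "$drop" ]; do sleep 0.2; done   # poll until the secret arrives
secret="$(cat "$drop")"
rm -f "$drop"
echo "received: $secret"                   # → received: sk-demo
```

The real CLI does the same loop over HTTP, at the interval set by `--poll-interval`.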
Confidant can save received secrets directly to your filesystem with secure permissions (chmod 600), making it a one-command setup wizard:
Convention mode — saves to `~/.config/<service>/api_key`:

```
confidant request --service serpapi
confidant request --service openai --env OPENAI_API_KEY
```

Explicit path mode — full control over the file location:
```
confidant request --save ~/.credentials/my-secret.txt
confidant request --save ~/.config/aws/credentials --env AWS_ACCESS_KEY_ID
```

Auto-save also works with the `get` command:

```
confidant get <secret-id> --service myservice
```

When the AI needs to securely deliver a secret to a user (a generated password, an API key):
```
# User runs this (they will receive the secret)
confidant serve-request --label "Generated Password"
# → http://localhost:3000/requests/abc123...

# AI executes this to send the secret
confidant fill "http://localhost:3000/requests/abc123..." --secret "my-secure-password-123"
```

The secret travels from AI → server → user terminal, never appearing in chat.
Confidant supports direct secret submission between automated agents without requiring a browser:
```
# Agent A - Creates a request
confidant request --label "API Key"
# → http://192.168.1.100:3000/requests/abc123...

# Agent B - Submits the secret programmatically
confidant fill "http://192.168.1.100:3000/requests/abc123..." --secret "sk-xxxx"
```

For production, avoid passing secrets on the command line:
```
# Read from stdin (safer - avoids shell history)
echo "$SECRET" | confidant fill <url> --secret -

# From password managers
op read "op://Vault/Item/password" | confidant fill <url> --secret -
```

JSON output for scripting:
```
result=$(confidant fill "$URL" --secret "$SECRET" --json)
# jq -e exits non-zero when .success is false or null
if echo "$result" | jq -e '.success' > /dev/null; then
  echo "Secret delivered"
fi
```

Orchestrator pattern — distributing secrets to multiple agents:
```
#!/bin/bash
for agent in "http://agent1:3000" "http://agent2:3000"; do
  request=$(curl -s "$agent/requests" -X POST -H "Content-Type: application/json" \
    -d '{"expiresIn": 300, "label": "DB Credentials"}')
  hash=$(echo "$request" | jq -r '.hash')
  confidant fill "$agent/requests/$hash" --secret "$DB_PASSWORD"
done
```

A few more useful invocations:

```
confidant request --quiet
# Outputs just the URL to share with the human
```

```
confidant create --secret "$DEPLOY_KEY" --max-access-count 1
# Share the ID with your pipeline
```

```
confidant create --secret "temp-password" --ttl 300000
# Secret expires in 5 minutes
```

`confidant serve` — Start the Confidant server.
`confidant serve-request` — Start the server and immediately create a secret request.
`confidant request` — Create a secret request and wait for submission.
Options:
- `--expires-in <seconds>`: Request expiration (default: 86400)
- `--poll-interval <seconds>`: Polling interval (default: 2)
- `--poll <id>`: Manually poll for an existing request by ID
- `--label <text>`: Label describing the requested secret (max 200 characters)
- `--save <path>`: Save the received secret to a file path (with chmod 600)
- `--service <name>`: Save to `~/.config/<service>/api_key` (convention mode)
- `--env <varname>`: Set an environment variable after saving (requires `--save` or `--service`)
- `--quiet`: Minimal output (just URLs and the secret)
- `--json`: JSON output format
- `--verbose`: Show detailed info, including network detection
`confidant create` — Create a secret directly.
Options:
- `--secret <value>`: The secret to store
- `--ttl <ms>`: Time-to-live in milliseconds
- `--max-access-count <n>`: Maximum access count
`confidant get` — Retrieve a secret by ID.
Options:
- `--save <path>`: Save to a file path (with chmod 600)
- `--service <name>`: Save to `~/.config/<service>/api_key` (convention mode)
- `--env <varname>`: Set an environment variable after saving (requires `--save` or `--service`)
`confidant fill` — Submit a secret to an existing request.
Options:
- `--secret <value>`: Secret value to submit (use `-` for stdin)
- `--json`: JSON output format
Examples:
```
confidant fill "http://localhost:3000/requests/abc123..." --secret "sk-xxxx"
confidant fill abc123... --secret "sk-xxxx" --api-url "http://192.168.1.100:3000"
echo "my-secret" | confidant fill <url> --secret -
```

Delete a secret.
Check secret status.
| Method | Endpoint | Description |
|---|---|---|
| POST | `/requests` | Create a secret request |
| GET | `/requests/:hash` | Secret submission form (HTML) |
| POST | `/requests/:hash` | Submit a secret |
| GET | `/requests/:id/poll` | Poll for secret availability |
| POST | `/secrets` | Create a secret directly |
| GET | `/secrets/:id` | Retrieve a secret |
| GET | `/secrets/:id/status` | Check secret status |
| DELETE | `/secrets/:id` | Delete a secret |
| GET | `/api/urls` | Get server URLs (localhost and network) |
| GET | `/health` | Health check |
For local development:

```
npm run dev
```

To run with Docker:

```
docker build -t confidant .
docker run -p 3000:3000 confidant
```

Deploy behind a reverse proxy (nginx, Caddy) with proper TLS.
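As one example of that reverse-proxy setup, a minimal nginx server block might look like the following. This is a sketch: the hostname and certificate paths are placeholders, and Confidant is assumed to be listening on localhost:3000.

```nginx
server {
    listen 443 ssl;
    server_name confidant.example.com;                   # placeholder hostname

    ssl_certificate     /etc/ssl/certs/confidant.pem;    # placeholder cert
    ssl_certificate_key /etc/ssl/private/confidant.key;  # placeholder key

    location / {
        proxy_pass http://127.0.0.1:3000;                # Confidant server
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```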
When creating a request, Confidant automatically detects and displays URLs for different scenarios:
- Localhost — for same-machine access (`http://localhost:3000/requests/...`)
- Local Network IP — auto-detected for cross-device access on the same network (`http://192.168.1.100:3000/requests/...`)
Use `--verbose` to see network detection details.
For containers, VMs, or remote access, expose Confidant with a tunneling service:
| Service | Setup | Notes |
|---|---|---|
| ngrok | `ngrok http 3000` | Quick setup, free tier available |
| localtunnel | `npx localtunnel --port 3000` | No signup required |
| Cloudflare Tunnel | `cloudflared tunnel --url http://localhost:3000` | Zero trust |
| Tailscale | `tailscale ip -4` (use the IP) | Mesh VPN |
Then point the CLI to the public URL:
```
export CONFIDANT_API_URL=https://abc123.ngrok.io
confidant request
```

- No persistence by default: Secrets live in memory only
- Auto-expiration: Secrets self-destruct after TTL or access limit
- No chat logging: Secrets bypass chat history entirely
- Secure file permissions: Auto-saved files use chmod 600 (owner read/write only)
- HTTPS required: Always use TLS in production (ngrok provides this automatically)
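The effect of chmod 600 can be checked in plain shell. The sketch below reproduces the described file handling (write the secret, then restrict the file to owner read/write) and prints the resulting mode; it illustrates the convention and is not Confidant's code:

```shell
secret_file="$(mktemp)"                 # stand-in for ~/.config/<service>/api_key
printf '%s\n' "sk-example" > "$secret_file"
chmod 600 "$secret_file"                # owner read/write only
perm="$(stat -c '%a' "$secret_file")"   # GNU stat; on macOS use: stat -f '%Lp'
rm -f "$secret_file"
echo "$perm"   # → 600
```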
Localhost URL not working?
- Ensure the server is running: `confidant serve`
- Check if port 3000 is available: `lsof -i :3000`
Local Network IP not accessible?
- Verify both devices are on the same network
- Check firewall settings on the host machine
- Use `--verbose` to see the detected IP
Cross-device not working?
- Mac Mini + iPhone: ensure both are on the same Wi-Fi
- Docker + Host: use `--network host` or port mapping
- VM + Host: use a bridged or host-only network adapter
Tunneling issues?
- Verify the tunnel is forwarding to port 3000
- Check the tunnel service status page
- Some tunnels require authentication — check service docs
| Variable | Default | Description |
|---|---|---|
| `PORT` | `3000` | Server port |
| `CONFIDANT_PORT` | `3000` | Server port (alternative) |
| `CONFIDANT_API_URL` | `http://localhost:3000` | API URL for the CLI |
- Runtime: Node.js 18+
- Language: TypeScript
- Framework: Hono
- Validation: Zod
For internal documentation on the modular registry system, see docs/registry.md.
MIT. See LICENSE.
AI Connect is a registered trademark under the control of ERIC SANTOS LLC.
See TRADEMARKS.md.
Confidant: Because your AI assistant shouldn't have to ask for passwords in public.