
Desktop: Remove Anthropic key leak from /api-keys, polish harness switching, align with #6594 architecture #6928

@beastoin

Description


Current State (Facts)

1. Anthropic API key leaked via /api-keys endpoint

  • Endpoint: GET /v1/config/api-keys in desktop/Backend-Rust/src/routes/config.rs (line 36)
  • Response: Returns anthropic_api_key, firebase_api_key, google_calendar_api_key as JSON
  • Source env var: ANTHROPIC_API_KEY loaded from env in Backend-Rust/src/config.rs (line 134)
  • Flow: Rust backend reads ANTHROPIC_API_KEY → returns via HTTP → APIKeyService.swift stores it → sets in process env via setenv("ANTHROPIC_API_KEY", key, 1) (line 156) → ACP bridge subprocess inherits it
  • The same ANTHROPIC_API_KEY env var is also used by the Rust backend's own /v2/chat/completions proxy (Backend-Rust/src/routes/chat_completions.rs, line 462)
  • Pi-mono adapter correctly scrubs ANTHROPIC_API_KEY from its subprocess env (acp-bridge/src/adapters/pi-mono.ts, line 243), but the key still transits through the Swift app process and ACP bridge
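The fix for this leak can be sketched as follows. This is a minimal, stdlib-only Rust sketch, not the real handler: the actual endpoint lives in `desktop/Backend-Rust/src/routes/config.rs`, and the struct and function names here are illustrative assumptions.

```rust
use std::collections::HashMap;

// Hypothetical mirror of the relevant config fields; the real struct
// lives in Backend-Rust/src/config.rs with its own field set.
struct AppConfig {
    anthropic_api_key: String,
    firebase_api_key: String,
    google_calendar_api_key: String,
}

// Sketch of a sanitized /v1/config/api-keys payload: only the keys the
// client legitimately needs are returned; the Anthropic key never
// leaves the server.
fn sanitized_api_keys(cfg: &AppConfig) -> HashMap<&'static str, String> {
    // Deliberately omitted: cfg.anthropic_api_key stays backend-only.
    let _ = &cfg.anthropic_api_key;
    HashMap::from([
        ("firebase_api_key", cfg.firebase_api_key.clone()),
        ("google_calendar_api_key", cfg.google_calendar_api_key.clone()),
    ])
}
```

With this shape, `APIKeyService.swift` would simply never receive an Anthropic key to `setenv`, closing the subprocess-inheritance path as well.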

2. Harness switching requires app restart

  • Settings UI picker in SettingsPage.swift (lines 2310-2372, 3130-3195): Picker with "Omi AI".tag("piMono") and "Your Claude Account".tag("claudeCode")
  • switchBridgeMode() in ChatProvider.swift (lines 818-843) stops and restarts the bridge subprocess
  • Despite the programmatic restart, the switch does not take effect without a full app restart in practice
  • BridgeMode enum values: .omiAI = "agentSDK" (legacy, auto-migrated), .userClaude = "claudeCode", .piMono = "piMono" (default)

3. No provider attribution/branding in UI

  • When "Omi AI" is selected, the description text says "Using your Omi account. All inference routed through api.omi.me." — text only, no logo
  • When "Your Claude Account" is selected, a green checkmark icon shows "Connected to Claude" — no Anthropic/Claude logo
  • No pi-mono logo or Omi branding displayed alongside the active provider

4. Omi credit handling exists but is client-side only

  • @AppStorage("omiAICumulativeCostUsd") in ChatProvider.swift (line 536) tracks cumulative cost
  • $50 threshold triggers "Upgrade to Omi Pro" alert (ChatPage.swift, lines 207-219)
  • Cost comes from queryResult.costUsd returned by pi-mono adapter via PiUsage.cost
  • No server-side credit balance check — client trusts backend cost reporting
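The client-side gate above amounts to a simple threshold crossing. A hedged Rust sketch of that logic (names are illustrative; the real code is Swift in `ChatProvider.swift` / `ChatPage.swift`):

```rust
// Minimal sketch of the client-side credit gate: per-query cost is
// accumulated and the upgrade prompt fires once when the $50 threshold
// is crossed.
const UPGRADE_THRESHOLD_USD: f64 = 50.0;

struct CreditTracker {
    // Persisted via @AppStorage("omiAICumulativeCostUsd") in the app.
    cumulative_cost_usd: f64,
}

impl CreditTracker {
    // Returns true exactly when this query crosses the threshold,
    // i.e. when the "Upgrade to Omi Pro" alert should be shown once.
    fn record(&mut self, query_cost_usd: f64) -> bool {
        let was_below = self.cumulative_cost_usd < UPGRADE_THRESHOLD_USD;
        self.cumulative_cost_usd += query_cost_usd;
        was_below && self.cumulative_cost_usd >= UPGRADE_THRESHOLD_USD
    }
}
```

Because the input (`queryResult.costUsd`) is backend-reported and the accumulator is local storage, a server-side balance check would need to replicate this accumulation on the backend.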

5. Current env var naming

| Env var | Used by | Purpose |
| --- | --- | --- |
| ANTHROPIC_API_KEY | Rust backend + ACP bridge | Single key for both the backend proxy and client-side ACP |
| OMI_API_KEY | Pi-mono extension | Firebase ID token (~1 hr), set as OMI_API_KEY in the subprocess env |
| OMI_AUTH_TOKEN | ACP bridge (pi-mono mode) | Firebase ID token passed from Swift to the bridge |
| OMI_API_BASE_URL | Pi-mono extension | Rust backend URL for /v2/chat/completions |
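The scrubbing step the pi-mono adapter already performs (point 1 above) ties this table together. A hedged Rust sketch of that step; the real code is TypeScript in `acp-bridge/src/adapters/pi-mono.ts`, and the function name here is illustrative:

```rust
use std::collections::HashMap;

// Sketch: the pi-mono subprocess env is the parent env minus
// ANTHROPIC_API_KEY, plus the Omi-specific variables from the table.
fn pi_mono_subprocess_env(
    parent: &HashMap<String, String>,
    firebase_id_token: &str,
    backend_base_url: &str,
) -> HashMap<String, String> {
    let mut env: HashMap<String, String> = parent
        .iter()
        .filter(|(k, _)| k.as_str() != "ANTHROPIC_API_KEY") // scrubbed
        .map(|(k, v)| (k.clone(), v.clone()))
        .collect();
    // Firebase ID token and backend URL, per the table above.
    env.insert("OMI_API_KEY".to_string(), firebase_id_token.to_string());
    env.insert("OMI_API_BASE_URL".to_string(), backend_base_url.to_string());
    env
}
```

The remaining problem, as noted above, is that the scrub happens last: the key has already transited the Swift app and the bridge process by the time it is removed.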

6. Architecture gap vs #6594

Issue #6594 defines the target architecture:

  • Desktop App → HarnessBridge → AcpAdapter / PiMonoAdapter → pi --mode rpc → omi-provider extension → api.omi.me
  • "Omi AI" (default) routes ALL LLM calls through api.omi.me for server-side cost control
  • "Claude Account" uses ACP with user's own OAuth — no Omi key involved
  • The Omi API key should never leave the server; the client authenticates via Firebase token only
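The routing rule implied by the bullets above can be sketched as a single dispatch on the bridge mode. This is a hedged illustration only: the full endpoint URL composition (api.omi.me hosting /v2/chat/completions) is an assumption, and the real routing is spread across the bridge and adapters.

```rust
// Sketch of the #6594 routing rule: Omi AI mode proxies every
// completion through api.omi.me; Claude Account mode uses the user's
// own OAuth and involves no Omi-held key.
fn completions_endpoint(mode: &str) -> Option<String> {
    match mode {
        // Assumed URL composition; in the current code the base comes
        // from OMI_API_BASE_URL.
        "piMono" => Some("https://api.omi.me/v2/chat/completions".to_string()),
        // "claudeCode" and anything else: ACP with the user's own
        // OAuth, no Omi proxy endpoint.
        _ => None,
    }
}
```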

Current state diverges:

  • ANTHROPIC_API_KEY is returned to the client via /api-keys and injected into subprocess env
  • ACP "Mode A" passes the Omi Anthropic key directly to the ACP subprocess (bypasses api.omi.me proxy)
  • No separation between "legacy key for ACP passthrough" and "backend-only key for completion proxy"

What needs to change

  1. Stop returning ANTHROPIC_API_KEY from /v1/config/api-keys. Rename the current env var to an explicitly legacy identifier, and introduce a separate backend-only key for the /v2/chat/completions proxy.
  2. Fix harness hot-reload so that switching between "Omi AI" and "Your Claude Account" takes effect without restarting the app.
  3. Add pi-mono / Omi attribution (logo) when "Omi AI" is active, and preserve the existing Omi credit tracking.
  4. Align naming and architecture with #6594 ("Desktop: add pi-mono harness with Omi API proxy for server-side cost control"): the client authenticates via a Firebase token only, all LLM calls route through api.omi.me in Omi AI mode, and no raw Anthropic key reaches the client.
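Point 1's key split can be sketched on the backend side. Both env var names below are illustrative proposals, not final naming; neither value would ever be serialized into the /api-keys response.

```rust
use std::env;

// Sketch of the proposed split: an explicitly legacy identifier for the
// old ACP passthrough key, and a separate backend-only key for the
// /v2/chat/completions proxy.
struct SplitKeys {
    // Hypothetical legacy name for the formerly client-facing key.
    legacy_acp_anthropic_key: Option<String>,
    // Hypothetical backend-only key; never leaves the server.
    backend_proxy_anthropic_key: Option<String>,
}

fn load_split_keys() -> SplitKeys {
    SplitKeys {
        legacy_acp_anthropic_key: env::var("ANTHROPIC_API_KEY_LEGACY").ok(),
        backend_proxy_anthropic_key: env::var("OMI_BACKEND_ANTHROPIC_KEY").ok(),
    }
}
```

With the split in place, /v2/chat/completions reads only the backend-only key, and nothing in the /api-keys code path has access to either.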

Ref: #6594

Metadata



Labels: intelligence (Layer: Summaries, insights, action items), p2 (Priority: Important, score 14-21)
