fix(api): safely join api_url + path so misconfigured base can't corrupt routes #1650
A misconfigured `api_url` (e.g. a user pasted an LLM endpoint `https://api.tinyhumans.ai/openai/v1/chat/completions` into the base URL setting) silently corrupted every `/agent-integrations/...` call because `format!("{base}{path}")` just concatenated. The result, `…/openai/v1/chat/completions/agent-integrations/composio/toolkits`, 404s, breaking Composio connections and toolkit loading. Add `api::config::api_url(base, path)` that uses `url::Url::join` so an absolute-path reference replaces any path baked into the base (RFC 3986). An empty path returns the normalized base; an unparseable base falls back to a slash-safe concat so callers always get a usable string. Replace the three call sites in the integrations and composio HTTP clients.
`config.api_url` was double-purposed as both the OpenHuman product backend URL (auth/billing/voice/integrations/...) AND the LLM inference endpoint introduced in tinyhumansai#1342. Pointing `api_url` at a custom OpenAI-compat provider silently rerouted every other backend call to OpenAI, breaking GET /auth/me, GET /auth/google/login, voice transcribe, billing, etc. This commit cleanly separates the two concerns:

- New optional `config.inference_url`. When set together with `api_key`, the inference provider talks directly to that URL; otherwise inference flows through the OpenHuman backend at `{api_url}/openai/v1/...`.
- `api_url` always means the OpenHuman product backend URL (defaults to api.tinyhumans.ai). Auth/billing/voice/etc. callers are untouched.
- New `effective_inference_url()` helper derives the URL via the safe `api_url()` joiner from tinyhumansai#1650.
- Inference provider factories (`create_backend_inference_provider`, `create_resilient_provider*`, `create_routed_provider*`, `create_intelligent_routing_provider`) now take separate `inference_url` and `backend_url` params; all callers updated.
- `ChannelRuntimeContext` gains an `inference_url` field plumbed from Config so per-channel provider creation honours the override.
- Settings: `update_model_settings` accepts an `inference_url` patch field; `get_client_config` returns it.
- Frontend: BackendProviderPanel reads/writes `inference_url` (never `api_url`). The OpenHuman preset clears `inference_url` so inference flows back through the backend.

Migration: on config load, any legacy `api_url` value ending in `/chat/completions` is moved into `inference_url` (and cleared if it pointed at the OpenHuman backend itself).

Also: RouterProvider now maps OpenHuman's abstract tier names (`reasoning-v1`, `agentic-v1`, `coding-v1`, `summarization-v1`) through the user's `model_routes` so a custom provider receives the configured model id instead of the literal tier name, which would 404 on OpenAI/Anthropic/etc.
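The URL-selection rule described above can be sketched as follows. This is a stdlib-only illustration, not the real code: the free-function signature and the exact `/openai/v1` suffix handling are assumptions; the actual helper lives on `Config` and derives the backend URL via the `api_url()` joiner from tinyhumansai#1650.

```rust
// Hypothetical sketch of the inference-URL selection rule (assumed signature).
fn effective_inference_url(
    inference_url: Option<&str>,
    api_key: Option<&str>,
    api_url: &str,
) -> String {
    match (inference_url, api_key) {
        // Override URL and API key both set: talk to the custom provider directly.
        (Some(url), Some(_)) => url.trim_end_matches('/').to_string(),
        // Otherwise inference flows through the OpenHuman backend.
        _ => format!("{}/openai/v1", api_url.trim_end_matches('/')),
    }
}
```

The key property is that `api_url` alone never determines where inference goes, so setting a custom provider can no longer reroute auth, billing, or voice traffic.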
Reported by user: chat hit "model 'reasoning-v1' does not exist" on OpenAI, and login redirected to api.openai.com/.../auth/google/login.
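The tier-name mapping that fixes the "model 'reasoning-v1' does not exist" error can be sketched like this. The `HashMap`-based signature is an assumption for illustration; the real mapping happens inside RouterProvider against the user's configured `model_routes`.

```rust
use std::collections::HashMap;

// Sketch: if the user configured a route for an abstract tier name, send the
// mapped model id to the custom provider; otherwise pass the name through.
fn resolve_model<'a>(model_routes: &'a HashMap<String, String>, requested: &'a str) -> &'a str {
    model_routes
        .get(requested)
        .map(String::as_str)
        .unwrap_or(requested)
}
```

With a route like `reasoning-v1 -> o3-mini`, a custom OpenAI-compat provider receives a model id it actually serves instead of the literal tier name.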
## Summary
- A misconfigured `api_url` (e.g. a user pasted the LLM endpoint `https://api.tinyhumans.ai/openai/v1/chat/completions` into the base URL setting) silently corrupted every `/agent-integrations/...` call because the HTTP clients just `format!("{base}{path}")`-concatenated.
- Requests became `…/openai/v1/chat/completions/agent-integrations/composio/toolkits` and 404ed, breaking Composio connections ("stale") and toolkit loading for affected users.
- Add `api::config::api_url(base, path)` that uses `url::Url::join` so an absolute-path reference replaces any path baked into the base (RFC 3986), and replace the three concat call sites.
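The corruption is nothing more than string concatenation; it can be reproduced in a couple of lines (paths taken from the report above):

```rust
// Reproduction of the bug: naive concatenation keeps whatever path is baked
// into the misconfigured base, producing a route the backend 404s on.
fn naive_join(base: &str, path: &str) -> String {
    format!("{base}{path}")
}
```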
## Problem

The `Config.api_url` field is the single base for both LLM-proxy calls and `/agent-integrations/*` calls. When a user puts a path-bearing URL in that field, the integrations client compounds the path. Every Composio call (`list_toolkits`, `list_tools`, `list_connections`, `authorize`, triggers, …) and every shared integrations `get`/`post` is affected. The user-visible symptom is "Connections are showing stale status" plus 404s from `list_toolkits`.
## Solution

- New `pub fn api_url(base: &str, path: &str) -> String` in `src/api/config.rs`.
- Empty `path` → normalized base (no trailing slash).
- Non-empty `path` → `url::Url::parse(base)?.join(path)?`. For an absolute-path reference (`/agent-integrations/...`) this replaces the base's path per RFC 3986, neutralising the misconfigured-base case.
- Unparseable base → `fallback_concat` so callers always get a usable string.
- Call sites replaced in `src/openhuman/integrations/client.rs` (POST + GET) and `src/openhuman/composio/client.rs` (DELETE).
- Caveat: `Url::join` would drop the base's last path segment if the path were relative; this is documented in the helper's docstring. All our API paths start with `/`, so this is safe.
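The rules above can be approximated in stdlib-only Rust for illustration. The real helper delegates the non-empty-path case to `url::Url::join`; this sketch hand-rolls only the absolute-path branch our callers use, so treat it as a description of the behavior, not the implementation.

```rust
// Stdlib-only approximation of the joining rules (the real code uses
// url::Url::parse(base)?.join(path)? for the non-empty-path case).
fn api_url(base: &str, path: &str) -> String {
    let trimmed = base.trim_end_matches('/');
    // Empty path: normalized base, no trailing slash.
    if path.is_empty() {
        return trimmed.to_string();
    }
    // Absolute-path reference: keep only scheme://authority from the base,
    // replacing any path baked into it (RFC 3986 reference resolution).
    if path.starts_with('/') {
        if let Some(scheme_end) = base.find("://") {
            let authority_start = scheme_end + 3;
            let authority_end = base[authority_start..]
                .find('/')
                .map_or(base.len(), |i| authority_start + i);
            return format!("{}{}", &base[..authority_end], path);
        }
    }
    // Unparseable base (or relative path, unused here): slash-safe concat.
    format!("{}{}", trimmed, path)
}
```

With the misconfigured base from the summary, the `/openai/v1/chat/completions` path is discarded and `/agent-integrations/...` requests reach the right route again.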
## Submission Checklist

- Tests cover `api_url` itself, which is where the changed lines are.

## Impact
- Perf: one extra `Url::parse` per request; negligible.
- Well-formed bases (e.g. `https://api.tinyhumans.ai`, with or without trailing slash) produce identical output to the old `format!` path.

## Related
- A follow-up could validate `api_url` at config load to warn the user proactively; not done here to keep the diff minimal.

## AI Authored PR Metadata (required for Codex/Linear PRs)
### Linear Issue

### Commit & Branch

### Validation Run
- `pnpm --filter openhuman-app format:check`: skipped, no `app/` TypeScript changes in this PR.
- `cargo test --manifest-path Cargo.toml --lib api::config::`: 11/11 pass, including 5 new `api_url_*` tests.
- `cargo fmt --check`: clean.

### Validation Blocked
- command: N/A
- error: N/A
- impact: N/A

## Behavior Changes
- Users with a path-bearing `api_url` (e.g. a pasted LLM endpoint URL) no longer see `/agent-integrations/*` 404s. Composio connections stop showing "stale" for that misconfiguration.

## Parity Contract
- When `api_url` is the correct origin (e.g. `https://api.tinyhumans.ai`), the joined URL is byte-identical to what the previous `format!` produced.
- The unparseable-base fallback is pinned by a test (`api_url_unparseable_base_falls_back_to_concat`).

## Duplicate / Superseded PR Handling