fix: force https for extension registry fetch to prevent MITM catalog tampering#162
Merged
jmaxdev merged 2 commits into TrixtyAI:main on Apr 21, 2026
Conversation
Thanks for the contribution! I'll review it as soon as possible. If you still have changes to make, please mark this PR as a draft and all reviews will be cancelled. Test reviews will be re-run only when the PR is marked as ready for review.
Pull request overview
This PR mitigates a registry-catalog MITM vector by ensuring the extension marketplace catalog is only fetched over HTTPS, with an additional backend guardrail to reject insecure registry URLs.
Changes:
- Updated the production registry URL in the desktop frontend to use https://raw.githubusercontent.com/....
- Updated the Tauri backend get_registry_catalog command to reject http:// registry URLs with an explicit error.
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| apps/desktop/src/context/ExtensionContext.tsx | Switches the production registry catalog fetch to HTTPS. |
| apps/desktop/src-tauri/src/extensions.rs | Adds backend-side rejection of insecure http:// registry URLs before fetching. |
Comments suppressed due to low confidence (1)
apps/desktop/src-tauri/src/extensions.rs:74
get_registry_catalog doesn't check response.status() before reading/parsing the body. If the host returns a 404/500 HTML page, the error becomes "Failed to parse registry JSON…", which hides the real HTTP failure and makes troubleshooting harder. Consider calling error_for_status() (or checking is_success() and returning a status-aware error) before parsing.
```rust
let response = shared_client()
    .get(&url)
    .timeout(DEFAULT_REQUEST_TIMEOUT)
    .send()
    .await
    .map_err(|e| e.to_string())?;
let body = read_text_capped(response, MAX_RESPONSE_BYTES).await?;
let catalog: RegistryCatalog = serde_json::from_str(&body).map_err(|e| {
    let err = format!("Failed to parse registry JSON from {}: {}", url, e);
    error!("{}", err);
    err
})?;
```
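The status-aware error the reviewer suggests could be sketched as a small pure helper; `ensure_success` is an illustrative name, not code from extensions.rs, and the real fix would simply call `response.error_for_status()` (or inspect `response.status()`) on the reqwest response before `read_text_capped`:

```rust
// Minimal sketch, assuming the caller extracts the numeric status code
// from the response before reading the body. Hypothetical helper name.
fn ensure_success(status: u16, url: &str) -> Result<(), String> {
    // Treat any 2xx as success; surface the real HTTP failure otherwise,
    // instead of letting a 404/500 HTML page fail later as a JSON parse error.
    if (200..300).contains(&status) {
        Ok(())
    } else {
        Err(format!("Registry fetch from {} failed with HTTP {}", url, status))
    }
}
```

With this guard in place before parsing, a typo'd URL or missing registry reports the HTTP status directly rather than a misleading "Failed to parse registry JSON" error.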
… tampering

The prod registry URL was hardcoded to http://raw.githubusercontent.com, which allows a network attacker to inject arbitrary repository/data entries into the catalog. Those entries are then passed to install_extension and cloned via git, escalating a network MITM directly into code execution on the user's machine.

- Frontend: switch the registry URL to https://.
- Backend: reject http:// URLs in get_registry_catalog outright; only accept https:// or local file paths (dev mode).
- Drop the stale comment about an HTTP/local fallback that never existed (the ternary always picked exactly one branch based on NODE_ENV).
Addresses review feedback on TrixtyAI#162:

- Trim whitespace and lower-case the scheme before matching, so inputs like " HTTP://…" or "Https://…" hit the intended branch instead of silently falling through to the local-file fallback with a confusing "Failed to read local registry file" error.
- Check response.status() after send(). A 404/500 HTML body was previously becoming "Failed to parse registry JSON", hiding the real HTTP failure (typo'd URL, registry missing, proxy blocking).
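The scheme normalization described in the first bullet can be sketched as a small pure helper; `scheme_of` is a hypothetical name for illustration, not the actual code in extensions.rs:

```rust
// Sketch of the trim + case-insensitive scheme match, so " HTTP://…" and
// "Https://…" are classified correctly instead of falling through to the
// local-file branch. Hypothetical helper name.
fn scheme_of(url: &str) -> &'static str {
    let trimmed = url.trim();
    // Lower-case only for the comparison; the original URL is left untouched.
    let lower = trimmed.to_ascii_lowercase();
    if lower.starts_with("https://") {
        "https"
    } else if lower.starts_with("http://") {
        "http"
    } else {
        // Anything else is treated as a local file path (dev mode).
        "file"
    }
}
```

The order of the checks matters: `https://` must be tested before `http://`, since every `https://` URL also starts with `http`.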
Force-pushed from 0241b69 to ca70a2f
jmaxdev pushed a commit that referenced this pull request on Apr 21, 2026
# [Fix]: Allow-list `git_url` before `install_extension` spawns `git clone`

## Description

`install_extension` forwarded the caller-supplied `git_url` straight to `git clone`. Any scheme went through: `file://`, Windows UNC, `ssh://`, `git://`, and the especially dangerous `ext::` transport helpers. A compromised catalog entry could therefore trigger arbitrary code at install time via git helpers, symlink tricks, or credential-embedding URLs. Combined with the plain-HTTP registry fixed in #162 and the fragile `.replace` derivation fixed in #157, this was the final gate between a MITM and code execution on the user's machine.

## Change

`apps/desktop/src-tauri/src/extensions.rs`: a new `validate_git_clone_url` is called **before** `install_extension` builds a target directory, and the clone invocation now runs with defensive `git -c` flags.

### URL validation

- **Strict `https://` scheme** (case-sensitive). Catches `HTTP://`, `http://`, `file://`, `ssh://`, `git://`, UNC `\\host\share`, etc.
- **Host allowlist**: `github.com`, `gitlab.com`, `bitbucket.org`. Every legitimate catalog entry today already satisfies this, and the frontend helper `resolveGitRepoUrl` produces exactly this shape.
- **Path**: exactly `<owner>/<repo>` or `<owner>/<repo>.git`.
- **Segment content**: ASCII alphanumeric plus `-`, `_`, `.`; no leading `-` (prevents `--upload-pack` / `--config=` flag injection through the path).
- **No credentials or alternate-host tricks**: rejects any `@` in the authority section (`https://user@github.com/...`, `https://github.com@evil.example/...`).
- **No transport-helper syntax**: rejects any `::` anywhere in the URL.
- **No control characters or whitespace.**

### Clone hardening

Clone is now invoked as:

```
git -c protocol.ext.allow=never \
    -c protocol.file.allow=never \
    -c protocol.git.allow=never \
    clone --depth 1 <url> <target>
```

The allow-listed `https` host is unaffected. If anything ever gets past the URL check, git itself refuses those transports at runtime.

### Tests

Added `git_url_validation_tests` with 12 cases covering the accept/reject matrix. `cargo test git_url_validation` → 12/12 pass.

## Trade-offs

- **The allowlist is conservative.** Only three hosts are accepted today. Catalog entries on other git hosts (Codeberg, sourcehut, self-hosted instances) would be rejected. That matches the marketplace's current scope; widening it later is a one-line diff.
- **No `.git` suffix required.** I considered requiring `.git` on the repo segment, but GitHub/GitLab both redirect `…/owner/repo` to the `.git` form for git clients, and `resolveGitRepoUrl` on the frontend already appends `.git` when it normalizes URLs. Both shapes are accepted.
- **`update_extension` / `uninstall_extension` not touched.** Those operate on the already-cloned directory and don't take a URL argument, so they're not part of the `git clone` attack surface the issue describes. Hardening them (e.g. adding the same `-c` flags to `git pull`) is a separate improvement.

## Verification

- `cargo check` and `cargo clippy --tests -- -D warnings` → clean.
- `cargo test git_url_validation` → 12 passed.
- Manual trace against the in-tree marketplace entry (the example addon on github.com/TrixtyAI) → clone still succeeds.
- Each rejection branch is exercised by a unit test, listed in `git_url_validation_tests`.

## Related Issue

Fixes #55

## Checklist

- [x] I have tested this on the latest version.
- [x] I have followed the project's coding guidelines.
- [x] My changes generate no new warnings or errors.
- [x] I have verified the fix on:
  - OS: Windows
  - Version: v1.0.10
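A validator implementing the rules listed above could be sketched roughly as follows. This is an illustrative reconstruction, not the actual `validate_git_clone_url` from extensions.rs; it covers only the checks named in the description:

```rust
// Hedged sketch of the accept/reject rules described in the PR text.
// Error messages and exact structure are illustrative.
fn validate_git_clone_url(url: &str) -> Result<(), String> {
    // No transport-helper syntax anywhere (e.g. "ext::sh -c …").
    if url.contains("::") {
        return Err("transport-helper syntax rejected".into());
    }
    // No control characters or whitespace.
    if url.chars().any(|c| c.is_control() || c.is_whitespace()) {
        return Err("control characters or whitespace rejected".into());
    }
    // Strict, case-sensitive https:// scheme: HTTP://, file://, ssh://,
    // git://, and UNC paths all fail this prefix check.
    let rest = url.strip_prefix("https://").ok_or("only https:// is accepted")?;
    let (host, path) = rest.split_once('/').ok_or("missing path")?;
    // No credentials or alternate-host tricks in the authority section.
    if host.contains('@') {
        return Err("credentials in URL rejected".into());
    }
    if !["github.com", "gitlab.com", "bitbucket.org"].contains(&host) {
        return Err("host not on allowlist".into());
    }
    // Path must be exactly <owner>/<repo> or <owner>/<repo>.git.
    let segs: Vec<&str> = path.split('/').collect();
    if segs.len() != 2 {
        return Err("path must be <owner>/<repo>".into());
    }
    for (i, seg) in segs.iter().enumerate() {
        // Accept an optional ".git" suffix on the repo segment only.
        let seg = if i == 1 { seg.strip_suffix(".git").unwrap_or(seg) } else { seg };
        // No leading '-' (blocks --upload-pack / --config flag injection).
        if seg.is_empty() || seg.starts_with('-') {
            return Err("empty or flag-like path segment rejected".into());
        }
        if !seg.chars().all(|c| c.is_ascii_alphanumeric() || "-_.".contains(c)) {
            return Err("invalid character in path segment".into());
        }
    }
    Ok(())
}
```

Validating before building the target directory, plus the `protocol.*.allow=never` flags at clone time, gives two independent layers: even a URL that slipped past this check would still be refused by git for non-https transports.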
[Fix]: Force https for extension registry fetch
Description
The prod registry URL in ExtensionContext.tsx was hardcoded to http://raw.githubusercontent.com/.... Plain HTTP lets a network attacker rewrite the catalog in flight, injecting arbitrary repository/data entries. Those entries are then fed straight to install_extension in Rust, which hands them to git clone. Combined with the unvalidated git URL path (issue #55), that turns a MITM into code execution on the user's machine.
Context / Problem
HTTPS for raw.githubusercontent.com is free and available today; there is no reason to fetch the catalog over clear text. Switching the URL alone fixes the immediate vector, but nothing stopped a future caller of get_registry_catalog from passing another http:// URL. So the fix is applied in two layers.
Change
- Frontend (apps/desktop/src/context/ExtensionContext.tsx): change the prod registryUrl to https://raw.githubusercontent.com/.... Also removed a stale comment about an HTTP/local fallback that never existed; the ternary picks exactly one branch based on NODE_ENV.
- Backend (apps/desktop/src-tauri/src/extensions.rs): in get_registry_catalog, reject http:// URLs outright with a clear error instead of fetching them. Only https:// and local file paths (dev mode) are accepted.
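The backend branch structure this describes can be sketched as a three-way dispatch; names here are illustrative, and the real get_registry_catalog performs the actual fetch or file read where this sketch only returns a label:

```rust
// Sketch of the dispatch: http:// rejected early with a clear error,
// https:// fetched remotely, anything else treated as a local dev file.
// Hypothetical helper, not the actual extensions.rs code.
fn registry_source(url: &str) -> Result<&'static str, String> {
    if url.starts_with("http://") {
        Err(format!("Insecure registry URL rejected (https required): {}", url))
    } else if url.starts_with("https://") {
        Ok("remote") // real code: shared_client().get(&url)…
    } else {
        Ok("local") // real code: std::fs::read_to_string(url) in dev mode
    }
}
```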
Trade-offs
- Registries served over http:// (e.g., on an internal LAN) would now be blocked. That is intentional: the command cannot tell a legitimate intranet HTTP server apart from a MITM, so HTTPS is required for any remote fetch.
- An HTTPS host allowlist was left out. The current catalog is expected to come from raw.githubusercontent.com, but forks and self-hosted catalogs on other HTTPS origins are a legitimate use case. Catalog signing (suggestion 3) is a larger piece of work and belongs in its own PR.
Verification
- cargo check (src-tauri) → clean.
- pnpm tsc --noEmit across the desktop app → clean.
- get_registry_catalog: http://... hits the early return, https://... goes through shared_client, and local paths still fall through to std::fs::read_to_string for dev mode.

Related Issue
Fixes #56
Checklist