diff --git a/AGENTS-CN.md b/AGENTS-CN.md index 82265d220..a9b856acb 100644 --- a/AGENTS-CN.md +++ b/AGENTS-CN.md @@ -18,6 +18,7 @@ BitFun 是一个由 Rust workspace 与共享 React 前端组成的项目。 |---|---|---| | Core(产品逻辑) | `src/crates/core` | [AGENTS.md](src/crates/core/AGENTS.md) | | 已拆出的 core 支撑 crate | `src/crates/{core-types,agent-stream,runtime-ports,terminal,tool-runtime}` | (使用 core 指南) | +| Core owner crate | `src/crates/{services-core,services-integrations,agent-tools,tool-packs,product-domains}` | (使用 core 指南 + 拆解护栏) | | Transport 适配层 | `src/crates/transport` | (使用 core 指南) | | API layer | `src/crates/api-layer` | (使用 core 指南) | | AI adapters | `src/crates/ai-adapters` | [AGENTS.md](src/crates/ai-adapters/AGENTS.md) | diff --git a/AGENTS.md b/AGENTS.md index 1f8c29de6..d4b55997f 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -18,6 +18,7 @@ Repository rule: **keep product logic platform-agnostic, then expose it through |---|---|---| | Core (product logic) | `src/crates/core` | [AGENTS.md](src/crates/core/AGENTS.md) | | Extracted core support | `src/crates/{core-types,agent-stream,runtime-ports,terminal,tool-runtime}` | (use core guide) | +| Core owner crates | `src/crates/{services-core,services-integrations,agent-tools,tool-packs,product-domains}` | (use core guide + decomposition guardrails) | | Transport adapters | `src/crates/transport` | (use core guide) | | API layer | `src/crates/api-layer` | (use core guide) | | AI adapters | `src/crates/ai-adapters` | [AGENTS.md](src/crates/ai-adapters/AGENTS.md) | diff --git a/docs/architecture/core-decomposition.md b/docs/architecture/core-decomposition.md index ebc5f303a..5eb55ce58 100644 --- a/docs/architecture/core-decomposition.md +++ b/docs/architecture/core-decomposition.md @@ -81,6 +81,14 @@ owner 边界,否则不要把一个 feature group 继续拆成更小的 crate - 新拆出的 crate 不得反向依赖 `bitfun-core`。 - `bitfun-core` 可以依赖新拆出的 crate,并通过 re-export 保持旧路径兼容。 +- 在声明 P3 边界收敛前,运行 `node scripts/check-core-boundaries.mjs`,确认已拆出的 + owner crate 没有新增 `bitfun-core` 
反向依赖,并确认 `core-types`、`runtime-ports` + 和 `agent-tools` 没有引入重 runtime / concrete service 依赖。 +- 留在 `bitfun-core` 的 legacy facade 只能 re-export owner crate;例如 Git 旧路径、 + remote SSH types facade、MCP tool contract facade、MCP protocol types facade、 + MCP config location facade 和 announcement types facade 由边界脚本检查,不得重新承载实现逻辑。 +- 对仍嵌在 core runtime 文件中的旧公开类型,必须至少保留禁止回流检查;例如 MCP server + type/status 已由 owner crate 拥有,`MCPServerProcess` 只保留 runtime 逻辑。 - `bitfun-runtime-ports` 必须保持 DTO/trait-only;不得依赖 concrete manager、 service implementation、app crate 或 platform adapter。 - `bitfun-core-types` 不得依赖 runtime manager、service crate、agent runtime、 @@ -97,6 +105,9 @@ owner 边界,否则不要把一个 feature group 继续拆成更小的 crate 代码必须面向 port contract,而不是新增对 coordinator 或 manager 的直接依赖。 - Agent runtime 必须通过 ports/providers 依赖 service 行为,不要依赖 concrete 的重集成 crate。 +- 最新主干已把 subagent 可见性做成 mode-scoped registry 行为。迁移 agent registry 或 + subagent definitions 前,必须先保留 mode visibility、hidden/custom/review 分组和 desktop + subagent API 等价测试;在此之前它们仍属于 `bitfun-core` product runtime assembly。 - Tool framework crate 不得依赖 concrete service implementation。 - 产品 crate 可以通过显式 product feature 组装完整 runtime。 diff --git a/docs/plans/core-decomposition-plan.md b/docs/plans/core-decomposition-plan.md index 0eb4b9f12..5113405a0 100644 --- a/docs/plans/core-decomposition-plan.md +++ b/docs/plans/core-decomposition-plan.md @@ -965,18 +965,21 @@ product-full = ["git", "mcp", "remote-ssh", "remote-connect", "announcement", "f **任务:** -- [ ] 先迁移 `git`,因为边界相对清晰。 +- [x] 先迁移 `git`,因为边界相对清晰。 - [ ] 再迁移 `remote-ssh`,保留 `ssh-remote` 语义。 +- [x] 先迁移 `remote-ssh` 的纯 contract/type,runtime manager / fs / terminal 仍保留在 core。 - [ ] 再迁移 `mcp`,动态工具通过 `DynamicToolProvider` 接入。 +- [x] 先迁移 `mcp` 的纯 tool-name / tool-info / protocol types / config location / server type-status contract,config service / server process / adapter / auth / dynamic tools 仍保留在 core。 +- [x] 先迁移 `announcement` 的纯 types contract,scheduler / state store / content loader / remote 
fetch 仍保留在 core。 - [ ] 最后迁移 `remote-connect`,通过 `AgentSubmissionPort`、`SessionTranscriptReader`、`EventSink` 解耦 agent runtime。 - [x] 已迁移的集成能力保持 core 旧路径 re-export。 - [x] 产品完整 runtime 通过 `services-integrations/product-full` 启用已迁移集成能力。 -**当前安全迁移状态(2026-05-11):** +**当前安全迁移状态(2026-05-12):** - 已迁移到 `bitfun-services-integrations`:`service::file_watch`,通过 `file-watch` / `product-full` feature 启用,并保持 `core::service::file_watch` 旧路径。 -- `git`、`remote-ssh`、`mcp`、`remote-connect`、`announcement` 尚未迁移;它们涉及 Git service、SSH runtime、MCP dynamic tool provider、remote agent submission 与 announcement config/path 边界,继续前需要单独确认端口方案与等价性测试。 -- 最新主干的 Deep Review capacity / cost / queue 控制仍属于 core runtime 与 review-team orchestration,不在本轮 `services-integrations` 迁移范围内;如果后续迁移 remote-connect / MCP,需要先定义这些运行状态的 port 合约。 +- `git` 已完成 DTO/params/graph/raw command output/text parser/arg builder、`GitError`、`GitService` runtime implementation 与 git utils 迁移;`bitfun-core::service::git::*` 仅保留 legacy facade re-export。`remote-ssh` 仅迁移了纯 contract/type,SSH runtime manager / fs / terminal 仍保留在 core;`mcp` 仅迁移了 tool-name / tool-info / protocol types / config location / server type-status contract,config service / json config / cursor format / jsonrpc / transport / server process / adapter / auth / dynamic tools 仍保留在 core;`announcement` 仅迁移了纯 types contract,scheduler / state store / content loader / remote fetch 仍保留在 core;`remote-connect` 尚未迁移。它们涉及 SSH runtime、MCP dynamic tool provider、remote agent submission 与 announcement config/path 边界,继续前需要单独确认端口方案与等价性测试。 +- 最新主干的 Deep Review capacity / cost / queue、context profile、evidence ledger、session manifest、stream dedupe、search remote/fallback 与 session rollback persistence 仍属于 core runtime 或对应产品 runtime,不在本轮 `services-integrations` 迁移范围内;如果后续迁移 remote-connect / MCP / search / session,需要先定义运行状态 port 合约和等价测试。 **验证:** @@ -1125,8 +1128,8 @@ cargo check -p bitfun-server **任务:** -- [ ] 将可替换的实现模块改为 re-export。 -- [ ] 在顶层加入关键节点注释: +- [x] 将可替换的实现模块改为 re-export(限本轮已迁移 owner 
crate;高耦合 runtime 保留为 core-owned runtime)。 +- [x] 在顶层加入关键节点注释: ```rust //! Compatibility facade and full product runtime assembly. @@ -1135,17 +1138,24 @@ cargo check -p bitfun-server //! This crate re-exports legacy paths and wires the full BitFun product runtime. ``` -- [ ] `bitfun-core/Cargo.toml` 只保留 facade 和 product assembly 所需依赖。 -- [ ] 旧路径保持 import-compatible。 +- [ ] `bitfun-core/Cargo.toml` 只保留 facade 和 product assembly 所需依赖;当前仍因 core-owned runtime 保留 concrete runtime 依赖,不在本 PR 强行删减。 +- [x] 旧路径保持 import-compatible。 - [ ] 只有所有产品 crate 都显式启用完整 runtime 后,才可以在独立 PR 中评估: ```toml default = [] ``` +**当前收敛状态(2026-05-11):** + +- 本轮不把 `remote-ssh` runtime、MCP runtime、`remote-connect`、announcement runtime、concrete tool implementations、tool registry、miniapp runtime/compiler/builtin、function-agent 运行逻辑声明为已迁移;它们继续作为 `bitfun-core` 的 product runtime assembly 拥有路径。`git` feature group 已外移;`remote-ssh` 目前只外移 contract/type;`mcp` 目前只外移 tool-name / tool-info / protocol types / config location / server type-status contract;`announcement` 目前只外移 types contract。 +- 新增 `scripts/check-core-boundaries.mjs`,用于阻止已拆出的 owner crate 反向依赖 `bitfun-core`。该脚本只证明 crate graph 方向,不替代产品等价性测试。 +- `default = []` 仍保持为后续独立评估项,本轮不调整默认 feature、构建脚本或 release 脚本。 + **验证:** ```powershell +node scripts/check-core-boundaries.mjs cargo check -p bitfun-core --features product-full cargo check -p bitfun-desktop cargo check -p bitfun-cli @@ -1494,12 +1504,14 @@ cargo check --workspace - 新增 crate 数量仍保持中等粒度。 - heavy dependency 所属 crate 清晰。 -**当前 P2 执行状态(2026-05-11):** +**当前 P2 执行状态(2026-05-12):** - 已完成中等粒度 owner crate 成型的安全部分:`bitfun-services-core`、`bitfun-services-integrations`、`bitfun-agent-tools`、`bitfun-tool-packs`、`bitfun-product-domains` 均已加入 workspace。 - 已迁移的模块均由 core facade re-export,未改变产品默认 feature、构建脚本或 release 脚本。 -- 未声明完成的 P2 剩余部分:重 service 迁移、concrete tool implementation 迁移、tool registry/provider 化、miniapp/function-agent 运行逻辑迁移。这些会触碰 `PathManager`、`BitFunError`、`ToolUseContext`、workspace 
service、snapshot wrapper、`AgentSubmissionPort` 或 Git/AI service 边界,需要在继续前显式确认。 -- 本次 rebase 后重新核对最新主干 Deep Review capacity/cost/queue、context profile、evidence ledger 与 session manifest 变更:当前 PR 只迁移已声明的纯类型/低耦合模块并保留 core facade,未改动这些行为路径;后续迁移必须补端口设计和等价测试后再推进。 +- Git feature group 已闭环迁移到 `bitfun-services-integrations` 的 `git` feature:DTO/params/graph/raw command output/text parser/arg builder、`GitError`、`GitService` runtime implementation 与 git utils 均由 integrations owner crate 拥有,并通过 `bitfun-core::service::git::*` 保留旧路径兼容。`GitService` 所需的 Windows `libgit2` system-link 边界挂在该 crate 的 `git` feature 上;`bitfun-core` 仍因未迁移的 remote-connect runtime 保留其它 `git2` 使用。 +- 未声明完成的 P2 剩余部分:remote-ssh runtime、MCP server/adapter/auth/dynamic tools、remote-connect 等重 service 迁移、concrete tool implementation 迁移、tool registry/provider 化、miniapp/function-agent 运行逻辑迁移。这些会触碰 `PathManager`、`BitFunError`、`ToolUseContext`、workspace service、snapshot wrapper、`AgentSubmissionPort` 或 AI service 边界,需要在继续前显式确认。 +- 本次 rebase 后重新核对最新主干 Deep Review capacity/cost/queue、context profile、evidence ledger 与 session manifest 变更:当前 PR 已完成 Git feature group 的 owner crate 归属迁移,但未改动这些 Deep Review 行为路径;后续迁移必须补端口设计和等价测试后再推进。 +- 本次 P2 后续复核结论:上述高耦合剩余项不是纯文件搬迁;若继续迁移会改变依赖方向或需要新增 port/provider 行为合约。因此当前 PR 将它们显式保留为 core-owned runtime,并通过 P3 boundary check 防止已拆 owner crate 回流依赖 core。 **暂停条件:** @@ -1532,12 +1544,27 @@ cargo check --workspace - `default = []` 必须是单独 PR,且只在所有产品 crate 显式启用完整 runtime 后评估。 - 不允许把 facade 变成新的业务实现聚合。 -**P3 进入条件与最新主干补充(2026-05-11):** +**P3 进入条件与最新主干补充(2026-05-12):** - P3 只能在 P2 剩余迁移闭环后启动:重 service 迁移、concrete tool implementation 迁移、tool registry/provider 化、miniapp/function-agent 运行逻辑迁移都必须先完成或显式保留为 core-owned runtime。 -- 最近 `origin/main` 的 Deep Review 变更增加了 context profile、evidence ledger、capacity/cost/queue 控制、`deep_review_run_manifest` / `deep_review_cache`、以及 review-team UI orchestration。P3 facade 收敛前必须确认这些行为要么仍由 core product runtime assembly 拥有,要么已有对应 owner crate + port/provider 合约和等价测试。 -- 
`ToolUseContext` 的 shared-context / evidence checkpoint hook、`TaskTool` / `CodeReviewTool` 的 Deep Review capacity flow、以及 session manifest/cache persistence 不能在 P3 中只通过 re-export 消失;如果外移,需要先补 boundary contract、旧路径兼容和 Deep Review regression。 +- 最近 `origin/main` 的 Deep Review 变更增加了 context profile、evidence ledger、capacity/cost/queue 控制、`deep_review_run_manifest` / `deep_review_cache`、以及 review-team UI orchestration;最新主干还补充了 agent-stream tool-call dedupe、search remote/fallback、session rollback persistence 和 companion typewriter。P3 facade 收敛前必须确认这些行为要么仍由 core product runtime assembly 拥有,要么已有对应 owner crate + port/provider 合约和等价测试。 +- 最新主干的 mode-scoped subagent visibility 将 `agentic::agents` 重组为 definitions / registry / visibility 边界,并扩展了 desktop subagent API 与 Review Team 可见性测试;后续若迁移 agent registry,不能只做路径 re-export,必须保留 mode 可见性过滤、hidden/custom/review 分组语义和前后端 API contract。 +- `ToolUseContext` 的 shared-context / evidence checkpoint hook、`TaskTool` / `CodeReviewTool` 的 Deep Review capacity flow、session manifest/cache persistence、rollback persisted-turn cleanup、search fallback chain 与 stream finish/tool-call contract 不能在 P3 中只通过 re-export 消失;如果外移,需要先补 boundary contract、旧路径兼容和对应 regression。 - P3 的闭环检查应同时覆盖 Rust crate graph 与产品 runtime 行为:边界脚本只证明依赖方向,不能替代 Deep Review、MCP dynamic tools、remote connect、snapshot wrapping、miniapp/function-agent 的产品等价性验证。 +- 当前 PR 的 P3 范围按“显式保留 core-owned runtime + 强制 owner crate 边界”闭环;后续如果要继续外移这些 runtime 路径,需要作为新的迁移批次先补 port 设计、等价测试和用户确认。 + +**阶段复核与后续拆分(2026-05-12 rebase 后):** + +- 当前分支保持小粒度:只包含 boundary check、facade/product runtime 注释、阶段文档和 Git feature group 归属迁移;不继续混入 remote-connect、MCP dynamic tools、tool registry、miniapp 或 function-agent runtime 迁移。 +- 本次 rebase 到 `fork/main` 的新增主干提交 `2e0e2dda` 没有和当前分支产生路径重叠;其 agent visibility 重构不改变本轮 `services-integrations` / service facade 迁移内容,但会把后续 agent registry/provider 外移的等价性门槛抬高。 +- 质量边界:本阶段只证明已拆 owner crate 不依赖回 `bitfun-core`,并证明 Git 旧路径仍通过当前 Rust workspace 检查;不声明 remote connect、MCP dynamic 
tools、snapshot wrapping、miniapp/function-agent runtime 的外移完成。 +- boundary check 已扩展到 `core-types`、`runtime-ports` 和 `agent-tools` 的轻量边界,并覆盖 Cargo inline 依赖和 dependency table 依赖声明,后续不能绕过脚本把重 runtime、concrete service 或 platform adapter 依赖带入这些 contract crate。 +- boundary check 也已锁定 `bitfun-core::service::git`、`bitfun-core::service::remote_ssh::types`、`bitfun-core::service::mcp::{tool_info,tool_name}`、`bitfun-core::service::mcp::protocol::types`、`bitfun-core::service::mcp::config::location` 和 `bitfun-core::service::announcement::types` 的旧路径 facade-only 状态,并禁止在 `MCPServerProcess` runtime 文件重新定义已外移的 server type/status contract。 +- 后续迁移必须拆成可独立审核的提交:先补 port/provider 设计和等价测试,再按 `remote-ssh` runtime、MCP runtime、`remote-connect` 的顺序一次迁移一个 service feature group。 +- tool registry / concrete tool implementation 外移必须先有工具清单等价测试,并保留 dynamic provider metadata;不能把注册名解析、snapshot wrapper 或 runtime restriction 行为改成隐式约定。 +- 已新增内置工具清单基线测试,后续 registry/provider 外移必须先保持该清单和注册顺序等价,再评估 owner crate 边界。 +- miniapp 与 function-agent runtime 外移必须先明确 Git/AI service、PathManager、process execution 和 permission policy 边界;如果需要行为合约变化,必须作为后续单独 PR 并先确认。 +- `bitfun-core default = []` 和依赖版本收敛仍是后续独立评估项,不与 runtime 外移或构建脚本调整混在同一批提交。 **验收门:** diff --git a/scripts/check-core-boundaries.mjs b/scripts/check-core-boundaries.mjs new file mode 100644 index 000000000..cc151af0a --- /dev/null +++ b/scripts/check-core-boundaries.mjs @@ -0,0 +1,477 @@ +#!/usr/bin/env node + +import { readdirSync, readFileSync, statSync } from 'fs'; +import { join, relative } from 'path'; +import { fileURLToPath } from 'url'; +import { dirname } from 'path'; + +const __dirname = dirname(fileURLToPath(import.meta.url)); +const ROOT = join(__dirname, '..'); + +const noCoreDependencyCrates = [ + 'core-types', + 'events', + 'ai-adapters', + 'agent-stream', + 'runtime-ports', + 'services-core', + 'services-integrations', + 'agent-tools', + 'tool-packs', + 'product-domains', + 'terminal', + 'tool-runtime', + 'transport', + 'api-layer', + 
'webdriver', +]; + +const lightweightBoundaryRules = [ + { + crateName: 'core-types', + reason: 'core-types must stay low-level DTO-only', + forbiddenDeps: [ + 'bitfun-core', + 'bitfun-events', + 'bitfun-ai-adapters', + 'bitfun-agent-stream', + 'bitfun-runtime-ports', + 'bitfun-services-core', + 'bitfun-services-integrations', + 'bitfun-agent-tools', + 'bitfun-tool-packs', + 'bitfun-product-domains', + 'bitfun-transport', + 'terminal-core', + 'tool-runtime', + 'tauri', + 'reqwest', + 'git2', + 'rmcp', + 'image', + 'tokio-tungstenite', + ], + }, + { + crateName: 'runtime-ports', + reason: 'runtime-ports must stay DTO/trait-only', + forbiddenDeps: [ + 'bitfun-core', + 'bitfun-agent-stream', + 'bitfun-services-core', + 'bitfun-services-integrations', + 'bitfun-agent-tools', + 'bitfun-tool-packs', + 'bitfun-product-domains', + 'bitfun-transport', + 'terminal-core', + 'tool-runtime', + 'tauri', + 'reqwest', + 'git2', + 'rmcp', + 'image', + 'tokio-tungstenite', + ], + }, + { + crateName: 'agent-tools', + reason: 'agent-tools must not depend on concrete service or product runtime implementations', + forbiddenDeps: [ + 'bitfun-core', + 'bitfun-services-core', + 'bitfun-services-integrations', + 'bitfun-tool-packs', + 'bitfun-product-domains', + 'bitfun-transport', + 'terminal-core', + 'tool-runtime', + 'tauri', + 'reqwest', + 'git2', + 'rmcp', + 'tokio-tungstenite', + ], + }, +]; + +const facadeOnlyFiles = [ + { + path: 'src/crates/core/src/service/git/git_service.rs', + importPrefix: 'bitfun_services_integrations::git', + reason: 'core git service facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/git/git_types.rs', + importPrefix: 'bitfun_services_integrations::git', + reason: 'core git types facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/git/git_utils.rs', + importPrefix: 'bitfun_services_integrations::git', + reason: 'core git utils facade must only re-export 
the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/git/graph.rs', + importPrefix: 'bitfun_services_integrations::git', + reason: 'core git graph facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/remote_ssh/types.rs', + importPrefix: 'bitfun_services_integrations::remote_ssh', + reason: 'core remote SSH types facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/mcp/tool_info.rs', + importPrefix: 'bitfun_services_integrations::mcp', + reason: 'core MCP tool info facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/mcp/tool_name.rs', + importPrefix: 'bitfun_services_integrations::mcp', + reason: 'core MCP tool name facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/mcp/protocol/types.rs', + importPrefix: 'bitfun_services_integrations::mcp', + reason: 'core MCP protocol types facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/mcp/config/location.rs', + importPrefix: 'bitfun_services_integrations::mcp', + reason: 'core MCP config location facade must only re-export the integrations owner crate', + }, + { + path: 'src/crates/core/src/service/announcement/types.rs', + importPrefix: 'bitfun_services_integrations::announcement', + reason: 'core announcement types facade must only re-export the integrations owner crate', + }, +]; + +const forbiddenContentRules = [ + { + path: 'src/crates/core/src/service/mcp/server/process.rs', + patterns: [ + { + regex: /\bpub enum MCPServerType\b/, + message: 'core MCP server process runtime must not redefine MCPServerType; use the integrations contract', + }, + { + regex: /\bpub enum MCPServerStatus\b/, + message: 'core MCP server process runtime must not redefine MCPServerStatus; use the integrations contract', + }, + ], + }, +]; + +const failures = []; + 
+function toRepoPath(path) { + return relative(ROOT, path).replace(/\\/g, '/'); +} + +function readText(path) { + return readFileSync(path, 'utf8'); +} + +function walkFiles(dir, visit) { + for (const entry of readdirSync(dir)) { + const path = join(dir, entry); + const stat = statSync(path); + if (stat.isDirectory()) { + walkFiles(path, visit); + continue; + } + visit(path); + } +} + +function rustImportName(depName) { + return depName.replace(/-/g, '_'); +} + +function escapeRegex(text) { + return text.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); +} + +function manifestDependencyHeaderPattern(depName) { + const depPattern = `(?:${escapeRegex(depName)}|"${escapeRegex(depName)}")`; + return new RegExp( + `^\\[(?:target\\.[^\\]]+\\.)?(?:dependencies|dev-dependencies|build-dependencies)\\.${depPattern}\\]$`, + ); +} + +function isManifestDependencyDeclaration(trimmedLine, depName) { + const isInlineDependency = new RegExp(`^${escapeRegex(depName)}\\s*=`).test(trimmedLine); + const isDependencyTable = manifestDependencyHeaderPattern(depName).test(trimmedLine); + return isInlineDependency || isDependencyTable; +} + +function runManifestParserSelfTest() { + const positiveCases = [ + 'bitfun-core = { path = "../core" }', + '[dependencies.bitfun-core]', + '[dev-dependencies."bitfun-core"]', + "[target.'cfg(windows)'.dependencies.bitfun-core]", + "[target.'cfg(unix)'.build-dependencies.\"bitfun-core\"]", + ]; + const negativeCases = [ + '# bitfun-core = { path = "../core" }', + '[dependencies]', + '[workspace.dependencies.bitfun-core]', + '[dependencies.bitfun-core-extra]', + ]; + + for (const line of positiveCases) { + if (!isManifestDependencyDeclaration(line, 'bitfun-core')) { + throw new Error(`manifest parser missed dependency declaration: ${line}`); + } + } + for (const line of negativeCases) { + if (isManifestDependencyDeclaration(line, 'bitfun-core')) { + throw new Error(`manifest parser matched non-dependency declaration: ${line}`); + } + } + + const 
acceptsGitFacadeLine = createFacadeLineChecker('bitfun_services_integrations::git'); + const facadePositiveCases = [ + '', + '//! Compatibility facade.', + 'pub use bitfun_services_integrations::git::GitService;', + 'pub use bitfun_services_integrations::git::types::*;', + 'pub use bitfun_services_integrations::git::{', + ' build_git_graph, build_git_graph_for_branch,', + '};', + 'pub use bitfun_services_integrations::git::{build_git_graph, build_git_graph_for_branch};', + ]; + for (const line of facadePositiveCases) { + if (!acceptsGitFacadeLine(line)) { + throw new Error(`facade parser rejected allowed line: ${line}`); + } + } + + const rejectsGitImplementationLine = createFacadeLineChecker('bitfun_services_integrations::git'); + const facadeNegativeCases = [ + 'pub mod service;', + 'use bitfun_services_integrations::git::GitService;', + 'fn parse_git_status() {}', + ]; + for (const line of facadeNegativeCases) { + if (rejectsGitImplementationLine(line)) { + throw new Error(`facade parser accepted implementation line: ${line}`); + } + } +} + +function checkCargoManifest(crateDir) { + checkForbiddenManifestDeps(crateDir, ['bitfun-core'], () => { + return 'extracted crate must not depend on bitfun-core'; + }); +} + +function checkForbiddenManifestDeps(crateDir, forbiddenDeps, messageForDep) { + const manifestPath = join(crateDir, 'Cargo.toml'); + const lines = readText(manifestPath).split(/\r?\n/); + lines.forEach((line, index) => { + const trimmed = line.trim(); + if (trimmed.startsWith('#')) { + return; + } + for (const dep of forbiddenDeps) { + if (isManifestDependencyDeclaration(trimmed, dep)) { + failures.push({ + path: manifestPath, + line: index + 1, + message: messageForDep(dep), + }); + } + } + }); +} + +function checkRustImports(crateDir) { + const srcDir = join(crateDir, 'src'); + try { + if (!statSync(srcDir).isDirectory()) { + return; + } + } catch { + return; + } + + walkFiles(srcDir, (path) => { + if (!path.endsWith('.rs')) { + return; + } + const 
lines = readText(path).split(/\r?\n/); + lines.forEach((line, index) => { + if (/\bbitfun_core::/.test(line)) { + failures.push({ + path, + line: index + 1, + message: 'extracted crate must not import bitfun_core', + }); + } + }); + }); +} + +function checkForbiddenRustImports(crateDir, forbiddenDeps, messageForDep) { + const srcDir = join(crateDir, 'src'); + try { + if (!statSync(srcDir).isDirectory()) { + return; + } + } catch { + return; + } + + const forbiddenImports = forbiddenDeps.map((dep) => ({ + dep, + pattern: new RegExp(`\\b${escapeRegex(rustImportName(dep))}::`), + })); + + walkFiles(srcDir, (path) => { + if (!path.endsWith('.rs')) { + return; + } + const lines = readText(path).split(/\r?\n/); + lines.forEach((line, index) => { + for (const forbidden of forbiddenImports) { + if (forbidden.pattern.test(line)) { + failures.push({ + path, + line: index + 1, + message: messageForDep(forbidden.dep), + }); + } + } + }); + }); +} + +function createFacadeLineChecker(importPrefix) { + let inPubUseBlock = false; + const escapedPrefix = escapeRegex(importPrefix); + const singleReexportPattern = new RegExp( + `^pub use ${escapedPrefix}(?:::[A-Za-z_][A-Za-z0-9_]*)*(?:::\\*)?;$`, + ); + const blockItemPattern = /^[A-Za-z_][A-Za-z0-9_]*(?:,\s*[A-Za-z_][A-Za-z0-9_]*)*,?$/; + const blockStart = `pub use ${importPrefix}::{`; + + const checker = (line) => { + const trimmed = line.trim(); + if ( + trimmed === '' || + trimmed.startsWith('//') || + trimmed.startsWith('/*') || + trimmed.startsWith('*') || + trimmed.startsWith('*/') + ) { + return true; + } + + if (inPubUseBlock) { + if (trimmed === '};') { + inPubUseBlock = false; + return true; + } + return blockItemPattern.test(trimmed); + } + + if (singleReexportPattern.test(trimmed)) { + return true; + } + + if (trimmed.startsWith(blockStart)) { + if (trimmed.endsWith('};')) { + return true; + } + if (trimmed.endsWith('{')) { + inPubUseBlock = true; + return true; + } + } + + return false; + }; + + checker.isComplete = () 
=> !inPubUseBlock; + return checker; +} + +function checkFacadeOnlyFile(repoPath, importPrefix, reason) { + const path = join(ROOT, ...repoPath.split('/')); + const acceptsLine = createFacadeLineChecker(importPrefix); + const lines = readText(path).split(/\r?\n/); + lines.forEach((line, index) => { + if (!acceptsLine(line)) { + failures.push({ + path, + line: index + 1, + message: reason, + }); + } + }); + + if (!acceptsLine.isComplete()) { + failures.push({ + path, + line: lines.length, + message: `${reason}; unterminated pub use block`, + }); + } +} + +function checkForbiddenContent(repoPath, patterns) { + const path = join(ROOT, ...repoPath.split('/')); + const lines = readText(path).split(/\r?\n/); + lines.forEach((line, index) => { + for (const pattern of patterns) { + if (pattern.regex.test(line)) { + failures.push({ + path, + line: index + 1, + message: pattern.message, + }); + } + } + }); +} + +if (process.env.BITFUN_BOUNDARY_CHECK_SELF_TEST === '1') { + runManifestParserSelfTest(); + console.log('Core boundary check self-test passed.'); + process.exit(0); +} + +for (const crateName of noCoreDependencyCrates) { + const crateDir = join(ROOT, 'src', 'crates', crateName); + checkCargoManifest(crateDir); + checkRustImports(crateDir); +} + +for (const rule of lightweightBoundaryRules) { + const crateDir = join(ROOT, 'src', 'crates', rule.crateName); + const messageForDep = (dep) => `${rule.reason}; forbidden dependency: ${dep}`; + checkForbiddenManifestDeps(crateDir, rule.forbiddenDeps, messageForDep); + checkForbiddenRustImports(crateDir, rule.forbiddenDeps, messageForDep); +} + +for (const facade of facadeOnlyFiles) { + checkFacadeOnlyFile(facade.path, facade.importPrefix, facade.reason); +} + +for (const rule of forbiddenContentRules) { + checkForbiddenContent(rule.path, rule.patterns); +} + +if (failures.length > 0) { + console.error('Core boundary check failed.'); + for (const failure of failures) { + 
console.error(`${toRepoPath(failure.path)}:${failure.line}: ${failure.message}`); + } + process.exit(1); +} + +console.log('Core boundary check passed.'); diff --git a/src/crates/core/src/agentic/mod.rs b/src/crates/core/src/agentic/mod.rs index c6b40fe7a..df12e2630 100644 --- a/src/crates/core/src/agentic/mod.rs +++ b/src/crates/core/src/agentic/mod.rs @@ -1,6 +1,7 @@ -//! Agentic Module +//! Agentic facade and product runtime assembly. //! -//! Core AI Agent service system +//! Portable contracts move to owner crates first; concrete orchestration stays +//! here until it can be split without changing tool, session, or review flows. // Core module pub mod core; diff --git a/src/crates/core/src/agentic/tools/registry.rs b/src/crates/core/src/agentic/tools/registry.rs index e77e83b26..e79d7c7b8 100644 --- a/src/crates/core/src/agentic/tools/registry.rs +++ b/src/crates/core/src/agentic/tools/registry.rs @@ -400,6 +400,51 @@ mod tests { assert!(registry.get_tool("Cron").is_some()); } + #[test] + fn registry_preserves_builtin_tool_manifest_for_owner_migration() { + let registry = create_tool_registry(); + + assert_eq!( + registry.get_tool_names(), + vec![ + "LS", + "Read", + "Glob", + "Grep", + "Write", + "Edit", + "Delete", + "Bash", + "TerminalControl", + "SessionControl", + "SessionMessage", + "SessionHistory", + "TodoWrite", + "Cron", + "Task", + "Skill", + "AskUserQuestion", + "WebSearch", + "WebFetch", + "ListMCPResources", + "ReadMCPResource", + "ListMCPPrompts", + "GetMCPPrompt", + "GenerativeUI", + "GetFileDiff", + "Log", + "Git", + "CreatePlan", + "submit_code_review", + "InitMiniApp", + "ControlHub", + "ComputerUse", + "Playbook", + ], + "builtin tool manifest must stay stable before moving registry ownership" + ); + } + #[tokio::test] async fn dynamic_tool_provider_uses_explicit_provider_metadata() { let mut registry = ToolRegistry::new(); diff --git a/src/crates/core/src/lib.rs b/src/crates/core/src/lib.rs index 4a7d59cba..90720a021 100644 --- 
a/src/crates/core/src/lib.rs +++ b/src/crates/core/src/lib.rs @@ -1,15 +1,18 @@ #![allow(non_snake_case)] #![recursion_limit = "256"] -// BitFun Core Library - Platform-agnostic business logic -// Four-layer architecture: Util -> Infrastructure -> Service -> Agentic - -pub mod agentic; // Agentic service layer - Agent system, tool system -pub mod function_agents; // Function Agents - Function-based agents -pub mod infrastructure; // Infrastructure layer - AI clients, storage, logging, events -pub mod miniapp; -pub mod service; // Service layer - Workspace, Config, FileSystem, Terminal, Git -pub mod util; // Utility layer - General types, errors, helper functions // MiniApp - AI-generated instant apps (Zero-Dialect Runtime) - // Re-export debug_log from infrastructure for backward compatibility +//! Compatibility facade and full product runtime assembly. +//! +//! New implementation code should live in owner crates under `src/crates/*`. +//! This crate re-exports legacy paths and wires the full BitFun product runtime. + +pub mod agentic; // Agent system, tool system, and product runtime orchestration +pub mod function_agents; // Function-based agents +pub mod infrastructure; // AI clients, storage, logging, events +pub mod miniapp; // AI-generated instant apps (Zero-Dialect Runtime) +pub mod service; // Workspace, Config, FileSystem, Terminal, Git +pub mod util; // General types, errors, helper functions + +// Re-export debug_log from infrastructure for backward compatibility. pub use infrastructure::debug_log as debug; // Export main types diff --git a/src/crates/core/src/service/announcement/types.rs b/src/crates/core/src/service/announcement/types.rs index 3cda8d41f..958a1543f 100644 --- a/src/crates/core/src/service/announcement/types.rs +++ b/src/crates/core/src/service/announcement/types.rs @@ -1,235 +1 @@ -//! Announcement system types. -//! -//! Defines all data structures for the announcement / feature-demo / tips mechanism. 
- -use serde::{Deserialize, Serialize}; -use std::collections::HashSet; - -/// Categories of announcement cards. -#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)] -#[serde(rename_all = "snake_case")] -pub enum CardType { - /// New version feature showcase. - Feature, - /// Operational news or blog post. - News, - /// Lightweight usage tip (toast only, no modal). - Tip, - /// Important system announcement (shown as modal without prior toast). - Announcement, -} - -/// Origin of an announcement card. -#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)] -#[serde(rename_all = "snake_case")] -pub enum CardSource { - /// Statically registered in the local binary. - Local, - /// Downloaded from a remote endpoint. - Remote, - /// Built-in tips pool. - BuiltinTip, -} - -/// Conditions that must be met before a card is shown. -#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)] -#[serde(tag = "type", rename_all = "snake_case")] -pub enum TriggerCondition { - /// First launch after a version upgrade. - VersionFirstOpen, - /// The N-th time the application has been opened (1-indexed). - AppNthOpen { n: u64 }, - /// A named application feature was used (supplied programmatically). - FeatureUsed { feature: String }, - /// Must be triggered manually via `trigger_announcement`. - Manual, - /// Always eligible (used for announcements that should appear on every start until dismissed). - Always, -} - -/// When and how a card should be presented. -#[derive(Debug, Clone, Serialize, Deserialize)] -pub struct TriggerRule { - pub condition: TriggerCondition, - /// Milliseconds to wait after application start before displaying. - #[serde(default)] - pub delay_ms: u64, - /// When true, a card is only shown once per application version. 
- #[serde(default = "default_true")] - pub once_per_version: bool, -} - -fn default_true() -> bool { - true -} - -impl Default for TriggerRule { - fn default() -> Self { - Self { - condition: TriggerCondition::VersionFirstOpen, - delay_ms: 2000, - once_per_version: true, - } - } -} - -/// Configuration for the bottom-left toast entry point. -#[derive(Debug, Clone, Serialize, Deserialize)] -pub struct ToastConfig { - /// Icon identifier or emoji string (rendered by the frontend). - pub icon: String, - /// Toast title (i18n key or literal text). - pub title: String, - /// Short description shown below the title (i18n key or literal text). - pub description: String, - /// Label for the primary action button (i18n key or literal text). - #[serde(default)] - pub action_label: String, - /// Whether the user can close the toast without acting. - #[serde(default = "default_true")] - pub dismissible: bool, - /// Auto-dismiss after this many milliseconds; `None` means no auto-dismiss. - #[serde(default)] - pub auto_dismiss_ms: Option<u64>, -} - -/// Preferred modal size. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "snake_case")] -pub enum ModalSize { - Sm, - Md, - Lg, - Xl, -} - -impl Default for ModalSize { - fn default() -> Self { - ModalSize::Lg - } -} - -/// What happens when the user finishes or closes the modal. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "snake_case")] -pub enum CompletionAction { - /// Only dismiss for this session; may reappear next launch if conditions match. - Dismiss, - /// Permanently suppress via `never_show_ids`. - NeverShowAgain, -} - -impl Default for CompletionAction { - fn default() -> Self { - CompletionAction::Dismiss - } -} - -/// Layout template for a single modal page. 
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "snake_case")]
-pub enum PageLayout {
-    TextOnly,
-    MediaLeft,
-    MediaRight,
-    MediaTop,
-    FullscreenMedia,
-}
-
-impl Default for PageLayout {
-    fn default() -> Self {
-        PageLayout::MediaTop
-    }
-}
-
-/// Media asset type.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "snake_case")]
-pub enum MediaType {
-    Lottie,
-    Video,
-    Image,
-    Gif,
-}
-
-/// A media asset attached to a modal page.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct MediaConfig {
-    pub media_type: MediaType,
-    /// Relative path under `public/announcements/` or an HTTPS URL.
-    pub src: String,
-}
-
-/// A single page inside a feature modal.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct ModalPage {
-    #[serde(default)]
-    pub layout: PageLayout,
-    /// Page title (i18n key or literal text).
-    pub title: String,
-    /// Body copy in Markdown (i18n key or literal text).
-    pub body: String,
-    #[serde(default)]
-    pub media: Option<MediaConfig>,
-}
-
-/// Full configuration for the centre modal overlay.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct ModalConfig {
-    #[serde(default)]
-    pub size: ModalSize,
-    /// Allow the user to close the modal with the × button.
-    #[serde(default = "default_true")]
-    pub closable: bool,
-    pub pages: Vec<ModalPage>,
-    #[serde(default)]
-    pub completion_action: CompletionAction,
-}
-
-/// A single announcement / feature-demo card.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct AnnouncementCard {
-    /// Globally unique identifier, e.g. `feature_v1_3_0_miniapp`.
-    pub id: String,
-    pub card_type: CardType,
-    pub source: CardSource,
-    /// Application version this card is associated with. `None` = any version.
-    #[serde(default)]
-    pub app_version: Option<String>,
-    /// Higher priority cards are shown first.
-    #[serde(default)]
-    pub priority: i32,
-    pub trigger: TriggerRule,
-    pub toast: ToastConfig,
-    /// If `None`, no modal is opened when the user clicks the toast action.
-    #[serde(default)]
-    pub modal: Option<ModalConfig>,
-    /// Unix timestamp (seconds) after which the card is ignored. Remote cards only.
-    #[serde(default)]
-    pub expires_at: Option<i64>,
-}
-
-/// Persisted state for the announcement system.
-///
-/// Stored at `~/.config/bitfun/config/announcement-state.json`.
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct AnnouncementState {
-    /// Version string recorded when the state was last saved.
-    #[serde(default)]
-    pub last_seen_version: String,
-    /// How many times the application has been opened.
-    #[serde(default)]
-    pub app_open_count: u64,
-    /// IDs of cards the user has seen (action button clicked or modal opened).
-    #[serde(default)]
-    pub seen_ids: HashSet<String>,
-    /// IDs dismissed for the current version cycle; reset on version upgrade.
-    #[serde(default)]
-    pub dismissed_ids: HashSet<String>,
-    /// IDs the user has permanently suppressed.
-    #[serde(default)]
-    pub never_show_ids: HashSet<String>,
-    /// Unix timestamp (seconds) of the last successful remote fetch.
-    #[serde(default)]
-    pub last_remote_fetch_at: Option<i64>,
-}
+pub use bitfun_services_integrations::announcement::*;
diff --git a/src/crates/core/src/service/git/git_service.rs b/src/crates/core/src/service/git/git_service.rs
index e1523e1b7..c173e5914 100644
--- a/src/crates/core/src/service/git/git_service.rs
+++ b/src/crates/core/src/service/git/git_service.rs
@@ -1,1244 +1 @@
-use super::git_types::*;
-use super::git_utils::*;
-/**
- * Git service implementation
- */
-use crate::util::elapsed_ms_u64;
-use git2::{BranchType, Commit, Repository};
-use std::path::Path;
-use std::time::Duration;
-use std::time::Instant;
-use tokio::time::timeout;
-
-pub struct GitService;
-
-type CommitStats = (Option<i32>, Option<i32>, Option<i32>);
-
-fn parse_name_status_output(output: &str) -> Vec<GitChangedFile> {
-    output
-        .lines()
-        .filter_map(|line| {
-            let mut parts = line.split('\t');
-            let raw_status = parts.next()?.trim();
-            if raw_status.is_empty() {
-                return None;
-            }
-
-            let status = match raw_status.chars().next().unwrap_or_default() {
-                'A' => GitChangedFileStatus::Added,
-                'M' => GitChangedFileStatus::Modified,
-                'D' => GitChangedFileStatus::Deleted,
-                'R' => GitChangedFileStatus::Renamed,
-                'C' => GitChangedFileStatus::Copied,
-                _ => GitChangedFileStatus::Unknown,
-            };
-
-            match status {
-                GitChangedFileStatus::Renamed | GitChangedFileStatus::Copied => {
-                    let old_path = parts.next()?.to_string();
-                    let path = parts.next()?.to_string();
-                    Some(GitChangedFile {
-                        path,
-                        old_path: Some(old_path),
-                        status,
-                    })
-                }
-                _ => {
-                    let path = parts.next()?.to_string();
-                    Some(GitChangedFile {
-                        path,
-                        old_path: None,
-                        status,
-                    })
-                }
-            }
-        })
-        .collect()
-}
-
-impl GitService {
-    /// Checks whether the path is a Git repository.
-    pub async fn is_repository<P: AsRef<Path>>(path: P) -> Result<bool, GitError> {
-        Ok(is_git_repository(path))
-    }
-
-    /// Gets repository information.
-    pub async fn get_repository<P: AsRef<Path>>(path: P) -> Result<GitRepository, GitError> {
-        let _start_time = Instant::now();
-
-        let repo =
-            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-        let current_branch = get_current_branch(&repo)?;
-        let is_bare = repo.is_bare();
-        let has_changes = !get_file_statuses(&repo)?.is_empty();
-
-        let remotes = repo
-            .remotes()
-            .map_err(|e| GitError::CommandFailed(e.to_string()))?
-            .iter()
-            .filter_map(|name| name.map(|s| s.to_string()))
-            .collect();
-
-        let path_str = path.as_ref().to_string_lossy().to_string();
-        let name = path
-            .as_ref()
-            .file_name()
-            .unwrap_or_default()
-            .to_string_lossy()
-            .to_string();
-
-        Ok(GitRepository {
-            path: path_str,
-            name,
-            current_branch,
-            is_bare,
-            has_changes,
-            remotes,
-        })
-    }
-
-    /// Gets repository status.
-    pub async fn get_status<P: AsRef<Path>>(path: P) -> Result<GitStatus, GitError> {
-        let repo =
-            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-        let current_branch = get_current_branch(&repo)?;
-        let file_statuses = get_file_statuses(&repo)?;
-
-        let mut staged = Vec::new();
-        let mut unstaged = Vec::new();
-        let mut untracked = Vec::new();
-
-        for status in file_statuses {
-            if status.status.contains('?') {
-                untracked.push(status.path);
-            } else {
-                if status.index_status.is_some() {
-                    staged.push(status.clone());
-                }
-                if status.workdir_status.is_some() {
-                    unstaged.push(status);
-                }
-            }
-        }
-
-        let (ahead, behind) =
-            Self::get_ahead_behind_count(&repo, &current_branch).unwrap_or((0, 0));
-
-        Ok(GitStatus {
-            staged,
-            unstaged,
-            untracked,
-            current_branch,
-            ahead,
-            behind,
-        })
-    }
-
-    /// Gets the branch list.
-    pub async fn get_branches<P: AsRef<Path>>(
-        path: P,
-        include_remote: bool,
-    ) -> Result<Vec<GitBranch>, GitError> {
-        let repo =
-            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-        let mut branches = Vec::new();
-        let current_branch = get_current_branch(&repo)?;
-
-        let local_branches = repo
-            .branches(Some(BranchType::Local))
-            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-        for branch_result in local_branches {
-            let (branch, _) = branch_result.map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-            if let Some(name) = branch
-                .name()
-                .map_err(|e| GitError::CommandFailed(e.to_string()))?
-            {
-                let is_current = name == current_branch;
-                let upstream = branch.upstream().ok().and_then(|upstream_branch| {
-                    upstream_branch.name().ok().flatten().map(|s| s.to_string())
-                });
-
-                let (last_commit, last_commit_date) =
-                    if let Ok(commit) = branch.get().peel_to_commit() {
-                        (
-                            Some(commit.id().to_string()),
-                            Some(format_timestamp(commit.time().seconds())),
-                        )
-                    } else {
-                        (None, None)
-                    };
-
-                let (ahead, behind) = if is_current {
-                    Self::get_ahead_behind_count(&repo, name).unwrap_or((0, 0))
-                } else {
-                    (0, 0)
-                };
-
-                branches.push(GitBranch {
-                    name: name.to_string(),
-                    current: is_current,
-                    remote: false,
-                    upstream,
-                    ahead,
-                    behind,
-                    last_commit,
-                    last_commit_date: last_commit_date.clone(),
-
-                    base_branch: None,
-                    child_branches: None,
-                    merged_branches: None,
-                    branch_type: Some(Self::determine_branch_type(name)),
-                    has_conflicts: None,
-                    can_merge: None,
-                    is_stale: None,
-                    merge_status: None,
-                    stats: None,
-                    created_at: None,
-                    last_activity_at: last_commit_date,
-                    tags: None,
-                    description: None,
-                    linked_issues: None,
-                });
-            }
-        }
-
-        if include_remote {
-            let remote_branches = repo
-                .branches(Some(BranchType::Remote))
-                .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-            for branch_result in remote_branches {
-                let (branch, _) =
-                    branch_result.map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-                if let Some(name) = branch
-                    .name()
-                    .map_err(|e| GitError::CommandFailed(e.to_string()))?
-                {
-                    let (last_commit, last_commit_date) =
-                        if let Ok(commit) = branch.get().peel_to_commit() {
-                            (
-                                Some(commit.id().to_string()),
-                                Some(format_timestamp(commit.time().seconds())),
-                            )
-                        } else {
-                            (None, None)
-                        };
-
-                    branches.push(GitBranch {
-                        name: name.to_string(),
-                        current: false,
-                        remote: true,
-                        upstream: None,
-                        ahead: 0,
-                        behind: 0,
-                        last_commit,
-                        last_commit_date: last_commit_date.clone(),
-
-                        base_branch: None,
-                        child_branches: None,
-                        merged_branches: None,
-                        branch_type: Some(Self::determine_branch_type(name)),
-                        has_conflicts: None,
-                        can_merge: None,
-                        is_stale: None,
-                        merge_status: None,
-                        stats: None,
-                        created_at: None,
-                        last_activity_at: last_commit_date,
-                        tags: None,
-                        description: None,
-                        linked_issues: None,
-                    });
-                }
-            }
-        }
-
-        Ok(branches)
-    }
-
-    /// Gets branches with detailed information.
-    pub async fn get_enhanced_branches<P: AsRef<Path>>(
-        path: P,
-        include_remote: bool,
-    ) -> Result<Vec<GitBranch>, GitError> {
-        let mut branches = Self::get_branches(&path, include_remote).await?;
-
-        Self::analyze_branch_relations(&mut branches)?;
-
-        let repo =
-            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-        for branch in &mut branches {
-            if !branch.remote {
-                branch.stats = Self::calculate_branch_stats(&repo, &branch.name).ok();
-                branch.is_stale = Some(Self::is_branch_stale(branch));
-                branch.can_merge = Self::can_merge_safely(&repo, &branch.name).ok();
-                branch.has_conflicts = branch.can_merge.map(|can| !can);
-            }
-        }
-
-        Ok(branches)
-    }
-
-    /// Determines the branch type.
-    fn determine_branch_type(branch_name: &str) -> String {
-        if branch_name.starts_with("feature/") || branch_name.starts_with("feat/") {
-            "feature".to_string()
-        } else if branch_name.starts_with("hotfix/") || branch_name.starts_with("fix/") {
-            "hotfix".to_string()
-        } else if branch_name.starts_with("release/") || branch_name.starts_with("rel/") {
-            "release".to_string()
-        } else if branch_name.starts_with("bugfix/") || branch_name.starts_with("bug/") {
-            "bugfix".to_string()
-        } else if branch_name.starts_with("chore/") {
-            "chore".to_string()
-        } else if branch_name.starts_with("docs/") {
-            "docs".to_string()
-        } else if branch_name.starts_with("test/") {
-            "test".to_string()
-        } else if ["main", "master", "develop", "development"].contains(&branch_name) {
-            "main".to_string()
-        } else {
-            "other".to_string()
-        }
-    }
-
-    /// Analyzes branch relationships.
-    fn analyze_branch_relations(branches: &mut [GitBranch]) -> Result<(), GitError> {
-        let main_branches = ["main", "master", "develop"];
-
-        let available_main_branches: Vec<String> = branches
-            .iter()
-            .filter(|b| !b.remote && main_branches.contains(&b.name.as_str()))
-            .map(|b| b.name.clone())
-            .collect();
-
-        for branch in branches.iter_mut() {
-            if !branch.remote && !main_branches.contains(&branch.name.as_str()) {
-                if let Some(main_branch) = available_main_branches.first() {
-                    branch.base_branch = Some(main_branch.clone());
-                }
-            }
-        }
-
-        let mut child_map: std::collections::HashMap<String, Vec<String>> =
-            std::collections::HashMap::new();
-
-        for branch in branches.iter() {
-            if let Some(base) = &branch.base_branch {
-                child_map
-                    .entry(base.clone())
-                    .or_default()
-                    .push(branch.name.clone());
-            }
-        }
-
-        for branch in branches.iter_mut() {
-            if let Some(children) = child_map.get(&branch.name) {
-                branch.child_branches = Some(children.clone());
-            }
-        }
-
-        Ok(())
-    }
-
-    /// Computes branch statistics.
-    fn calculate_branch_stats(
-        repo: &Repository,
-        branch_name: &str,
-    ) -> Result<GitBranchStats, GitError> {
-        let branch_ref = repo
-            .find_branch(branch_name, BranchType::Local)
-            .map_err(|e| GitError::BranchNotFound(e.to_string()))?;
-
-        let target = branch_ref
-            .get()
-            .target()
-            .ok_or_else(|| GitError::CommandFailed("Branch has no target".to_string()))?;
-
-        let mut revwalk = repo
-            .revwalk()
-            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-        revwalk
-            .push(target)
-            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-        let commit_count = revwalk.count() as i32;
-
-        Ok(GitBranchStats {
-            commit_count,
-            contributor_count: 1,
-            file_changes: 0,
-            lines_changed: GitLinesChanged {
-                additions: 0,
-                deletions: 0,
-            },
-            activity_score: std::cmp::min(commit_count * 2, 100),
-        })
-    }
-
-    /// Checks whether a branch is stale.
-    fn is_branch_stale(branch: &GitBranch) -> bool {
-        !matches!(&branch.last_commit_date, Some(_last_commit_date))
-    }
-
-    /// Checks whether a branch can be merged safely.
-    fn can_merge_safely(_repo: &Repository, _branch_name: &str) -> Result<bool, GitError> {
-        Ok(true)
-    }
-
-    /// Gets commit history.
-    pub async fn get_commits<P: AsRef<Path>>(
-        path: P,
-        params: GitLogParams,
-    ) -> Result<Vec<GitCommit>, GitError> {
-        let repo =
-            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-        let mut revwalk = repo
-            .revwalk()
-            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-        revwalk
-            .push_head()
-            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-        let mut commits = Vec::new();
-        let mut count = 0;
-        let skip = params.skip.unwrap_or(0);
-        let max_count = params.max_count.unwrap_or(50);
-
-        for (index, oid_result) in revwalk.enumerate() {
-            if index < skip as usize {
-                continue;
-            }
-
-            if count >= max_count {
-                break;
-            }
-
-            let oid = oid_result.map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-            let commit = repo
-                .find_commit(oid)
-                .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-            let author = commit.author();
-            let message = commit.message().unwrap_or("").to_string();
-
-            if let Some(author_filter) = &params.author {
-                if !author.name().unwrap_or("").contains(author_filter) {
-                    continue;
-                }
-            }
-
-            if let Some(grep_filter) = &params.grep {
-                if !message.contains(grep_filter) {
-                    continue;
-                }
-            }
-
-            let parents: Vec<String> = commit.parent_ids().map(|id| id.to_string()).collect();
-
-            let (additions, deletions, files_changed) = if params.stat.unwrap_or(false) {
-                Self::get_commit_stats(&repo, &commit).unwrap_or((None, None, None))
-            } else {
-                (None, None, None)
-            };
-
-            commits.push(GitCommit {
-                hash: commit.id().to_string(),
-                short_hash: commit.id().to_string()[..7].to_string(),
-                message,
-                author: author.name().unwrap_or("Unknown").to_string(),
-                author_email: author.email().unwrap_or("").to_string(),
-                date: format_timestamp(commit.time().seconds()),
-                parents,
-                additions,
-                deletions,
-                files_changed,
-            });
-
-            count += 1;
-        }
-
-        Ok(commits)
-    }
-
-    /// Adds files to the staging area.
-    pub async fn add_files<P: AsRef<Path>>(
-        path: P,
-        params: GitAddParams,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["add"];
-
-        if params.all.unwrap_or(false) {
-            args.push("-A");
-        } else if params.update.unwrap_or(false) {
-            args.push("-u");
-        } else {
-            for file in &params.files {
-                args.push(file);
-            }
-        }
-
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "files": params.files,
-                "all": params.all,
-                "update": params.update
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Commits changes.
-    pub async fn commit<P: AsRef<Path>>(
-        path: P,
-        params: GitCommitParams,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec![
-            "commit".to_string(),
-            "-m".to_string(),
-            params.message.clone(),
-        ];
-
-        if params.amend.unwrap_or(false) {
-            args.push("--amend".to_string());
-        }
-
-        if params.all.unwrap_or(false) {
-            args.push("-a".to_string());
-        }
-
-        if params.no_verify.unwrap_or(false) {
-            args.push("--no-verify".to_string());
-        }
-
-        if let Some(author) = &params.author {
-            args.push("--author".to_string());
-            args.push(format!("{} <{}>", author.name, author.email));
-        }
-
-        let args_refs: Vec<&str> = args.iter().map(|s| s.as_str()).collect();
-        let output = execute_git_command(&repo_path, &args_refs).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "message": params.message,
-                "amend": params.amend,
-                "all": params.all,
-                "noVerify": params.no_verify,
-                "author": params.author
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Pushes changes.
-    pub async fn push<P: AsRef<Path>>(
-        path: P,
-        params: GitPushParams,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["push"];
-
-        if params.force.unwrap_or(false) {
-            args.push("--force");
-        }
-
-        if params.set_upstream.unwrap_or(false) {
-            args.push("-u");
-        }
-
-        if let Some(remote) = &params.remote {
-            args.push(remote);
-        }
-
-        if let Some(branch) = &params.branch {
-            args.push(branch);
-        }
-
-        let output = timeout(
-            Duration::from_secs(30),
-            execute_git_command(&repo_path, &args),
-        )
-        .await
-        .map_err(|_| GitError::NetworkError("Push operation timed out".to_string()))??;
-
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "remote": params.remote,
-                "branch": params.branch,
-                "force": params.force,
-                "set_upstream": params.set_upstream
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Pulls changes.
-    pub async fn pull<P: AsRef<Path>>(
-        path: P,
-        params: GitPullParams,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["pull"];
-
-        if params.rebase.unwrap_or(false) {
-            args.push("--rebase");
-        }
-
-        if let Some(remote) = &params.remote {
-            args.push(remote);
-        }
-
-        if let Some(branch) = &params.branch {
-            args.push(branch);
-        }
-
-        let output = timeout(
-            Duration::from_secs(30),
-            execute_git_command(&repo_path, &args),
-        )
-        .await
-        .map_err(|_| GitError::NetworkError("Pull operation timed out".to_string()))??;
-
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "remote": params.remote,
-                "branch": params.branch,
-                "rebase": params.rebase
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Checks out a branch.
-    pub async fn checkout_branch<P: AsRef<Path>>(
-        path: P,
-        branch_name: &str,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let args = vec!["checkout", branch_name];
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "branch": branch_name
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Creates a branch.
-    pub async fn create_branch<P: AsRef<Path>>(
-        path: P,
-        branch_name: &str,
-        start_point: Option<&str>,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["checkout", "-b", branch_name];
-        let effective_start_point = start_point.filter(|s| !s.trim().is_empty());
-        if let Some(start) = effective_start_point {
-            args.push(start);
-        }
-
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "branch": branch_name,
-                "start_point": effective_start_point
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Deletes a branch.
-    pub async fn delete_branch<P: AsRef<Path>>(
-        path: P,
-        branch_name: &str,
-        force: bool,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let flag = if force { "-D" } else { "-d" };
-        let args = vec!["branch", flag, branch_name];
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "branch": branch_name,
-                "force": force
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Resets to a specific commit.
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    /// - `commit_hash`: Target commit hash
-    /// - `mode`: Reset mode (`soft`, `mixed`, `hard`)
-    pub async fn reset_to_commit<P: AsRef<Path>>(
-        path: P,
-        commit_hash: &str,
-        mode: &str,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mode_flag = match mode {
-            "soft" => "--soft",
-            "mixed" => "--mixed",
-            "hard" => "--hard",
-            _ => {
-                return Err(GitError::CommandFailed(format!(
-                    "Invalid reset mode: {}",
-                    mode
-                )))
-            }
-        };
-
-        let args = vec!["reset", mode_flag, commit_hash];
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "commit": commit_hash,
-                "mode": mode
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Gets the diff.
-    pub async fn get_diff<P: AsRef<Path>>(
-        path: P,
-        params: &GitDiffParams,
-    ) -> Result<String, GitError> {
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["diff"];
-        let range;
-
-        if params.staged.unwrap_or(false) {
-            args.push("--cached");
-        }
-
-        match (&params.source, &params.target) {
-            (Some(src), Some(tgt)) => {
-                range = format!("{}..{}", src, tgt);
-                args.push(&range);
-            }
-            (Some(src), None) => {
-                args.push(src);
-            }
-            (None, None) => {}
-            _ => {}
-        }
-
-        if params.stat.unwrap_or(false) {
-            args.push("--stat");
-        }
-
-        if let Some(files) = &params.files {
-            args.push("--");
-            for file in files {
-                args.push(file);
-            }
-        }
-
-        execute_git_command(&repo_path, &args).await
-    }
-
-    /// Gets changed files using `git diff --name-status`.
-    pub async fn get_changed_files<P: AsRef<Path>>(
-        path: P,
-        params: &GitChangedFilesParams,
-    ) -> Result<Vec<GitChangedFile>, GitError> {
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["diff", "--name-status"];
-        let range;
-
-        if params.staged.unwrap_or(false) {
-            args.push("--cached");
-        }
-
-        match (&params.source, &params.target) {
-            (Some(src), Some(tgt)) => {
-                range = format!("{}..{}", src, tgt);
-                args.push(&range);
-            }
-            (Some(src), None) => {
-                args.push(src);
-            }
-            (None, Some(tgt)) => {
-                args.push(tgt);
-            }
-            (None, None) => {}
-        }
-
-        let output = execute_git_command(&repo_path, &args).await?;
-        Ok(parse_name_status_output(&output))
-    }
-
-    /// Gets file content.
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    /// - `file_path`: File relative path
-    /// - `commit`: Commit reference (optional, defaults to `HEAD`)
-    ///
-    /// # Returns
-    /// - File content string
-    pub async fn get_file_content<P: AsRef<Path>>(
-        path: P,
-        file_path: &str,
-        commit: Option<&str>,
-    ) -> Result<String, GitError> {
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let commit_ref = commit.unwrap_or("HEAD");
-        let object_spec = format!("{}:{}", commit_ref, file_path);
-
-        let args = vec!["show", &object_spec];
-
-        execute_git_command(&repo_path, &args).await
-    }
-
-    /// Resets file changes (discarding working tree changes).
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    /// - `files`: List of file paths
-    /// - `staged`: Whether to reset the index (`true`: reset staged, `false`: restore worktree)
-    ///
-    /// # Returns
-    /// - Operation result
-    pub async fn reset_files<P: AsRef<Path>>(
-        path: P,
-        files: &[String],
-        staged: bool,
-    ) -> Result<String, GitError> {
-        let repo_path = path.as_ref().to_string_lossy();
-
-        if staged {
-            let mut args = vec!["restore", "--staged"];
-            for file in files {
-                args.push(file);
-            }
-            execute_git_command(&repo_path, &args).await
-        } else {
-            let mut args = vec!["restore"];
-            for file in files {
-                args.push(file);
-            }
-            execute_git_command(&repo_path, &args).await
-        }
-    }
-
-    /// Gets ahead/behind counts.
-    fn get_ahead_behind_count(
-        repo: &Repository,
-        branch_name: &str,
-    ) -> Result<(i32, i32), GitError> {
-        let local_branch = repo
-            .find_branch(branch_name, BranchType::Local)
-            .map_err(|e| GitError::BranchNotFound(e.to_string()))?;
-
-        if let Ok(upstream) = local_branch.upstream() {
-            let local_oid = local_branch.get().target().ok_or_else(|| {
-                GitError::CommandFailed("Failed to get local branch target".to_string())
-            })?;
-            let upstream_oid = upstream.get().target().ok_or_else(|| {
-                GitError::CommandFailed("Failed to get upstream branch target".to_string())
-            })?;
-
-            let (ahead, behind) = repo
-                .graph_ahead_behind(local_oid, upstream_oid)
-                .map_err(|e| GitError::CommandFailed(e.to_string()))?;
-
-            Ok((ahead as i32, behind as i32))
-        } else {
-            Ok((0, 0))
-        }
-    }
-
-    /// Gets commit statistics.
-    fn get_commit_stats(_repo: &Repository, _commit: &Commit) -> Result<CommitStats, GitError> {
-        Ok((None, None, None))
-    }
-
-    /// Gets Git commit graph data.
-    pub async fn get_git_graph<P: AsRef<Path>>(
-        path: P,
-        max_count: Option<u32>,
-    ) -> Result {
-        let repo =
-            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-        super::graph::build_git_graph(&repo, max_count)
-            .map_err(|e| GitError::CommandFailed(e.to_string()))
-    }
-
-    /// Gets Git commit graph data for a specific branch.
-    pub async fn get_git_graph_for_branch<P: AsRef<Path>>(
-        path: P,
-        max_count: Option<u32>,
-        branch_name: Option<&str>,
-    ) -> Result {
-        let repo =
-            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-        super::graph::build_git_graph_for_branch(&repo, max_count, branch_name)
-            .map_err(|e| GitError::CommandFailed(e.to_string()))
-    }
-
-    /// Cherry-picks a commit onto the current branch.
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    /// - `commit_hash`: Commit hash to cherry-pick
-    /// - `no_commit`: Apply changes without committing automatically (default `false`)
-    ///
-    /// # Returns
-    /// - Operation result
-    pub async fn cherry_pick<P: AsRef<Path>>(
-        path: P,
-        commit_hash: &str,
-        no_commit: bool,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["cherry-pick"];
-
-        if no_commit {
-            args.push("-n");
-        }
-
-        args.push(commit_hash);
-
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "commit": commit_hash,
-                "no_commit": no_commit
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Aborts the cherry-pick operation.
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    ///
-    /// # Returns
-    /// - Operation result
-    pub async fn cherry_pick_abort<P: AsRef<Path>>(
-        path: P,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let args = vec!["cherry-pick", "--abort"];
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: None,
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Continues the cherry-pick operation (after resolving conflicts).
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    ///
-    /// # Returns
-    /// - Operation result
-    pub async fn cherry_pick_continue<P: AsRef<Path>>(
-        path: P,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let args = vec!["cherry-pick", "--continue"];
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: None,
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-
-    /// Lists all worktrees.
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    ///
-    /// # Returns
-    /// - Worktree list
-    pub async fn list_worktrees<P: AsRef<Path>>(
-        path: P,
-    ) -> Result<Vec<super::GitWorktreeInfo>, GitError> {
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let args = vec!["worktree", "list", "--porcelain"];
-        let output = execute_git_command(&repo_path, &args).await?;
-
-        Self::parse_worktree_list(&output)
-    }
-
-    /// Parses `git worktree list --porcelain` output.
-    fn parse_worktree_list(output: &str) -> Result<Vec<super::GitWorktreeInfo>, GitError> {
-        let mut worktrees = Vec::new();
-        let mut current_worktree: Option<super::GitWorktreeInfo> = None;
-
-        for line in output.lines() {
-            if line.starts_with("worktree ") {
-                if let Some(wt) = current_worktree.take() {
-                    worktrees.push(wt);
-                }
-                let path = line.strip_prefix("worktree ").unwrap_or("").to_string();
-                current_worktree = Some(super::GitWorktreeInfo {
-                    path,
-                    branch: None,
-                    head: String::new(),
-                    is_main: false,
-                    is_locked: false,
-                    is_prunable: false,
-                });
-            } else if let Some(ref mut wt) = current_worktree {
-                if line.starts_with("HEAD ") {
-                    wt.head = line.strip_prefix("HEAD ").unwrap_or("").to_string();
-                } else if line.starts_with("branch ") {
-                    let branch_ref = line.strip_prefix("branch ").unwrap_or("");
-                    let branch_name = branch_ref
-                        .strip_prefix("refs/heads/")
-                        .unwrap_or(branch_ref)
-                        .to_string();
-                    wt.branch = Some(branch_name);
-                } else if line == "bare" {
-                    wt.is_main = true;
-                } else if line == "locked" {
-                    wt.is_locked = true;
-                } else if line == "prunable" {
-                    wt.is_prunable = true;
-                }
-            }
-        }
-
-        if let Some(wt) = current_worktree {
-            worktrees.push(wt);
-        }
-
-        if let Some(first) = worktrees.first_mut() {
-            if !first.is_main {
-                first.is_main = true;
-            }
-        }
-
-        Ok(worktrees)
-    }
-
-    /// Adds a new worktree.
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    /// - `branch`: Branch name
-    /// - `create_branch`: Whether to create a new branch
-    ///
-    /// # Returns
-    /// - Newly created worktree information
-    pub async fn add_worktree<P: AsRef<Path>>(
-        path: P,
-        branch: &str,
-        create_branch: bool,
-    ) -> Result<super::GitWorktreeInfo, GitError> {
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let worktree_dir = path.as_ref().join(".worktrees");
-        let worktree_path = worktree_dir.join(branch);
-        let worktree_path_str = worktree_path.to_string_lossy().to_string();
-
-        if !worktree_dir.exists() {
-            std::fs::create_dir_all(&worktree_dir).map_err(GitError::IoError)?;
-        }
-
-        let args = if create_branch {
-            vec!["worktree", "add", "-b", branch, &worktree_path_str]
-        } else {
-            vec!["worktree", "add", &worktree_path_str, branch]
-        };
-
-        execute_git_command(&repo_path, &args).await?;
-
-        let worktrees = Self::list_worktrees(&path).await?;
-
-        let normalized_expected = worktree_path_str.replace("\\", "/");
-
-        worktrees
-            .into_iter()
-            .find(|wt| wt.path == normalized_expected)
-            .ok_or_else(|| {
-                GitError::CommandFailed("Failed to find newly created worktree".to_string())
-            })
-    }
-
-    /// Removes a worktree.
-    ///
-    /// # Parameters
-    /// - `path`: Repository path
-    /// - `worktree_path`: Worktree path to remove
-    /// - `force`: Whether to force removal
-    ///
-    /// # Returns
-    /// - Operation result
-    pub async fn remove_worktree<P: AsRef<Path>>(
-        path: P,
-        worktree_path: &str,
-        force: bool,
-    ) -> Result<GitOperationResult, GitError> {
-        let start_time = Instant::now();
-        let repo_path = path.as_ref().to_string_lossy();
-
-        let mut args = vec!["worktree", "remove"];
-        if force {
-            args.push("--force");
-        }
-        args.push(worktree_path);
-
-        let output = execute_git_command(&repo_path, &args).await?;
-        let duration = elapsed_ms_u64(start_time);
-
-        Ok(GitOperationResult {
-            success: true,
-            data: Some(serde_json::json!({
-                "worktree_path": worktree_path,
-                "force": force
-            })),
-            error: None,
-            output: Some(output),
-            duration: Some(duration),
-        })
-    }
-}
-
-#[cfg(test)]
-mod tests {
-    use super::*;
-
-    #[test]
-    fn parses_name_status_output_for_common_statuses() {
-        let files = parse_name_status_output(
-            "M\tsrc/main.rs\nA\tsrc/new.rs\nD\tsrc/old.rs\nR100\tsrc/old_name.rs\tsrc/new_name.rs\nC087\tsrc/source.rs\tsrc/copy.rs\n",
-        );
-
-        assert_eq!(
-            files,
-            vec![
-                GitChangedFile {
-                    path: "src/main.rs".to_string(),
-                    old_path: None,
-                    status: GitChangedFileStatus::Modified,
-                },
-                GitChangedFile {
-                    path: "src/new.rs".to_string(),
-                    old_path: None,
-                    status: GitChangedFileStatus::Added,
-                },
-                GitChangedFile {
-                    path: "src/old.rs".to_string(),
-                    old_path: None,
-                    status: GitChangedFileStatus::Deleted,
-                },
-                GitChangedFile {
-                    path: "src/new_name.rs".to_string(),
-                    old_path: Some("src/old_name.rs".to_string()),
-                    status: GitChangedFileStatus::Renamed,
-                },
-                GitChangedFile {
-                    path: "src/copy.rs".to_string(),
-                    old_path: Some("src/source.rs".to_string()),
-                    status: GitChangedFileStatus::Copied,
-                },
-            ],
-        );
-    }
-}
+pub use bitfun_services_integrations::git::GitService;
diff --git a/src/crates/core/src/service/git/git_types.rs b/src/crates/core/src/service/git/git_types.rs
index 7d2786ce3..4e441e371 100644
--- a/src/crates/core/src/service/git/git_types.rs
+++ b/src/crates/core/src/service/git/git_types.rs
@@ -1,283 +1,2 @@
-/**
- * Git-related type definitions
- */
-use serde::{Deserialize, Serialize};
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitRepository {
-    pub path: String,
-    pub name: String,
-    pub current_branch: String,
-    pub is_bare: bool,
-    pub has_changes: bool,
-    pub remotes: Vec<String>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitStatus {
-    pub staged: Vec<GitFileStatus>,
-    pub unstaged: Vec<GitFileStatus>,
-    pub untracked: Vec<String>,
-    pub current_branch: String,
-    pub ahead: i32,
-    pub behind: i32,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitFileStatus {
-    pub path: String,
-    pub status: String,
-    pub index_status: Option<String>,
-    pub workdir_status: Option<String>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitBranch {
-    pub name: String,
-    pub current: bool,
-    pub remote: bool,
-    pub upstream: Option<String>,
-    pub ahead: i32,
-    pub behind: i32,
-    pub last_commit: Option<String>,
-    pub last_commit_date: Option<String>,
-
-    pub base_branch: Option<String>,
-    pub child_branches: Option<Vec<String>>,
-    pub merged_branches: Option<Vec<String>>,
-
-    pub branch_type: Option<String>,
-    pub has_conflicts: Option<bool>,
-    pub can_merge: Option<bool>,
-    pub is_stale: Option<bool>,
-    pub merge_status: Option<String>,
-
-    pub stats: Option<GitBranchStats>,
-    pub created_at: Option<String>,
-    pub last_activity_at: Option<String>,
-
-    pub tags: Option<Vec<String>>,
-    pub description: Option<String>,
-    pub linked_issues: Option<Vec<String>>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitBranchStats {
-    pub commit_count: i32,
-    pub contributor_count: i32,
-    pub file_changes: i32,
-    pub lines_changed: GitLinesChanged,
-    pub activity_score: i32,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitLinesChanged {
-    pub additions: i32,
-    pub deletions: i32,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitCommit {
-    pub hash: String,
-    pub short_hash: String,
-    pub message: String,
-    pub author: String,
-    pub author_email: String,
-    pub date: String,
-    pub parents: Vec<String>,
-    pub additions: Option<i32>,
-    pub deletions: Option<i32>,
-    pub files_changed: Option<i32>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct GitLogParams {
-    pub max_count: Option<usize>,
-    pub skip: Option<usize>,
-    pub author: Option<String>,
-    pub grep: Option<String>,
-    pub since: Option<String>,
-    pub until: Option<String>,
-    pub stat: Option<bool>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitAddParams {
-    pub files: Vec<String>,
-    pub all: Option<bool>,
-    pub update: Option<bool>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitCommitParams {
-    pub message: String,
-    pub amend: Option<bool>,
-    pub all: Option<bool>,
-    #[serde(rename = "noVerify")]
-    pub no_verify: Option<bool>,
-    pub author: Option<GitAuthor>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitAuthor {
-    pub name: String,
-    pub email: String,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct GitPushParams {
-    pub remote: Option<String>,
-    pub branch: Option<String>,
-    pub force: Option<bool>,
-    pub set_upstream: Option<bool>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct GitPullParams {
-    pub remote: Option<String>,
-    pub branch: Option<String>,
-    pub rebase: Option<bool>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitMergeParams {
-    pub branch: String,
-    pub strategy: Option<String>,
-    pub message: Option<String>,
-    pub no_ff: Option<bool>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitStashParams {
-    pub message: Option<String>,
-    pub include_untracked: Option<bool>,
-    pub keep_index: Option<bool>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct GitDiffParams {
-    pub source: Option<String>,
-    pub target: Option<String>,
-    pub files: Option<Vec<String>>,
-    pub staged: Option<bool>,
-    pub stat: Option<bool>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-pub struct GitChangedFilesParams {
-    pub source: Option<String>,
-    pub target: Option<String>,
-    pub staged: Option<bool>,
-}
-
-#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
-#[serde(rename_all = "snake_case")]
-pub enum GitChangedFileStatus {
-    Added,
-    Modified,
-    Deleted,
-    Renamed,
-    Copied,
-    Unknown,
-}
-
-#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
-pub struct GitChangedFile {
-    pub path: String,
-    pub old_path: Option<String>,
-    pub status: GitChangedFileStatus,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitOperationResult {
-    pub success: bool,
-    pub data: Option<serde_json::Value>,
-    pub error: Option<String>,
-    pub output: Option<String>,
-    pub duration: Option<u64>,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitDiffResult {
-    pub files: Vec<GitDiffFile>,
-    pub total_additions: i32,
-    pub total_deletions: i32,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitDiffFile {
-    pub path: String,
-    pub old_path: Option<String>,
-    pub status: String,
-    pub additions: i32,
-    pub deletions: i32,
-    pub diff: String,
-}
-
-#[derive(Debug, Clone, Serialize, Deserialize)]
-pub struct GitStash {
-    pub index: i32,
-    pub message: String,
-    pub branch: String,
-    pub date: String,
-    pub hash: String,
-}
-
-/// Git worktree information
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct GitWorktreeInfo {
-    /// Worktree path
-    pub path: String,
-    /// Associated branch name
-    pub branch: Option<String>,
-    /// HEAD commit hash
-    pub head: String,
-    /// Whether this is the main worktree (the main directory of a bare repository)
-    pub is_main: bool,
-    /// Whether the worktree is locked
-    pub is_locked: bool,
-    /// Whether the worktree is prunable
-    pub is_prunable: bool,
-}
-
-#[derive(Debug, thiserror::Error)]
-pub enum GitError {
-    #[error("Repository not found: {0}")]
-    RepositoryNotFound(String),
-
-    #[error("Git command failed: {0}")]
-    CommandFailed(String),
-
-    #[error("Invalid repository path: {0}")]
-    InvalidPath(String),
-
-    #[error("Branch not found: {0}")]
-    BranchNotFound(String),
-
-    #[error("Merge conflict: {0}")]
-    MergeConflict(String),
-
-    #[error("Authentication failed: {0}")]
-    AuthenticationFailed(String),
-
-    #[error("Network error: {0}")]
-    NetworkError(String),
-
-    #[error("Parse error: {0}")]
-    ParseError(String),
-
-    #[error("IO error: {0}")]
-    IoError(#[from] std::io::Error),
-
-    #[error("Git2 error: {0}")]
-    Git2Error(#[from] git2::Error),
-}
-
-/// Raw result of executing a git command, preserving exit code and both streams.
-#[derive(Debug, Clone)]
-pub struct GitCommandOutput {
-    pub stdout: String,
-    pub stderr: String,
-    pub exit_code: i32,
-}
+pub use bitfun_services_integrations::git::types::*;
+pub use bitfun_services_integrations::git::GitError;
diff --git a/src/crates/core/src/service/git/git_utils.rs b/src/crates/core/src/service/git/git_utils.rs
index 963e02052..1d070cb6d 100644
--- a/src/crates/core/src/service/git/git_utils.rs
+++ b/src/crates/core/src/service/git/git_utils.rs
@@ -1,322 +1,6 @@
-/**
- * Git utility functions
- */
-use super::git_types::{GitCommandOutput, GitError, GitFileStatus};
-use git2::{Repository, Status, StatusOptions};
-use std::path::Path;
-
-/// Returns whether the given path is a Git repository.
-pub fn is_git_repository<P: AsRef<Path>>(path: P) -> bool {
-    Repository::open(path).is_ok()
-}
-
-/// Returns the repository root directory.
-pub fn get_repository_root<P: AsRef<Path>>(path: P) -> Result<String, GitError> {
-    let repo =
-        Repository::discover(path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
-
-    let workdir = repo
-        .workdir()
-        .ok_or_else(|| GitError::InvalidPath("Repository has no working directory".to_string()))?;
-
-    Ok(workdir.to_string_lossy().to_string())
-}
-
-/// Returns the current branch name.
-pub fn get_current_branch(repo: &Repository) -> Result<String, GitError> {
-    match repo.head() {
-        Ok(head) => {
-            if let Some(branch_name) = head.shorthand() {
-                Ok(branch_name.to_string())
-            } else {
-                Ok("HEAD".to_string())
-            }
-        }
-        Err(e) => {
-            if e.code() == git2::ErrorCode::UnbornBranch {
-                if let Ok(config) = repo.config() {
-                    if let Ok(default_branch) = config.get_string("init.defaultBranch") {
-                        return Ok(default_branch);
-                    }
-                }
-                Ok("master".to_string())
-            } else {
-                Err(GitError::CommandFailed(format!(
-                    "Failed to get HEAD: {}",
-                    e
-                )))
-            }
-        }
-    }
-}
-
-/// Converts Git status flags to a short string.
-pub fn status_to_string(status: Status) -> String {
-    let mut result = Vec::new();
-
-    if status.contains(Status::INDEX_NEW) {
-        result.push("A");
-    }
-    if status.contains(Status::INDEX_MODIFIED) {
-        result.push("M");
-    }
-    if status.contains(Status::INDEX_DELETED) {
-        result.push("D");
-    }
-    if status.contains(Status::INDEX_RENAMED) {
-        result.push("R");
-    }
-    if status.contains(Status::INDEX_TYPECHANGE) {
-        result.push("T");
-    }
-
-    if status.contains(Status::WT_NEW) {
-        result.push("?");
-    }
-    if status.contains(Status::WT_MODIFIED) {
-        result.push("M");
-    }
-    if status.contains(Status::WT_DELETED) {
-        result.push("D");
-    }
-    if status.contains(Status::WT_RENAMED) {
-        result.push("R");
-    }
-    if status.contains(Status::WT_TYPECHANGE) {
-        result.push("T");
-    }
-
-    if result.is_empty() {
-        "U".to_string()
-    } else {
-        result.join("")
-    }
-}
-
-/// Maximum number of untracked entries before we stop recursing into untracked
-/// directories. When the non-recursive scan already reports many untracked
-/// top-level entries, recursing would return thousands of paths that bloat IPC
-/// payloads and DOM rendering, causing severe UI lag.
-const UNTRACKED_RECURSE_THRESHOLD: usize = 200;
-
-/// Collects file statuses from a `StatusOptions` scan.
-fn collect_statuses(
-    repo: &Repository,
-    recurse_untracked: bool,
-) -> Result<Vec<GitFileStatus>, GitError> {
-    let mut status_options = StatusOptions::new();
-    status_options.include_untracked(true);
-    status_options.include_ignored(false);
-    status_options.recurse_untracked_dirs(recurse_untracked);
-
-    let statuses = repo
-        .statuses(Some(&mut status_options))
-        .map_err(|e| GitError::CommandFailed(format!("Failed to get statuses: {}", e)))?;
-
-    let mut result = Vec::new();
-
-    for entry in statuses.iter() {
-        if let Some(path) = entry.path() {
-            let status = entry.status();
-            let status_str = status_to_string(status);
-
-            let index_status = if status.intersects(
-                Status::INDEX_NEW
-                    | Status::INDEX_MODIFIED
-                    | Status::INDEX_DELETED
-                    | Status::INDEX_RENAMED
-                    | Status::INDEX_TYPECHANGE,
-            ) {
-                Some(status_to_string(
-                    status
-                        & (Status::INDEX_NEW
-                            | Status::INDEX_MODIFIED
-                            | Status::INDEX_DELETED
-                            | Status::INDEX_RENAMED
-                            | Status::INDEX_TYPECHANGE),
-                ))
-            } else {
-                None
-            };
-
-            let workdir_status = if status.intersects(
-                Status::WT_NEW
-                    | Status::WT_MODIFIED
-                    | Status::WT_DELETED
-                    | Status::WT_RENAMED
-                    | Status::WT_TYPECHANGE,
-            ) {
-                Some(status_to_string(
-                    status
-                        & (Status::WT_NEW
-                            | Status::WT_MODIFIED
-                            | Status::WT_DELETED
-                            | Status::WT_RENAMED
-                            | Status::WT_TYPECHANGE),
-                ))
-            } else {
-                None
-            };
-
-            result.push(GitFileStatus {
-                path: path.to_string(),
-                status: status_str,
-                index_status,
-                workdir_status,
-            });
-        }
-    }
-
-    Ok(result)
-}
-
-/// Returns file statuses.
-///
-/// Uses a two-pass strategy to avoid expensive recursive scans when the
-/// repository contains many untracked files (e.g. missing .gitignore for
-/// build artifacts). First a non-recursive pass counts top-level untracked
-/// entries; only when that count is within `UNTRACKED_RECURSE_THRESHOLD` does
-/// a second recursive pass run.
-pub fn get_file_statuses(repo: &Repository) -> Result<Vec<GitFileStatus>, GitError> {
-    // Pass 1: fast non-recursive scan.
-    let shallow = collect_statuses(repo, false)?;
-
-    let untracked_count = shallow.iter().filter(|f| f.status.contains('?')).count();
-
-    if untracked_count <= UNTRACKED_RECURSE_THRESHOLD {
-        // Few untracked entries – safe to recurse for full detail.
-        collect_statuses(repo, true)
-    } else {
-        // Too many untracked entries – return the shallow result as-is.
-        // Untracked directories appear as a single entry (folder name with
-        // trailing slash) which is sufficient for the UI.
-        Ok(shallow)
-    }
-}
-
-/// Executes a Git command and returns the raw output including exit code.
-///
-/// Git diff returns exit code 1 when there are differences (not an error).
-/// Callers that need to distinguish this case should inspect `exit_code`.
-pub async fn execute_git_command_raw(
-    repo_path: &str,
-    args: &[&str],
-) -> Result<GitCommandOutput, GitError> {
-    let output = crate::util::process_manager::create_tokio_command("git")
-        .current_dir(repo_path)
-        .args(args)
-        .output()
-        .await
-        .map_err(|e| GitError::CommandFailed(format!("Failed to execute git command: {}", e)))?;
-
-    Ok(GitCommandOutput {
-        stdout: String::from_utf8_lossy(&output.stdout).to_string(),
-        stderr: String::from_utf8_lossy(&output.stderr).to_string(),
-        exit_code: output.status.code().unwrap_or(-1),
-    })
-}
-
-/// Executes a Git command.
-///
-/// For most git commands, exit code 0 means success and anything else is an error.
-/// However, `git diff` returns exit code 1 when there are differences, which is
-/// not an error. Use [`execute_git_command_raw`] if you need to handle that case.
-pub async fn execute_git_command(repo_path: &str, args: &[&str]) -> Result<String, GitError> {
-    let result = execute_git_command_raw(repo_path, args).await?;
-
-    if result.exit_code == 0 {
-        Ok(result.stdout)
-    } else {
-        let error = if result.stderr.is_empty() {
-            result.stdout
-        } else {
-            result.stderr
-        };
-        Err(GitError::CommandFailed(error))
-    }
-}
-
-/// Executes a Git command synchronously and returns the raw output including exit code.
-pub fn execute_git_command_sync_raw(
-    repo_path: &str,
-    args: &[&str],
-) -> Result<GitCommandOutput, GitError> {
-    let output = crate::util::process_manager::create_command("git")
-        .current_dir(repo_path)
-        .args(args)
-        .output()
-        .map_err(|e| GitError::CommandFailed(format!("Failed to execute git command: {}", e)))?;
-
-    Ok(GitCommandOutput {
-        stdout: String::from_utf8_lossy(&output.stdout).to_string(),
-        stderr: String::from_utf8_lossy(&output.stderr).to_string(),
-        exit_code: output.status.code().unwrap_or(-1),
-    })
-}
-
-/// Executes a Git command synchronously.
-pub fn execute_git_command_sync(repo_path: &str, args: &[&str]) -> Result<String, GitError> {
-    let result = execute_git_command_sync_raw(repo_path, args)?;
-
-    if result.exit_code == 0 {
-        Ok(result.stdout)
-    } else {
-        let error = if result.stderr.is_empty() {
-            result.stdout
-        } else {
-            result.stderr
-        };
-        Err(GitError::CommandFailed(error))
-    }
-}
-
-/// Parses a Git log line.
-pub fn parse_git_log_line(line: &str) -> Option<(String, String, String, String, String)> {
-    let parts: Vec<&str> = line.splitn(5, '|').collect();
-    if parts.len() == 5 {
-        Some((
-            parts[0].to_string(),
-            parts[1].to_string(),
-            parts[2].to_string(),
-            parts[3].to_string(),
-            parts[4].to_string(),
-        ))
-    } else {
-        None
-    }
-}
-
-/// Parses a Git branch line.
-pub fn parse_branch_line(line: &str) -> Option<(String, bool)> {
-    let trimmed = line.trim();
-    if trimmed.is_empty() {
-        return None;
-    }
-
-    if let Some(stripped) = trimmed.strip_prefix("* ") {
-        Some((stripped.to_string(), true))
-    } else if let Some(stripped) = trimmed.strip_prefix("  ") {
-        Some((stripped.to_string(), false))
-    } else {
-        Some((trimmed.to_string(), false))
-    }
-}
-
-/// Formats a timestamp.
-pub fn format_timestamp(timestamp: i64) -> String {
-    use chrono::{TimeZone, Utc};
-
-    match Utc.timestamp_opt(timestamp, 0) {
-        chrono::LocalResult::Single(dt) => dt.format("%Y-%m-%d %H:%M:%S UTC").to_string(),
-        _ => "Invalid date".to_string(),
-    }
-}
-
-/// Checks whether Git is available.
-pub fn check_git_available() -> bool {
-    crate::util::process_manager::create_command("git")
-        .arg("--version")
-        .output()
-        .map(|output| output.status.success())
-        .unwrap_or(false)
-}
+pub use bitfun_services_integrations::git::{
+    build_git_changed_files_args, build_git_diff_args, check_git_available, execute_git_command,
+    execute_git_command_raw, execute_git_command_sync, execute_git_command_sync_raw,
+    format_timestamp, get_current_branch, get_file_statuses, get_repository_root,
+    is_git_repository, parse_branch_line, parse_git_log_line, status_to_string,
+};
diff --git a/src/crates/core/src/service/git/graph.rs b/src/crates/core/src/service/git/graph.rs
index 45bae413b..75ec74a6f 100644
--- a/src/crates/core/src/service/git/graph.rs
+++ b/src/crates/core/src/service/git/graph.rs
@@ -1,359 +1 @@
-use git2::{Commit, Oid, Repository, Sort};
-use serde::{Deserialize, Serialize};
-use std::collections::HashMap;
-
-/// Git graph node
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct GraphNode {
-    /// Commit hash
-    pub hash: String,
-    /// Commit message (first line)
-    pub message: String,
-    /// Full commit message
-    pub full_message: String,
-    /// Author name
-    pub author_name: String,
-    /// Author email
-    pub author_email: String,
-    /// Commit time (Unix timestamp)
-    pub timestamp: i64,
-    /// Parent commit hashes
-    pub parents: Vec<String>,
-    /// Child commit hashes (filled when building the graph)
-    pub children: Vec<String>,
-    /// Associated refs (branches, tags, etc.)
-    pub refs: Vec<GraphRef>,
-    /// Lane position
-    pub lane: i32,
-    /// Lanes that fork out
-    pub forking_lanes: Vec<i32>,
-    /// Lanes that merge in
-    pub merging_lanes: Vec<i32>,
-    /// Lanes that pass through
-    pub passing_lanes: Vec<i32>,
-}
-
-/// Git ref information
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct GraphRef {
-    /// Ref name
-    pub name: String,
-    /// Ref type: `branch`, `remote`, `tag`
-    pub ref_type: String,
-    /// Whether this is the current branch
-    pub is_current: bool,
-    /// Whether this is `HEAD`
-    pub is_head: bool,
-}
-
-/// Git graph data
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct GitGraph {
-    /// Node list
-    pub nodes: Vec<GraphNode>,
-    /// Maximum lane count
-    pub max_lane: i32,
-    /// Current branch name
-    pub current_branch: Option<String>,
-}
-
-/// Lane allocator
-struct LaneAllocator {
-    /// Active lanes: lane position -> commit hash
-    active_lanes: HashMap<i32, String>,
-    /// Free lane positions
-    free_positions: Vec<i32>,
-    /// Next available position
-    next_position: i32,
-    /// Lane length stats
-    lane_lengths: HashMap<i32, usize>,
-}
-
-impl LaneAllocator {
-    fn new() -> Self {
-        Self {
-            active_lanes: HashMap::new(),
-            free_positions: Vec::new(),
-            next_position: 0,
-            lane_lengths: HashMap::new(),
-        }
-    }
-
-    /// Allocates a new lane.
-    fn allocate(&mut self, commit_hash: String) -> i32 {
-        let position = if let Some(pos) = self.free_positions.pop() {
-            pos
-        } else {
-            let pos = self.next_position;
-            self.next_position += 1;
-            pos
-        };
-
-        self.active_lanes.insert(position, commit_hash);
-        self.lane_lengths.insert(position, 1);
-        position
-    }
-
-    /// Frees a lane.
-    fn free(&mut self, position: i32) {
-        self.active_lanes.remove(&position);
-        self.lane_lengths.remove(&position);
-        self.free_positions.push(position);
-        self.free_positions.sort_unstable();
-    }
-
-    /// Increments the lane length.
-    fn increment_length(&mut self, position: i32) {
-        if let Some(len) = self.lane_lengths.get_mut(&position) {
-            *len += 1;
-        }
-    }
-
-    /// Returns the lane length.
-    fn get_length(&self, position: i32) -> usize {
-        self.lane_lengths.get(&position).copied().unwrap_or(0)
-    }
-}
-
-/// Builds a Git graph.
-pub fn build_git_graph(
-    repo: &Repository,
-    max_count: Option<usize>,
-) -> Result<GitGraph, git2::Error> {
-    build_git_graph_for_branch(repo, max_count, None)
-}
-
-/// Builds a Git graph for a specific branch.
-pub fn build_git_graph_for_branch(
-    repo: &Repository,
-    max_count: Option<usize>,
-    branch_name: Option<&str>,
-) -> Result<GitGraph, git2::Error> {
-    let current_branch = get_current_branch(repo);
-
-    let refs_map = collect_refs(repo)?;
-
-    let mut revwalk = repo.revwalk()?;
-    revwalk.set_sorting(Sort::TOPOLOGICAL | Sort::TIME)?;
-
-    if let Some(branch) = branch_name {
-        if let Ok(reference) = repo.find_branch(branch, git2::BranchType::Local) {
-            if let Some(oid) = reference.get().target() {
-                revwalk.push(oid)?;
-            }
-        } else if let Ok(reference) = repo.find_branch(branch, git2::BranchType::Remote) {
-            if let Some(oid) = reference.get().target() {
-                revwalk.push(oid)?;
-            }
-        } else if let Ok(reference) = repo.find_reference(&format!("refs/heads/{}", branch)) {
-            if let Some(oid) = reference.target() {
-                revwalk.push(oid)?;
-            }
-        } else {
-            for reference in repo.references()? {
-                let reference = reference?;
-                if reference.is_branch() || reference.is_remote() || reference.is_tag() {
-                    if let Some(oid) = reference.target() {
-                        revwalk.push(oid)?;
-                    }
-                }
-            }
-        }
-    } else {
-        for reference in repo.references()? {
-            let reference = reference?;
-            if reference.is_branch() || reference.is_remote() || reference.is_tag() {
-                if let Some(oid) = reference.target() {
-                    revwalk.push(oid)?;
-                }
-            }
-        }
-    }
-
-    let mut commits: Vec<(Oid, Commit)> = Vec::new();
-    let max_count = max_count.unwrap_or(1000);
-
-    for oid_result in revwalk.take(max_count) {
-        let oid = oid_result?;
-        if let Ok(commit) = repo.find_commit(oid) {
-            commits.push((oid, commit));
-        }
-    }
-
-    let mut children_map: HashMap<String, Vec<String>> = HashMap::new();
-    for (oid, commit) in &commits {
-        let hash = oid.to_string();
-        for parent_id in commit.parent_ids() {
-            let parent_hash = parent_id.to_string();
-            children_map
-                .entry(parent_hash)
-                .or_default()
-                .push(hash.clone());
-        }
-    }
-
-    let mut nodes: Vec<GraphNode> = Vec::new();
-    for (oid, commit) in commits {
-        let hash = oid.to_string();
-        let message = commit.summary().unwrap_or("").to_string();
-        let full_message = commit.message().unwrap_or("").to_string();
-        let author = commit.author();
-
-        let node = GraphNode {
-            hash: hash.clone(),
-            message,
-            full_message,
-            author_name: author.name().unwrap_or("Unknown").to_string(),
-            author_email: author.email().unwrap_or("").to_string(),
-            timestamp: author.when().seconds(),
-            parents: commit.parent_ids().map(|id| id.to_string()).collect(),
-            children: children_map.get(&hash).cloned().unwrap_or_default(),
-            refs: refs_map.get(&hash).cloned().unwrap_or_default(),
-            lane: -1,
-            forking_lanes: Vec::new(),
-            merging_lanes: Vec::new(),
-            passing_lanes: Vec::new(),
-        };
-
-        nodes.push(node);
-    }
-
-    let max_lane = allocate_lanes(&mut nodes);
-
-    Ok(GitGraph {
-        nodes,
-        max_lane,
-        current_branch,
-    })
-}
-
-/// Collects all refs.
-fn collect_refs(repo: &Repository) -> Result<HashMap<String, Vec<GraphRef>>, git2::Error> {
-    let mut refs_map: HashMap<String, Vec<GraphRef>> = HashMap::new();
-    let head = repo.head().ok();
-    let current_branch = get_current_branch(repo);
-
-    for reference in repo.references()? {
-        let reference = reference?;
-
-        let (ref_type, name) = if reference.is_branch() {
-            ("branch", reference.shorthand().unwrap_or(""))
-        } else if reference.is_remote() {
-            ("remote", reference.shorthand().unwrap_or(""))
-        } else if reference.is_tag() {
-            ("tag", reference.shorthand().unwrap_or(""))
-        } else {
-            continue;
-        };
-
-        if let Some(oid) = reference.target() {
-            let hash = oid.to_string();
-            let is_current = current_branch.as_ref().is_some_and(|cb| cb == name);
-            let is_head = head.as_ref().and_then(|h| h.target()) == Some(oid);
-
-            let graph_ref = GraphRef {
-                name: name.to_string(),
-                ref_type: ref_type.to_string(),
-                is_current,
-                is_head,
-            };
-
-            refs_map.entry(hash).or_default().push(graph_ref);
-        }
-    }
-
-    Ok(refs_map)
-}
-
-/// Returns the current branch name.
-fn get_current_branch(repo: &Repository) -> Option<String> {
-    repo.head()
-        .ok()
-        .and_then(|head| head.shorthand().map(|s| s.to_string()))
-}
-
-/// Allocates lanes (simplified algorithm).
-fn allocate_lanes(nodes: &mut [GraphNode]) -> i32 {
-    if nodes.is_empty() {
-        return 0;
-    }
-
-    let mut allocator = LaneAllocator::new();
-    let mut commit_lanes: HashMap<String, i32> = HashMap::new();
-
-    for node in nodes.iter_mut() {
-        let hash = node.hash.clone();
-
-        let lane = if node.children.is_empty() {
-            allocator.allocate(hash.clone())
-        } else if node.children.len() == 1 {
-            let child_hash = &node.children[0];
-            if let Some(&child_lane) = commit_lanes.get(child_hash) {
-                allocator.increment_length(child_lane);
-                child_lane
-            } else {
-                allocator.allocate(hash.clone())
-            }
-        } else {
-            let mut best_lane = -1;
-            let mut best_length = 0;
-
-            for child_hash in &node.children {
-                if let Some(&child_lane) = commit_lanes.get(child_hash) {
-                    let length = allocator.get_length(child_lane);
-                    if length > best_length {
-                        best_length = length;
-                        best_lane = child_lane;
-                    }
-                }
-            }
-
-            if best_lane >= 0 {
-                allocator.increment_length(best_lane);
-                best_lane
-            } else {
-                allocator.allocate(hash.clone())
-            }
-        };
-
-        node.lane = lane;
-        commit_lanes.insert(hash.clone(), lane);
-
-        for child_hash in &node.children {
-            if let Some(&child_lane) = commit_lanes.get(child_hash) {
-                if child_lane != lane {
-                    node.forking_lanes.push(child_lane);
-                }
-            }
-        }
-
-        for (i, parent_hash) in node.parents.iter().enumerate() {
-            if i > 0 {
-                if let Some(&parent_lane) = commit_lanes.get(parent_hash) {
-                    if parent_lane != lane {
-                        node.merging_lanes.push(parent_lane);
-                    }
-                }
-            }
-        }
-
-        let active_lanes: Vec<i32> = allocator.active_lanes.keys().copied().collect();
-        for &active_lane in &active_lanes {
-            if active_lane != lane
-                && !node.forking_lanes.contains(&active_lane)
-                && !node.merging_lanes.contains(&active_lane)
-            {
-                node.passing_lanes.push(active_lane);
-            }
-        }
-
-        if node.parents.is_empty() {
-            allocator.free(lane);
-        }
-    }
-
-    allocator.next_position
-}
+pub use bitfun_services_integrations::git::{build_git_graph, build_git_graph_for_branch};
diff --git a/src/crates/core/src/service/mcp/config/location.rs b/src/crates/core/src/service/mcp/config/location.rs
index 79bf91d65..109c2a8ce 100644
--- a/src/crates/core/src/service/mcp/config/location.rs
+++ b/src/crates/core/src/service/mcp/config/location.rs
@@ -1,10 +1 @@
-use serde::{Deserialize, Serialize};
-
-/// Configuration location.
-#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
-#[serde(rename_all = "kebab-case")]
-pub enum ConfigLocation {
-    BuiltIn, // Built-in configuration
-    User,    // User-level configuration
-    Project, // Project-level configuration
-}
+pub use bitfun_services_integrations::mcp::config::ConfigLocation;
diff --git a/src/crates/core/src/service/mcp/protocol/types.rs b/src/crates/core/src/service/mcp/protocol/types.rs
index 9ec34d125..810b8eef2 100644
--- a/src/crates/core/src/service/mcp/protocol/types.rs
+++ b/src/crates/core/src/service/mcp/protocol/types.rs
@@ -1,710 +1 @@
-//! MCP protocol type definitions
-//!
-//! Core data structures that follow the Model Context Protocol specification.
-
-use serde::{Deserialize, Serialize};
-use serde_json::Value;
-use std::collections::HashMap;
-
-/// MCP protocol version (string format, follows the MCP spec).
-///
-/// Aligned with VSCode: "2025-11-25"
-/// Reference: https://spec.modelcontextprotocol.io/
-pub type MCPProtocolVersion = String;
-
-/// Returns the default MCP protocol version.
-pub fn default_protocol_version() -> MCPProtocolVersion {
-    "2025-11-25".to_string()
-}
-
-/// MCP resources capability.
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct ResourcesCapability {
-    #[serde(default)]
-    pub subscribe: bool,
-    #[serde(default)]
-    pub list_changed: bool,
-}
-
-/// MCP prompts capability.
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct PromptsCapability {
-    #[serde(default)]
-    pub list_changed: bool,
-}
-
-/// MCP tools capability.
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct ToolsCapability {
-    #[serde(default)]
-    pub list_changed: bool,
-}
-
-/// MCP capability declaration (follows the latest MCP spec).
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPCapability {
-    #[serde(default, skip_serializing_if = "Option::is_none")]
-    pub resources: Option<ResourcesCapability>,
-    #[serde(default, skip_serializing_if = "Option::is_none")]
-    pub prompts: Option<PromptsCapability>,
-    #[serde(default, skip_serializing_if = "Option::is_none")]
-    pub tools: Option<ToolsCapability>,
-    #[serde(default, skip_serializing_if = "Option::is_none")]
-    pub logging: Option<Value>,
-}
-
-impl Default for MCPCapability {
-    fn default() -> Self {
-        Self {
-            resources: Some(ResourcesCapability::default()),
-            prompts: Some(PromptsCapability::default()),
-            tools: Some(ToolsCapability::default()),
-            logging: None,
-        }
-    }
-}
-
-/// MCP server info.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPServerInfo {
-    pub name: String,
-    pub version: String,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub description: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub vendor: Option<String>,
-}
-
-/// Icon for display in UIs (2025-11-25 spec). sizes may be string or string[] for compatibility.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPResourceIcon {
-    pub src: String,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub mime_type: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub sizes: Option<Value>, // string or ["48x48"] per spec
-}
-
-/// Annotations for resources/templates (2025-11-25 spec).
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPAnnotations {
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub audience: Option<Vec<String>>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub priority: Option<f64>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub last_modified: Option<String>,
-}
-
-/// MCP resource definition (2025-11-25 spec).
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPResource {
-    pub uri: String,
-    pub name: String,
-    /// Human-readable title for display (2025-11-25).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub title: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub description: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub mime_type: Option<String>,
-    /// Icons for UI display (2025-11-25).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub icons: Option<Vec<MCPResourceIcon>>,
-    /// Size in bytes, if known (2025-11-25).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub size: Option<u64>,
-    /// Annotations: audience, priority, lastModified (2025-11-25).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub annotations: Option<MCPAnnotations>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub metadata: Option<HashMap<String, Value>>,
-}
-
-/// Content Security Policy configuration for MCP App UI (aligned with VSCode/MCP Apps spec).
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct McpUiResourceCsp {
-    /// Origins for network requests (fetch/XHR/WebSocket).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub connect_domains: Option<Vec<String>>,
-    /// Origins for static resources (scripts, images, styles, fonts).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub resource_domains: Option<Vec<String>>,
-    /// Origins for nested iframes (frame-src directive).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub frame_domains: Option<Vec<String>>,
-    /// Allowed base URIs for the document (base-uri directive).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub base_uri_domains: Option<Vec<String>>,
-}
-
-/// Sandbox permissions requested by the UI resource (aligned with VSCode/MCP Apps spec).
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct McpUiResourcePermissions {
-    /// Request camera access.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub camera: Option<bool>,
-    /// Request microphone access.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub microphone: Option<bool>,
-    /// Request geolocation access.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub geolocation: Option<bool>,
-    /// Request clipboard write access.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub clipboard_write: Option<bool>,
-}
-
-/// UI metadata within _meta (MCP Apps spec: _meta.ui.csp, _meta.ui.permissions).
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct McpUiMeta {
-    /// Content Security Policy configuration.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub csp: Option<McpUiResourceCsp>,
-    /// Sandbox permissions.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub permissions: Option<McpUiResourcePermissions>,
-}
-
-/// Resource content _meta field (MCP Apps spec).
-#[derive(Debug, Clone, Serialize, Deserialize, Default)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPResourceContentMeta {
-    /// UI metadata containing CSP and permissions.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub ui: Option<McpUiMeta>,
-}
-
-/// MCP resource content.
-/// MCP spec uses `text` for text content and `blob` for base64 binary; both are optional but at least one must be present.
-/// Serialization uses `text` per spec; we accept both `text` and `content` when deserializing for compatibility.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPResourceContent {
-    pub uri: String,
-    /// Text or HTML content. Serialized as `text` per MCP spec; accepts `text` or `content` when deserializing.
-    #[serde(
-        default,
-        alias = "text",
-        rename = "text",
-        skip_serializing_if = "Option::is_none"
-    )]
-    pub content: Option<String>,
-    /// Base64-encoded binary content (MCP spec). Used for video, images, etc.
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub blob: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub mime_type: Option<String>,
-    /// Annotations for embedded resources (2025-11-25).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub annotations: Option<MCPAnnotations>,
-    /// Resource metadata (MCP Apps: contains ui.csp and ui.permissions).
-    #[serde(skip_serializing_if = "Option::is_none", rename = "_meta")]
-    pub meta: Option<MCPResourceContentMeta>,
-}
-
-/// MCP prompt definition (2025-11-25 spec).
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPPrompt {
-    pub name: String,
-    /// Human-readable title for display (2025-11-25).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub title: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub description: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub arguments: Option<Vec<MCPPromptArgument>>,
-    /// Icons for UI display (2025-11-25).
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub icons: Option<Vec<MCPResourceIcon>>,
-}
-
-/// MCP prompt argument.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPPromptArgument {
-    pub name: String,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub title: Option<String>,
-    #[serde(skip_serializing_if = "Option::is_none")]
-    pub description: Option<String>,
-    #[serde(default)]
-    pub required: bool,
-}
-
-/// MCP prompt content.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct MCPPromptContent {
-    pub name: String,
-    pub messages: Vec<MCPPromptMessage>,
-}
-
-/// Content block in prompt message (2025-11-25 spec). Deserializes from plain string (legacy) or structured block.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(untagged)]
-pub enum MCPPromptMessageContent {
-    /// Legacy: plain string content from older servers.
-    Plain(String),
-    /// Structured content block.
-    Block(Box),
-}
-
-/// Structured content block types for prompt messages.
-#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase", tag = "type")] -pub enum MCPPromptMessageContentBlock { - #[serde(rename = "text")] - Text { text: String }, - #[serde(rename = "image")] - Image { data: String, mime_type: String }, - #[serde(rename = "audio")] - Audio { data: String, mime_type: String }, - #[serde(rename = "resource_link")] - ResourceLink { - uri: String, - #[serde(skip_serializing_if = "Option::is_none")] - name: Option, - #[serde(skip_serializing_if = "Option::is_none")] - description: Option, - #[serde(skip_serializing_if = "Option::is_none")] - mime_type: Option, - }, - #[serde(rename = "resource")] - Resource { resource: Box }, -} - -impl MCPPromptMessageContent { - /// Extracts displayable text. For non-text types returns a placeholder. - pub fn text_or_placeholder(&self) -> String { - match self { - MCPPromptMessageContent::Plain(s) => s.clone(), - MCPPromptMessageContent::Block(block) => match block.as_ref() { - MCPPromptMessageContentBlock::Text { text } => text.clone(), - MCPPromptMessageContentBlock::Image { mime_type, .. } => { - format!("[Image: {}]", mime_type) - } - MCPPromptMessageContentBlock::Audio { mime_type, .. } => { - format!("[Audio: {}]", mime_type) - } - MCPPromptMessageContentBlock::ResourceLink { uri, name, .. } => { - name.as_ref().map_or_else( - || format!("[Resource Link: {}]", uri), - |n| format!("[Resource Link: {} ({})]", n, uri), - ) - } - MCPPromptMessageContentBlock::Resource { resource } => { - format!("[Resource: {}]", resource.uri) - } - }, - } - } - - /// Substitutes placeholders like {{key}} with values. Only applies to text content. 
- pub fn substitute_placeholders(&mut self, arguments: &HashMap) { - match self { - MCPPromptMessageContent::Plain(s) => { - for (key, value) in arguments { - let placeholder = format!("{{{{{}}}}}", key); - *s = s.replace(&placeholder, value); - } - } - MCPPromptMessageContent::Block(block) => { - if let MCPPromptMessageContentBlock::Text { text } = block.as_mut() { - for (key, value) in arguments { - let placeholder = format!("{{{{{}}}}}", key); - *text = text.replace(&placeholder, value); - } - } - } - } - } -} - -/// MCP prompt message (2025-11-25 spec). -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct MCPPromptMessage { - pub role: String, - pub content: MCPPromptMessageContent, -} - -/// MCP Apps UI metadata (tool declares interactive UI via _meta.ui.resourceUri). -/// resourceUri is optional: some tools use _meta.ui only for visibility/csp/permissions. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct MCPToolUIMeta { - /// URI pointing to UI resource, e.g. "ui://my-server/widget". Optional per MCP Apps spec. - #[serde(skip_serializing_if = "Option::is_none")] - pub resource_uri: Option, -} - -/// MCP tool metadata (MCP Apps extension). -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct MCPToolMeta { - #[serde(skip_serializing_if = "Option::is_none")] - pub ui: Option, -} - -/// Tool annotations (2025-11-25 spec). Clients MUST treat as untrusted unless from trusted servers. 
-#[derive(Debug, Clone, Serialize, Deserialize, Default)] -#[serde(rename_all = "camelCase")] -pub struct MCPToolAnnotations { - #[serde(skip_serializing_if = "Option::is_none")] - pub title: Option, - #[serde(skip_serializing_if = "Option::is_none")] - pub read_only_hint: Option, - #[serde(skip_serializing_if = "Option::is_none")] - pub destructive_hint: Option, - #[serde(skip_serializing_if = "Option::is_none")] - pub idempotent_hint: Option, - #[serde(skip_serializing_if = "Option::is_none")] - pub open_world_hint: Option, -} - -/// MCP tool definition (2025-11-25 spec). -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct MCPTool { - pub name: String, - /// Human-readable title for display (2025-11-25). - #[serde(skip_serializing_if = "Option::is_none")] - pub title: Option, - #[serde(skip_serializing_if = "Option::is_none")] - pub description: Option, - pub input_schema: Value, - /// Optional output schema for structured results (2025-11-25). - #[serde(skip_serializing_if = "Option::is_none")] - pub output_schema: Option, - /// Icons for UI display (2025-11-25). - #[serde(skip_serializing_if = "Option::is_none")] - pub icons: Option>, - /// Tool behavior hints (2025-11-25). Treat as untrusted. - #[serde(skip_serializing_if = "Option::is_none")] - pub annotations: Option, - /// MCP Apps extension: tool metadata including UI resource URI - #[serde(skip_serializing_if = "Option::is_none", rename = "_meta")] - pub meta: Option, -} - -/// MCP tool call result. -/// MCP Apps extension: `structuredContent` is UI-optimized data (not for model context). -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct MCPToolResult { - #[serde(skip_serializing_if = "Option::is_none")] - pub content: Option>, - #[serde(default)] - pub is_error: bool, - /// Structured data for MCP App UI (ext-apps ontoolresult expects this). 
- #[serde(skip_serializing_if = "Option::is_none")] - pub structured_content: Option, - /// Optional protocol-level metadata returned by the server. - #[serde(skip_serializing_if = "Option::is_none", rename = "_meta")] - pub meta: Option, -} - -/// MCP tool result content (2025-11-25 spec). -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase", tag = "type")] -pub enum MCPToolResultContent { - #[serde(rename = "text")] - Text { text: String }, - #[serde(rename = "image")] - Image { - data: String, - #[serde(rename = "mimeType", alias = "mime_type")] - mime_type: String, - }, - #[serde(rename = "audio")] - Audio { - data: String, - #[serde(rename = "mimeType", alias = "mime_type")] - mime_type: String, - }, - /// Link to resource (client may fetch via resources/read). - #[serde(rename = "resource_link")] - ResourceLink { - uri: String, - #[serde(skip_serializing_if = "Option::is_none")] - name: Option, - #[serde(skip_serializing_if = "Option::is_none")] - description: Option, - #[serde(skip_serializing_if = "Option::is_none")] - mime_type: Option, - }, - /// Embedded resource content. - #[serde(rename = "resource")] - Resource { resource: Box }, -} - -/// MCP message type (based on JSON-RPC 2.0). -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(untagged)] -pub enum MCPMessage { - Request(MCPRequest), - Response(MCPResponse), - Notification(MCPNotification), -} - -/// MCP request message. -#[derive(Debug, Clone, Serialize, Deserialize)] -pub struct MCPRequest { - pub jsonrpc: String, - pub id: Value, - pub method: String, - #[serde(skip_serializing_if = "Option::is_none")] - pub params: Option, -} - -impl MCPRequest { - pub fn new(id: Value, method: String, params: Option) -> Self { - Self { - jsonrpc: "2.0".to_string(), - id, - method, - params, - } - } -} - -/// MCP response message. 
-#[derive(Debug, Clone, Serialize, Deserialize)] -pub struct MCPResponse { - pub jsonrpc: String, - pub id: Value, - #[serde(skip_serializing_if = "Option::is_none")] - pub result: Option, - #[serde(skip_serializing_if = "Option::is_none")] - pub error: Option, -} - -impl MCPResponse { - pub fn success(id: Value, result: Value) -> Self { - Self { - jsonrpc: "2.0".to_string(), - id, - result: Some(result), - error: None, - } - } - - pub fn error(id: Value, error: MCPError) -> Self { - Self { - jsonrpc: "2.0".to_string(), - id, - result: None, - error: Some(error), - } - } -} - -/// MCP notification message (no response required). -#[derive(Debug, Clone, Serialize, Deserialize)] -pub struct MCPNotification { - pub jsonrpc: String, - pub method: String, - #[serde(skip_serializing_if = "Option::is_none")] - pub params: Option, -} - -impl MCPNotification { - pub fn new(method: String, params: Option) -> Self { - Self { - jsonrpc: "2.0".to_string(), - method, - params, - } - } -} - -/// MCP error definition. -#[derive(Debug, Clone, Serialize, Deserialize)] -pub struct MCPError { - pub code: i32, - pub message: String, - #[serde(skip_serializing_if = "Option::is_none")] - pub data: Option, -} - -impl MCPError { - /// Standard JSON-RPC error codes. - pub const PARSE_ERROR: i32 = -32700; - pub const INVALID_REQUEST: i32 = -32600; - pub const METHOD_NOT_FOUND: i32 = -32601; - pub const INVALID_PARAMS: i32 = -32602; - pub const INTERNAL_ERROR: i32 = -32603; - /// Resource not found (2025-11-25 spec). 
- pub const RESOURCE_NOT_FOUND: i32 = -32002; - - pub fn parse_error(message: impl Into) -> Self { - Self { - code: Self::PARSE_ERROR, - message: message.into(), - data: None, - } - } - - pub fn invalid_request(message: impl Into) -> Self { - Self { - code: Self::INVALID_REQUEST, - message: message.into(), - data: None, - } - } - - pub fn method_not_found(method: impl Into) -> Self { - Self { - code: Self::METHOD_NOT_FOUND, - message: format!("Method not found: {}", method.into()), - data: None, - } - } - - pub fn invalid_params(message: impl Into) -> Self { - Self { - code: Self::INVALID_PARAMS, - message: message.into(), - data: None, - } - } - - pub fn internal_error(message: impl Into) -> Self { - Self { - code: Self::INTERNAL_ERROR, - message: message.into(), - data: None, - } - } -} - -/// Initialize request parameters. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct InitializeParams { - pub protocol_version: MCPProtocolVersion, - pub capabilities: MCPCapability, - pub client_info: MCPServerInfo, -} - -/// Initialize response result. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct InitializeResult { - pub protocol_version: MCPProtocolVersion, - pub capabilities: MCPCapability, - pub server_info: MCPServerInfo, -} - -/// Resources/List request parameters. -#[derive(Debug, Clone, Serialize, Deserialize, Default)] -#[serde(rename_all = "camelCase")] -pub struct ResourcesListParams { - #[serde(skip_serializing_if = "Option::is_none")] - pub cursor: Option, -} - -/// Resources/List response result. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct ResourcesListResult { - pub resources: Vec, - #[serde(skip_serializing_if = "Option::is_none")] - pub next_cursor: Option, -} - -/// Resources/Read request parameters. 
-#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct ResourcesReadParams { - pub uri: String, -} - -/// Resources/Read response result. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct ResourcesReadResult { - pub contents: Vec, -} - -/// Prompts/List request parameters. -#[derive(Debug, Clone, Serialize, Deserialize, Default)] -#[serde(rename_all = "camelCase")] -pub struct PromptsListParams { - #[serde(skip_serializing_if = "Option::is_none")] - pub cursor: Option, -} - -/// Prompts/List response result. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct PromptsListResult { - pub prompts: Vec, - #[serde(skip_serializing_if = "Option::is_none")] - pub next_cursor: Option, -} - -/// Prompts/Get request parameters. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct PromptsGetParams { - pub name: String, - #[serde(skip_serializing_if = "Option::is_none")] - pub arguments: Option>, -} - -/// Prompts/Get response result (2025-11-25 spec). -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct PromptsGetResult { - #[serde(skip_serializing_if = "Option::is_none")] - pub description: Option, - pub messages: Vec, -} - -/// Tools/List request parameters. -#[derive(Debug, Clone, Serialize, Deserialize, Default)] -#[serde(rename_all = "camelCase")] -pub struct ToolsListParams { - #[serde(skip_serializing_if = "Option::is_none")] - pub cursor: Option, -} - -/// Tools/List response result. -#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct ToolsListResult { - pub tools: Vec, - #[serde(skip_serializing_if = "Option::is_none")] - pub next_cursor: Option, -} - -/// Tools/Call request parameters. 
-#[derive(Debug, Clone, Serialize, Deserialize)] -#[serde(rename_all = "camelCase")] -pub struct ToolsCallParams { - pub name: String, - #[serde(skip_serializing_if = "Option::is_none")] - pub arguments: Option, -} - -/// Ping request (heartbeat). -#[derive(Debug, Clone, Serialize, Deserialize, Default)] -pub struct PingParams {} - -/// Ping response. -#[derive(Debug, Clone, Serialize, Deserialize, Default)] -pub struct PingResult {} +pub use bitfun_services_integrations::mcp::protocol::types::*; diff --git a/src/crates/core/src/service/mcp/server/mod.rs b/src/crates/core/src/service/mcp/server/mod.rs index 858e9b559..2ab1062c1 100644 --- a/src/crates/core/src/service/mcp/server/mod.rs +++ b/src/crates/core/src/service/mcp/server/mod.rs @@ -8,8 +8,9 @@ mod manager; mod process; mod registry; +pub use bitfun_services_integrations::mcp::server::{MCPServerStatus, MCPServerType}; pub use config::{MCPServerConfig, MCPServerOAuthConfig, MCPServerTransport, MCPServerXaaConfig}; pub use connection::{MCPConnection, MCPConnectionPool}; pub use manager::MCPServerManager; -pub use process::{MCPServerProcess, MCPServerStatus, MCPServerType}; +pub use process::MCPServerProcess; pub use registry::MCPServerRegistry; diff --git a/src/crates/core/src/service/mcp/server/process.rs b/src/crates/core/src/service/mcp/server/process.rs index 7a4b6e36f..48639d03a 100644 --- a/src/crates/core/src/service/mcp/server/process.rs +++ b/src/crates/core/src/service/mcp/server/process.rs @@ -2,39 +2,17 @@ //! //! Handles starting, stopping, monitoring, and restarting MCP server processes. 
-use super::connection::MCPConnection;
 use super::MCPServerConfig;
+use super::connection::MCPConnection;
 use crate::service::mcp::protocol::{InitializeResult, MCPMessage, MCPServerInfo, MCPTransport};
 use crate::service::mcp::server::MCPServerTransport;
 use crate::util::errors::{BitFunError, BitFunResult};
+use bitfun_services_integrations::mcp::server::{MCPServerStatus, MCPServerType};
 use log::{debug, error, info, warn};
 use std::sync::Arc;
 use std::time::{Duration, Instant};
 use tokio::process::Child;
-use tokio::sync::{mpsc, RwLock};
-
-/// MCP server type.
-#[derive(Debug, Clone, Copy, PartialEq, Eq, serde::Serialize, serde::Deserialize)]
-#[serde(rename_all = "lowercase")]
-pub enum MCPServerType {
-    Local,  // Command-driven stdio server, including docker/podman wrappers
-    Remote, // Remote HTTP/WebSocket server
-}
-
-/// MCP server status.
-#[derive(Debug, Clone, Copy, PartialEq, Eq, serde::Serialize, serde::Deserialize)]
-#[serde(rename_all = "lowercase")]
-pub enum MCPServerStatus {
-    Uninitialized, // Not initialized
-    Starting,      // Starting
-    Connected,     // Connected
-    Healthy,       // Healthy (heartbeat OK)
-    NeedsAuth,     // Authentication required / token expired
-    Reconnecting,  // Reconnecting
-    Failed,        // Failed
-    Stopping,      // Stopping
-    Stopped,       // Stopped
-}
+use tokio::sync::{RwLock, mpsc};
 
 /// MCP server process.
 pub struct MCPServerProcess {
diff --git a/src/crates/core/src/service/mcp/tool_info.rs b/src/crates/core/src/service/mcp/tool_info.rs
index 35f73b2e7..ca804c7a0 100644
--- a/src/crates/core/src/service/mcp/tool_info.rs
+++ b/src/crates/core/src/service/mcp/tool_info.rs
@@ -1,8 +1 @@
-use serde::{Deserialize, Serialize};
-
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
-pub struct McpToolInfo {
-    pub server_id: String,
-    pub server_name: String,
-    pub tool_name: String,
-}
+pub use bitfun_services_integrations::mcp::McpToolInfo;
diff --git a/src/crates/core/src/service/mcp/tool_name.rs b/src/crates/core/src/service/mcp/tool_name.rs
index a626e189b..fdf01144a 100644
--- a/src/crates/core/src/service/mcp/tool_name.rs
+++ b/src/crates/core/src/service/mcp/tool_name.rs
@@ -1,56 +1,3 @@
-//! Shared MCP tool-name helpers.
-
-pub const MCP_TOOL_PREFIX: &str = "mcp__";
-pub const MCP_TOOL_DELIMITER: &str = "__";
-
-/// Normalize MCP server/tool names to a wire-safe format aligned with claude-code.
-pub fn normalize_name_for_mcp(name: &str) -> String {
-    name.chars()
-        .map(|ch| {
-            if ch.is_ascii_alphanumeric() || ch == '_' || ch == '-' {
-                ch
-            } else {
-                '_'
-            }
-        })
-        .collect()
-}
-
-pub fn build_mcp_tool_name(server_id: &str, tool_name: &str) -> String {
-    format!(
-        "{}{}{}{}",
-        MCP_TOOL_PREFIX,
-        normalize_name_for_mcp(server_id),
-        MCP_TOOL_DELIMITER,
-        normalize_name_for_mcp(tool_name)
-    )
-}
-
-#[cfg(test)]
-mod tests {
-    use super::{build_mcp_tool_name, normalize_name_for_mcp};
-
-    #[test]
-    fn normalize_name_for_mcp_replaces_spaces_and_symbols() {
-        assert_eq!(
-            normalize_name_for_mcp("Acme Search / Primary"),
-            "Acme_Search___Primary"
-        );
-    }
-
-    #[test]
-    fn normalize_name_for_mcp_keeps_ascii_word_chars_and_hyphen() {
-        assert_eq!(
-            normalize_name_for_mcp("github-enterprise_v2"),
-            "github-enterprise_v2"
-        );
-    }
-
-    #[test]
-    fn build_mcp_tool_name_normalizes_both_segments() {
-        assert_eq!(
-            build_mcp_tool_name("Claude Code", "search repos"),
-            "mcp__Claude_Code__search_repos"
-        );
-    }
-}
+pub use bitfun_services_integrations::mcp::{
+    build_mcp_tool_name, normalize_name_for_mcp, MCP_TOOL_DELIMITER, MCP_TOOL_PREFIX,
+};
diff --git a/src/crates/core/src/service/mod.rs b/src/crates/core/src/service/mod.rs
index c584f89c2..b19da6b7d 100644
--- a/src/crates/core/src/service/mod.rs
+++ b/src/crates/core/src/service/mod.rs
@@ -1,6 +1,8 @@
-//! Service layer module
+//! Service facade and core-owned product service assembly.
 //!
-//! Contains core business logic: Workspace, Config, FileSystem, Git, Agentic, MCP.
+//! Owner-crate implementations are re-exported here when they are safely
+//! isolated. High-coupling runtime services stay here until their port
+//! contracts and equivalence tests are explicit.
 
 pub(crate) mod agent_memory; // Agent memory prompt helpers
 pub mod announcement; // Announcement / feature-demo / tips system
diff --git a/src/crates/core/src/service/remote_ssh/types.rs b/src/crates/core/src/service/remote_ssh/types.rs
index a71f1dab7..574c247c3 100644
--- a/src/crates/core/src/service/remote_ssh/types.rs
+++ b/src/crates/core/src/service/remote_ssh/types.rs
@@ -1,317 +1 @@
-//! Type definitions for Remote SSH service
-
-use serde::{Deserialize, Deserializer, Serialize};
-use tokio_util::sync::CancellationToken;
-
-/// Workspace backend type
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
-#[serde(tag = "type", content = "data")]
-pub enum WorkspaceBackend {
-    /// Local workspace (default)
-    Local,
-    /// Remote SSH workspace
-    Remote(RemoteWorkspaceInfo),
-}
-
-/// Remote workspace information
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteWorkspaceInfo {
-    /// SSH connection ID
-    pub connection_id: String,
-    /// Connection name (display name)
-    pub connection_name: String,
-    /// Remote path on the server
-    pub remote_path: String,
-}
-
-/// SSH connection configuration
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct SSHConnectionConfig {
-    /// Unique identifier for this connection
-    pub id: String,
-    /// Display name for the connection
-    pub name: String,
-    /// Remote host address (hostname or IP)
-    pub host: String,
-    /// SSH port (default: 22)
-    pub port: u16,
-    /// SSH username
-    pub username: String,
-    /// Authentication method
-    #[serde(deserialize_with = "deserialize_ssh_auth_method")]
-    pub auth: SSHAuthMethod,
-    /// Default remote working directory
-    #[serde(rename = "defaultWorkspace")]
-    pub default_workspace: Option<String>,
-}
-
-/// SSH authentication method
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(tag = "type")]
-pub enum SSHAuthMethod {
-    /// Password authentication
-    Password { password: String },
-    /// Private key authentication
-    PrivateKey {
-        /// Path to private key file on local machine
-        #[serde(rename = "keyPath")]
-        key_path: String,
-        /// Optional passphrase for encrypted private key
-        passphrase: Option<String>,
-    },
-}
-
-/// Legacy `{"type":"Agent"}` in saved config maps to default private key path.
-fn deserialize_ssh_auth_method<'de, D>(deserializer: D) -> Result<SSHAuthMethod, D::Error>
-where
-    D: Deserializer<'de>,
-{
-    #[derive(Deserialize)]
-    #[serde(tag = "type")]
-    enum Helper {
-        Password {
-            password: String,
-        },
-        PrivateKey {
-            #[serde(rename = "keyPath")]
-            key_path: String,
-            passphrase: Option<String>,
-        },
-        Agent,
-    }
-    match Helper::deserialize(deserializer)? {
-        Helper::Password { password } => Ok(SSHAuthMethod::Password { password }),
-        Helper::PrivateKey {
-            key_path,
-            passphrase,
-        } => Ok(SSHAuthMethod::PrivateKey {
-            key_path,
-            passphrase,
-        }),
-        Helper::Agent => Ok(SSHAuthMethod::PrivateKey {
-            key_path: "~/.ssh/id_rsa".to_string(),
-            passphrase: None,
-        }),
-    }
-}
-
-/// Connection state
-#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
-pub enum ConnectionState {
-    /// Not connected
-    Disconnected,
-    /// Connection in progress
-    Connecting,
-    /// Successfully connected
-    Connected,
-    /// Connection failed with error
-    Failed { error: String },
-}
-
-/// Saved connection (without sensitive data like passwords)
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct SavedConnection {
-    pub id: String,
-    pub name: String,
-    pub host: String,
-    pub port: u16,
-    pub username: String,
-    #[serde(rename = "authType", deserialize_with = "deserialize_saved_auth_type")]
-    pub auth_type: SavedAuthType,
-    #[serde(rename = "defaultWorkspace")]
-    pub default_workspace: Option<String>,
-    #[serde(rename = "lastConnected")]
-    pub last_connected: Option<String>,
-}
-
-/// Saved auth type (excludes sensitive credentials; password ciphertext is in `ssh_password_vault.json`)
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(tag = "type")]
-pub enum SavedAuthType {
-    Password,
-    PrivateKey {
-        #[serde(rename = "keyPath")]
-        key_path: String,
-    },
-}
-
-fn deserialize_saved_auth_type<'de, D>(deserializer: D) -> Result<SavedAuthType, D::Error>
-where
-    D: Deserializer<'de>,
-{
-    #[derive(Deserialize)]
-    #[serde(tag = "type")]
-    enum Helper {
-        Password,
-        PrivateKey {
-            #[serde(rename = "keyPath")]
-            key_path: String,
-        },
-        Agent,
-    }
-    match Helper::deserialize(deserializer)? {
-        Helper::Password => Ok(SavedAuthType::Password),
-        Helper::PrivateKey { key_path } => Ok(SavedAuthType::PrivateKey { key_path }),
-        Helper::Agent => Ok(SavedAuthType::PrivateKey {
-            key_path: "~/.ssh/id_rsa".to_string(),
-        }),
-    }
-}
-
-/// Remote file entry information
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteFileEntry {
-    pub name: String,
-    pub path: String,
-    #[serde(rename = "isDir")]
-    pub is_dir: bool,
-    #[serde(rename = "isFile")]
-    pub is_file: bool,
-    #[serde(rename = "isSymlink")]
-    pub is_symlink: bool,
-    pub size: Option<u64>,
-    pub modified: Option<u64>,
-    pub permissions: Option<String>,
-}
-
-/// Remote file tree node
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteTreeNode {
-    pub name: String,
-    pub path: String,
-    #[serde(rename = "isDir")]
-    pub is_dir: bool,
-    pub children: Option<Vec<RemoteTreeNode>>,
-}
-
-/// Remote directory entry (for read_dir operations)
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteDirEntry {
-    pub name: String,
-    pub path: String,
-    #[serde(rename = "isDir")]
-    pub is_dir: bool,
-    #[serde(rename = "isFile")]
-    pub is_file: bool,
-    #[serde(rename = "isSymlink")]
-    pub is_symlink: bool,
-    pub size: Option<u64>,
-    pub modified: Option<u64>,
-    pub permissions: Option<String>,
-}
-
-/// Result of SSH connection attempt
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct SSHConnectionResult {
-    pub success: bool,
-    #[serde(rename = "connectionId")]
-    pub connection_id: Option<String>,
-    pub error: Option<String>,
-    #[serde(rename = "serverInfo")]
-    pub server_info: Option<ServerInfo>,
-}
-
-/// Options for executing a remote SSH command.
-#[derive(Debug, Clone, Default)]
-pub struct SSHCommandOptions {
-    pub timeout_ms: Option<u64>,
-    pub cancellation_token: Option<CancellationToken>,
-}
-
-/// Result of executing a remote SSH command.
-#[derive(Debug, Clone)]
-pub struct SSHCommandResult {
-    pub stdout: String,
-    pub stderr: String,
-    pub exit_code: i32,
-    pub interrupted: bool,
-    pub timed_out: bool,
-}
-
-/// Remote server information
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct ServerInfo {
-    #[serde(rename = "osType")]
-    pub os_type: String,
-    pub hostname: String,
-    #[serde(rename = "homeDir")]
-    pub home_dir: String,
-}
-
-/// Result of remote file operation
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteFileResult {
-    pub success: bool,
-    pub error: Option<String>,
-}
-
-/// Result of remote directory listing
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteListResult {
-    pub entries: Vec<RemoteDirEntry>,
-    pub error: Option<String>,
-}
-
-/// Request to open a remote workspace
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteWorkspaceRequest {
-    #[serde(rename = "connectionId")]
-    pub connection_id: String,
-    #[serde(rename = "remotePath")]
-    pub remote_path: String,
-}
-
-/// Remote workspace info (persisted in `remote_workspace.json`).
-/// `#[serde(default)]` keeps older files loadable if a field was absent.
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct RemoteWorkspace {
-    #[serde(default)]
-    pub connection_id: String,
-    #[serde(default)]
-    pub remote_path: String,
-    #[serde(default)]
-    pub connection_name: String,
-    /// SSH config `host`; used for `~/.bitfun/remote_ssh/{host}/...` session storage.
-    #[serde(default)]
-    pub ssh_host: String,
-}
-
-/// SSH config entry parsed from ~/.ssh/config
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct SSHConfigEntry {
-    /// Host name (alias from SSH config)
-    pub host: String,
-    /// Actual hostname or IP
-    pub hostname: Option<String>,
-    /// SSH port
-    pub port: Option<u16>,
-    /// Username
-    pub user: Option<String>,
-    /// Path to identity file (private key)
-    pub identity_file: Option<String>,
-    /// Whether to use SSH agent
-    pub agent: Option<bool>,
-}
-
-/// Result of looking up SSH config for a host
-#[derive(Debug, Clone, Serialize, Deserialize)]
-#[serde(rename_all = "camelCase")]
-pub struct SSHConfigLookupResult {
-    /// Whether a config entry was found
-    pub found: bool,
-    /// Config entry if found
-    pub config: Option<SSHConfigEntry>,
-}
+pub use bitfun_services_integrations::remote_ssh::types::*;
diff --git a/src/crates/core/tests/git_contracts.rs b/src/crates/core/tests/git_contracts.rs
new file mode 100644
index 000000000..81f3a0434
--- /dev/null
+++ b/src/crates/core/tests/git_contracts.rs
@@ -0,0 +1,102 @@
+use bitfun_core::service::git::{
+    build_git_changed_files_args, build_git_diff_args, parse_branch_line, parse_git_log_line,
+    GitChangedFileStatus, GitChangedFilesParams, GitCommandOutput, GitCommitParams, GitDiffParams,
+    GitGraph, GitService, GitWorktreeInfo, GraphNode, GraphRef,
+};
+
+#[test]
+fn git_contracts_remain_available_from_core_facade() {
+    let status = serde_json::to_value(GitChangedFileStatus::Renamed).unwrap();
+    assert_eq!(status, serde_json::json!("renamed"));
+
+    let worktree = GitWorktreeInfo {
+        path: "D:/workspace/BitFun-worktree".to_string(),
+        branch: Some("feature/test".to_string()),
+        head: "abc123".to_string(),
+        is_main: false,
+        is_locked: true,
+        is_prunable: false,
+    };
+    let worktree_value = serde_json::to_value(worktree).unwrap();
+    assert_eq!(worktree_value["isMain"], false);
+
+    let commit_params = GitCommitParams {
+        message: "test commit".to_string(),
+        amend: Some(false),
+        all: Some(true),
+        no_verify: Some(true),
+        author: None,
+    };
+    let commit_value = serde_json::to_value(commit_params).unwrap();
+    assert_eq!(commit_value["noVerify"], true);
+    assert!(commit_value.get("no_verify").is_none());
+
+    let command_output = GitCommandOutput {
+        stdout: "ok".to_string(),
+        stderr: "warning".to_string(),
+        exit_code: 1,
+    };
+    assert_eq!(command_output.exit_code, 1);
+
+    assert_eq!(
+        parse_git_log_line("abc123|BitFun|bitfun@example.com|2026-05-12|subject"),
+        Some((
+            "abc123".to_string(),
+            "BitFun".to_string(),
+            "bitfun@example.com".to_string(),
+            "2026-05-12".to_string(),
+            "subject".to_string(),
+        ))
+    );
+    assert_eq!(
+        parse_branch_line("* main"),
+        Some(("main".to_string(), true))
+    );
+    assert_eq!(
+        build_git_diff_args(&GitDiffParams {
+            source: Some("main".to_string()),
+            target: Some("feature".to_string()),
+            files: None,
+            staged: Some(false),
+            stat: Some(true),
+        }),
+        vec!["diff", "main..feature", "--stat"]
+    );
+    assert_eq!(
+        build_git_changed_files_args(&GitChangedFilesParams {
+            source: None,
+            target: Some("feature".to_string()),
+            staged: Some(false),
+        }),
+        vec!["diff", "--name-status", "feature"]
+    );
+    let _service_size = std::mem::size_of::<GitService>();
+
+    let graph = GitGraph {
+        nodes: vec![GraphNode {
+            hash: "abc123".to_string(),
+            message: "initial".to_string(),
+            full_message: "initial commit".to_string(),
+            author_name: "BitFun".to_string(),
+            author_email: "bitfun@example.com".to_string(),
+            timestamp: 1_700_000_000,
+            parents: Vec::new(),
+            children: Vec::new(),
+            refs: vec![GraphRef {
+                name: "main".to_string(),
+                ref_type: "branch".to_string(),
+                is_current: true,
+                is_head: true,
+            }],
+            lane: 0,
+            forking_lanes: Vec::new(),
+            merging_lanes: Vec::new(),
+            passing_lanes: Vec::new(),
+        }],
+        max_lane: 1,
+        current_branch: Some("main".to_string()),
+    };
+    let graph_value = serde_json::to_value(graph).unwrap();
+    assert_eq!(graph_value["maxLane"], 1);
+    assert_eq!(graph_value["nodes"][0]["refs"][0]["isHead"], true);
+}
diff --git a/src/crates/services-integrations/Cargo.toml b/src/crates/services-integrations/Cargo.toml
index d463ead85..8dacfa567 100644
--- a/src/crates/services-integrations/Cargo.toml
+++ b/src/crates/services-integrations/Cargo.toml
@@ -15,9 +15,21 @@ serde = { workspace = true }
 serde_json = { workspace = true }
 log = { workspace = true }
 bitfun-events = { path = "../events" }
+bitfun-services-core = { path = "../services-core", optional = true }
+chrono = { workspace = true, optional = true }
+git2 = { workspace = true, optional = true }
 notify = { workspace = true, optional = true }
+thiserror = { workspace = true, optional = true }
+tokio-util = { workspace = true, optional = true }
+
+[target.'cfg(not(windows))'.dependencies]
+git2 = { workspace = true, features = ["vendored-openssl"], optional = true }
 
 [features]
 default = []
+announcement = []
+git = ["bitfun-services-core", "chrono", "git2", "thiserror"]
 file-watch = ["notify"]
-product-full = ["file-watch"]
+mcp = []
+remote-ssh = ["tokio-util"]
+product-full = ["announcement", "file-watch", "git", "mcp", "remote-ssh"]
diff --git a/src/crates/services-integrations/src/announcement/mod.rs b/src/crates/services-integrations/src/announcement/mod.rs
new file mode 100644
index 000000000..9163725d7
--- /dev/null
+++ b/src/crates/services-integrations/src/announcement/mod.rs
@@ -0,0 +1,5 @@
+//! Announcement service data contracts.
+
+mod types;
+
+pub use types::*;
diff --git a/src/crates/services-integrations/src/announcement/types.rs b/src/crates/services-integrations/src/announcement/types.rs
new file mode 100644
index 000000000..3cda8d41f
--- /dev/null
+++ b/src/crates/services-integrations/src/announcement/types.rs
@@ -0,0 +1,235 @@
+//! Announcement system types.
+//!
+//! Defines all data structures for the announcement / feature-demo / tips mechanism.
+
+use serde::{Deserialize, Serialize};
+use std::collections::HashSet;
+
+/// Categories of announcement cards.
+#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(rename_all = "snake_case")]
+pub enum CardType {
+    /// New version feature showcase.
+    Feature,
+    /// Operational news or blog post.
+    News,
+    /// Lightweight usage tip (toast only, no modal).
+    Tip,
+    /// Important system announcement (shown as modal without prior toast).
+    Announcement,
+}
+
+/// Origin of an announcement card.
+#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(rename_all = "snake_case")]
+pub enum CardSource {
+    /// Statically registered in the local binary.
+    Local,
+    /// Downloaded from a remote endpoint.
+    Remote,
+    /// Built-in tips pool.
+    BuiltinTip,
+}
+
+/// Conditions that must be met before a card is shown.
+#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(tag = "type", rename_all = "snake_case")]
+pub enum TriggerCondition {
+    /// First launch after a version upgrade.
+    VersionFirstOpen,
+    /// The N-th time the application has been opened (1-indexed).
+    AppNthOpen { n: u64 },
+    /// A named application feature was used (supplied programmatically).
+    FeatureUsed { feature: String },
+    /// Must be triggered manually via `trigger_announcement`.
+    Manual,
+    /// Always eligible (used for announcements that should appear on every start until dismissed).
+    Always,
+}
+
+/// When and how a card should be presented.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct TriggerRule {
+    pub condition: TriggerCondition,
+    /// Milliseconds to wait after application start before displaying.
+    #[serde(default)]
+    pub delay_ms: u64,
+    /// When true, a card is only shown once per application version.
+    #[serde(default = "default_true")]
+    pub once_per_version: bool,
+}
+
+fn default_true() -> bool {
+    true
+}
+
+impl Default for TriggerRule {
+    fn default() -> Self {
+        Self {
+            condition: TriggerCondition::VersionFirstOpen,
+            delay_ms: 2000,
+            once_per_version: true,
+        }
+    }
+}
+
+/// Configuration for the bottom-left toast entry point.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct ToastConfig {
+    /// Icon identifier or emoji string (rendered by the frontend).
+    pub icon: String,
+    /// Toast title (i18n key or literal text).
+    pub title: String,
+    /// Short description shown below the title (i18n key or literal text).
+    pub description: String,
+    /// Label for the primary action button (i18n key or literal text).
+    #[serde(default)]
+    pub action_label: String,
+    /// Whether the user can close the toast without acting.
+    #[serde(default = "default_true")]
+    pub dismissible: bool,
+    /// Auto-dismiss after this many milliseconds; `None` means no auto-dismiss.
+    #[serde(default)]
+    pub auto_dismiss_ms: Option<u64>,
+}
+
+/// Preferred modal size.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "snake_case")]
+pub enum ModalSize {
+    Sm,
+    Md,
+    Lg,
+    Xl,
+}
+
+impl Default for ModalSize {
+    fn default() -> Self {
+        ModalSize::Lg
+    }
+}
+
+/// What happens when the user finishes or closes the modal.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "snake_case")]
+pub enum CompletionAction {
+    /// Only dismiss for this session; may reappear next launch if conditions match.
+    Dismiss,
+    /// Permanently suppress via `never_show_ids`.
+    NeverShowAgain,
+}
+
+impl Default for CompletionAction {
+    fn default() -> Self {
+        CompletionAction::Dismiss
+    }
+}
+
+/// Layout template for a single modal page.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "snake_case")]
+pub enum PageLayout {
+    TextOnly,
+    MediaLeft,
+    MediaRight,
+    MediaTop,
+    FullscreenMedia,
+}
+
+impl Default for PageLayout {
+    fn default() -> Self {
+        PageLayout::MediaTop
+    }
+}
+
+/// Media asset type.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "snake_case")]
+pub enum MediaType {
+    Lottie,
+    Video,
+    Image,
+    Gif,
+}
+
+/// A media asset attached to a modal page.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct MediaConfig {
+    pub media_type: MediaType,
+    /// Relative path under `public/announcements/` or an HTTPS URL.
+    pub src: String,
+}
+
+/// A single page inside a feature modal.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct ModalPage {
+    #[serde(default)]
+    pub layout: PageLayout,
+    /// Page title (i18n key or literal text).
+    pub title: String,
+    /// Body copy in Markdown (i18n key or literal text).
+    pub body: String,
+    #[serde(default)]
+    pub media: Option<MediaConfig>,
+}
+
+/// Full configuration for the centre modal overlay.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct ModalConfig {
+    #[serde(default)]
+    pub size: ModalSize,
+    /// Allow the user to close the modal with the × button.
+    #[serde(default = "default_true")]
+    pub closable: bool,
+    pub pages: Vec<ModalPage>,
+    #[serde(default)]
+    pub completion_action: CompletionAction,
+}
+
+/// A single announcement / feature-demo card.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct AnnouncementCard {
+    /// Globally unique identifier, e.g. `feature_v1_3_0_miniapp`.
+    pub id: String,
+    pub card_type: CardType,
+    pub source: CardSource,
+    /// Application version this card is associated with. `None` = any version.
+    #[serde(default)]
+    pub app_version: Option<String>,
+    /// Higher priority cards are shown first.
+    #[serde(default)]
+    pub priority: i32,
+    pub trigger: TriggerRule,
+    pub toast: ToastConfig,
+    /// If `None`, no modal is opened when the user clicks the toast action.
+    #[serde(default)]
+    pub modal: Option<ModalConfig>,
+    /// Unix timestamp (seconds) after which the card is ignored. Remote cards only.
+    #[serde(default)]
+    pub expires_at: Option<i64>,
+}
+
+/// Persisted state for the announcement system.
+///
+/// Stored at `~/.config/bitfun/config/announcement-state.json`.
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct AnnouncementState {
+    /// Version string recorded when the state was last saved.
+    #[serde(default)]
+    pub last_seen_version: String,
+    /// How many times the application has been opened.
+    #[serde(default)]
+    pub app_open_count: u64,
+    /// IDs of cards the user has seen (action button clicked or modal opened).
+    #[serde(default)]
+    pub seen_ids: HashSet<String>,
+    /// IDs dismissed for the current version cycle; reset on version upgrade.
+    #[serde(default)]
+    pub dismissed_ids: HashSet<String>,
+    /// IDs the user has permanently suppressed.
+    #[serde(default)]
+    pub never_show_ids: HashSet<String>,
+    /// Unix timestamp (seconds) of the last successful remote fetch.
+    #[serde(default)]
+    pub last_remote_fetch_at: Option<i64>,
+}
diff --git a/src/crates/services-integrations/src/git/args.rs b/src/crates/services-integrations/src/git/args.rs
new file mode 100644
index 000000000..bf01acfaf
--- /dev/null
+++ b/src/crates/services-integrations/src/git/args.rs
@@ -0,0 +1,54 @@
+use super::types::{GitChangedFilesParams, GitDiffParams};
+
+pub fn build_git_diff_args(params: &GitDiffParams) -> Vec<String> {
+    let mut args = vec!["diff".to_string()];
+
+    if params.staged.unwrap_or(false) {
+        args.push("--cached".to_string());
+    }
+
+    match (&params.source, &params.target) {
+        (Some(src), Some(tgt)) => {
+            args.push(format!("{}..{}", src, tgt));
+        }
+        (Some(src), None) => {
+            args.push(src.clone());
+        }
+        (None, None) => {}
+        (None, Some(_)) => {}
+    }
+
+    if params.stat.unwrap_or(false) {
+        args.push("--stat".to_string());
+    }
+
+    if let Some(files) = &params.files {
+        args.push("--".to_string());
+        args.extend(files.iter().cloned());
+    }
+
+    args
+}
+
+pub fn build_git_changed_files_args(params: &GitChangedFilesParams) -> Vec<String> {
+    let mut args = vec!["diff".to_string(), "--name-status".to_string()];
+
+    if params.staged.unwrap_or(false) {
+        args.push("--cached".to_string());
+    }
+
+    match (&params.source, &params.target) {
+        (Some(src), Some(tgt)) => {
+            args.push(format!("{}..{}", src, tgt));
+        }
+        (Some(src), None) => {
+            args.push(src.clone());
+        }
+        (None, Some(tgt)) => {
+            args.push(tgt.clone());
+        }
+        (None, None) => {}
+    }
+
+    args
+}
diff --git a/src/crates/services-integrations/src/git/error.rs b/src/crates/services-integrations/src/git/error.rs
new file mode 100644
index 000000000..c00cd2659
--- /dev/null
+++ b/src/crates/services-integrations/src/git/error.rs
@@ -0,0 +1,32 @@
+#[derive(Debug, thiserror::Error)]
+pub enum GitError {
+    #[error("Repository not found: {0}")]
+    RepositoryNotFound(String),
+
+    #[error("Git command failed: {0}")]
+    CommandFailed(String),
+
+    #[error("Invalid repository path: {0}")]
+    InvalidPath(String),
+
+    #[error("Branch not found: {0}")]
+    BranchNotFound(String),
+
+    #[error("Merge conflict: {0}")]
+    MergeConflict(String),
+
+    #[error("Authentication failed: {0}")]
+    AuthenticationFailed(String),
+
+    #[error("Network error: {0}")]
+    NetworkError(String),
+
+    #[error("Parse error: {0}")]
+    ParseError(String),
+
+    #[error("IO error: {0}")]
+    IoError(#[from] std::io::Error),
+
+    #[error("Git2 error: {0}")]
+    Git2Error(#[from] git2::Error),
+}
diff --git a/src/crates/services-integrations/src/git/graph.rs b/src/crates/services-integrations/src/git/graph.rs
new file mode 100644
index 000000000..8ac75b043
--- /dev/null
+++ b/src/crates/services-integrations/src/git/graph.rs
@@ -0,0 +1,302 @@
+use git2::{Commit, Oid, Repository, Sort};
+use std::collections::HashMap;
+
+use super::{GitGraph, GraphNode, GraphRef};
+
+/// Lane allocator
+struct LaneAllocator {
+    /// Active lanes: lane position -> commit hash
+    active_lanes: HashMap<i32, String>,
+    /// Free lane positions
+    free_positions: Vec<i32>,
+    /// Next available position
+    next_position: i32,
+    /// Lane length stats
+    lane_lengths: HashMap<i32, usize>,
+}
+
+impl LaneAllocator {
+    fn new() -> Self {
+        Self {
+            active_lanes: HashMap::new(),
+            free_positions: Vec::new(),
+            next_position: 0,
+            lane_lengths: HashMap::new(),
+        }
+    }
+
+    /// Allocates a new lane.
+    fn allocate(&mut self, commit_hash: String) -> i32 {
+        let position = if let Some(pos) = self.free_positions.pop() {
+            pos
+        } else {
+            let pos = self.next_position;
+            self.next_position += 1;
+            pos
+        };
+
+        self.active_lanes.insert(position, commit_hash);
+        self.lane_lengths.insert(position, 1);
+        position
+    }
+
+    /// Frees a lane.
+    fn free(&mut self, position: i32) {
+        self.active_lanes.remove(&position);
+        self.lane_lengths.remove(&position);
+        self.free_positions.push(position);
+        self.free_positions.sort_unstable();
+    }
+
+    /// Increments the lane length.
+    fn increment_length(&mut self, position: i32) {
+        if let Some(len) = self.lane_lengths.get_mut(&position) {
+            *len += 1;
+        }
+    }
+
+    /// Returns the lane length.
+    fn get_length(&self, position: i32) -> usize {
+        self.lane_lengths.get(&position).copied().unwrap_or(0)
+    }
+}
+
+/// Builds a Git graph.
+pub fn build_git_graph(
+    repo: &Repository,
+    max_count: Option<usize>,
+) -> Result<GitGraph, git2::Error> {
+    build_git_graph_for_branch(repo, max_count, None)
+}
+
+/// Builds a Git graph for a specific branch.
+pub fn build_git_graph_for_branch(
+    repo: &Repository,
+    max_count: Option<usize>,
+    branch_name: Option<&str>,
+) -> Result<GitGraph, git2::Error> {
+    let current_branch = get_current_branch(repo);
+
+    let refs_map = collect_refs(repo)?;
+
+    let mut revwalk = repo.revwalk()?;
+    revwalk.set_sorting(Sort::TOPOLOGICAL | Sort::TIME)?;
+
+    if let Some(branch) = branch_name {
+        if let Ok(reference) = repo.find_branch(branch, git2::BranchType::Local) {
+            if let Some(oid) = reference.get().target() {
+                revwalk.push(oid)?;
+            }
+        } else if let Ok(reference) = repo.find_branch(branch, git2::BranchType::Remote) {
+            if let Some(oid) = reference.get().target() {
+                revwalk.push(oid)?;
+            }
+        } else if let Ok(reference) = repo.find_reference(&format!("refs/heads/{}", branch)) {
+            if let Some(oid) = reference.target() {
+                revwalk.push(oid)?;
+            }
+        } else {
+            for reference in repo.references()? {
+                let reference = reference?;
+                if reference.is_branch() || reference.is_remote() || reference.is_tag() {
+                    if let Some(oid) = reference.target() {
+                        revwalk.push(oid)?;
+                    }
+                }
+            }
+        }
+    } else {
+        for reference in repo.references()? {
+            let reference = reference?;
+            if reference.is_branch() || reference.is_remote() || reference.is_tag() {
+                if let Some(oid) = reference.target() {
+                    revwalk.push(oid)?;
+                }
+            }
+        }
+    }
+
+    let mut commits: Vec<(Oid, Commit)> = Vec::new();
+    let max_count = max_count.unwrap_or(1000);
+
+    for oid_result in revwalk.take(max_count) {
+        let oid = oid_result?;
+        if let Ok(commit) = repo.find_commit(oid) {
+            commits.push((oid, commit));
+        }
+    }
+
+    let mut children_map: HashMap<String, Vec<String>> = HashMap::new();
+    for (oid, commit) in &commits {
+        let hash = oid.to_string();
+        for parent_id in commit.parent_ids() {
+            let parent_hash = parent_id.to_string();
+            children_map
+                .entry(parent_hash)
+                .or_default()
+                .push(hash.clone());
+        }
+    }
+
+    let mut nodes: Vec<GraphNode> = Vec::new();
+    for (oid, commit) in commits {
+        let hash = oid.to_string();
+        let message = commit.summary().unwrap_or("").to_string();
+        let full_message = commit.message().unwrap_or("").to_string();
+        let author = commit.author();
+
+        let node = GraphNode {
+            hash: hash.clone(),
+            message,
+            full_message,
+            author_name: author.name().unwrap_or("Unknown").to_string(),
+            author_email: author.email().unwrap_or("").to_string(),
+            timestamp: author.when().seconds(),
+            parents: commit.parent_ids().map(|id| id.to_string()).collect(),
+            children: children_map.get(&hash).cloned().unwrap_or_default(),
+            refs: refs_map.get(&hash).cloned().unwrap_or_default(),
+            lane: -1,
+            forking_lanes: Vec::new(),
+            merging_lanes: Vec::new(),
+            passing_lanes: Vec::new(),
+        };
+
+        nodes.push(node);
+    }
+
+    let max_lane = allocate_lanes(&mut nodes);
+
+    Ok(GitGraph {
+        nodes,
+        max_lane,
+        current_branch,
+    })
+}
+
+/// Collects all refs.
+fn collect_refs(repo: &Repository) -> Result<HashMap<String, Vec<GraphRef>>, git2::Error> {
+    let mut refs_map: HashMap<String, Vec<GraphRef>> = HashMap::new();
+    let head = repo.head().ok();
+    let current_branch = get_current_branch(repo);
+
+    for reference in repo.references()? {
+        let reference = reference?;
+
+        let (ref_type, name) = if reference.is_branch() {
+            ("branch", reference.shorthand().unwrap_or(""))
+        } else if reference.is_remote() {
+            ("remote", reference.shorthand().unwrap_or(""))
+        } else if reference.is_tag() {
+            ("tag", reference.shorthand().unwrap_or(""))
+        } else {
+            continue;
+        };
+
+        if let Some(oid) = reference.target() {
+            let hash = oid.to_string();
+            let is_current = current_branch.as_ref().is_some_and(|cb| cb == name);
+            let is_head = head.as_ref().and_then(|h| h.target()) == Some(oid);
+
+            let graph_ref = GraphRef {
+                name: name.to_string(),
+                ref_type: ref_type.to_string(),
+                is_current,
+                is_head,
+            };
+
+            refs_map.entry(hash).or_default().push(graph_ref);
+        }
+    }
+
+    Ok(refs_map)
+}
+
+/// Returns the current branch name.
+fn get_current_branch(repo: &Repository) -> Option<String> {
+    repo.head()
+        .ok()
+        .and_then(|head| head.shorthand().map(|s| s.to_string()))
+}
+
+/// Allocates lanes (simplified algorithm).
+fn allocate_lanes(nodes: &mut [GraphNode]) -> i32 {
+    if nodes.is_empty() {
+        return 0;
+    }
+
+    let mut allocator = LaneAllocator::new();
+    let mut commit_lanes: HashMap<String, i32> = HashMap::new();
+
+    for node in nodes.iter_mut() {
+        let hash = node.hash.clone();
+
+        let lane = if node.children.is_empty() {
+            allocator.allocate(hash.clone())
+        } else if node.children.len() == 1 {
+            let child_hash = &node.children[0];
+            if let Some(&child_lane) = commit_lanes.get(child_hash) {
+                allocator.increment_length(child_lane);
+                child_lane
+            } else {
+                allocator.allocate(hash.clone())
+            }
+        } else {
+            let mut best_lane = -1;
+            let mut best_length = 0;
+
+            for child_hash in &node.children {
+                if let Some(&child_lane) = commit_lanes.get(child_hash) {
+                    let length = allocator.get_length(child_lane);
+                    if length > best_length {
+                        best_length = length;
+                        best_lane = child_lane;
+                    }
+                }
+            }
+
+            if best_lane >= 0 {
+                allocator.increment_length(best_lane);
+                best_lane
+            } else {
+                allocator.allocate(hash.clone())
+            }
+        };
+
+        node.lane = lane;
+        commit_lanes.insert(hash.clone(), lane);
+
+        for child_hash in &node.children {
+            if let Some(&child_lane) = commit_lanes.get(child_hash) {
+                if child_lane != lane {
+                    node.forking_lanes.push(child_lane);
+                }
+            }
+        }
+
+        for (i, parent_hash) in node.parents.iter().enumerate() {
+            if i > 0 {
+                if let Some(&parent_lane) = commit_lanes.get(parent_hash) {
+                    if parent_lane != lane {
+                        node.merging_lanes.push(parent_lane);
+                    }
+                }
+            }
+        }
+
+        let active_lanes: Vec<i32> = allocator.active_lanes.keys().copied().collect();
+        for &active_lane in &active_lanes {
+            if active_lane != lane
+                && !node.forking_lanes.contains(&active_lane)
+                && !node.merging_lanes.contains(&active_lane)
+            {
+                node.passing_lanes.push(active_lane);
+            }
+        }
+
+        if node.parents.is_empty() {
+            allocator.free(lane);
+        }
+    }
+
+    allocator.next_position
+}
diff --git a/src/crates/services-integrations/src/git/mod.rs b/src/crates/services-integrations/src/git/mod.rs
new file mode 100644
index 000000000..09f06b039
--- /dev/null
+++ b/src/crates/services-integrations/src/git/mod.rs
@@ -0,0 +1,24 @@
+//! Git service contracts.
+//!
+//! `bitfun-core::service::git` remains as the compatibility facade for the
+//! legacy public path.
+
+pub mod args;
+pub mod error;
+pub mod graph;
+pub mod name_status;
+pub mod service;
+pub mod text;
+pub mod types;
+pub mod utils;
+pub mod worktree;
+
+pub use args::{build_git_changed_files_args, build_git_diff_args};
+pub use error::GitError;
+pub use graph::{build_git_graph, build_git_graph_for_branch};
+pub use name_status::parse_name_status_output;
+pub use service::GitService;
+pub use text::{parse_branch_line, parse_git_log_line};
+pub use types::*;
+pub use utils::*;
+pub use worktree::parse_worktree_list;
diff --git a/src/crates/services-integrations/src/git/name_status.rs b/src/crates/services-integrations/src/git/name_status.rs
new file mode 100644
index 000000000..299b7054a
--- /dev/null
+++ b/src/crates/services-integrations/src/git/name_status.rs
@@ -0,0 +1,44 @@
+use super::types::{GitChangedFile, GitChangedFileStatus};
+
+/// Parses output from `git diff --name-status`.
+pub fn parse_name_status_output(output: &str) -> Vec<GitChangedFile> {
+    output
+        .lines()
+        .filter_map(|line| {
+            let mut parts = line.split('\t');
+            let raw_status = parts.next()?.trim();
+            if raw_status.is_empty() {
+                return None;
+            }
+
+            let status = match raw_status.chars().next().unwrap_or_default() {
+                'A' => GitChangedFileStatus::Added,
+                'M' => GitChangedFileStatus::Modified,
+                'D' => GitChangedFileStatus::Deleted,
+                'R' => GitChangedFileStatus::Renamed,
+                'C' => GitChangedFileStatus::Copied,
+                _ => GitChangedFileStatus::Unknown,
+            };
+
+            match status {
+                GitChangedFileStatus::Renamed | GitChangedFileStatus::Copied => {
+                    let old_path = parts.next()?.to_string();
+                    let path = parts.next()?.to_string();
+                    Some(GitChangedFile {
+                        path,
+                        old_path: Some(old_path),
+                        status,
+                    })
+                }
+                _ => {
+                    let path = parts.next()?.to_string();
+                    Some(GitChangedFile {
+                        path,
+                        old_path: None,
+                        status,
+                    })
+                }
+            }
+        })
+        .collect()
+}
diff --git a/src/crates/services-integrations/src/git/service.rs b/src/crates/services-integrations/src/git/service.rs
new file mode 100644
index 000000000..b0ca1f04c
--- /dev/null
+++ b/src/crates/services-integrations/src/git/service.rs
@@ -0,0 +1,1059 @@
+use super::*;
+/**
+ * Git service implementation
+ */
+use git2::{BranchType, Commit, Repository};
+use std::path::Path;
+use std::time::Duration;
+use std::time::Instant;
+use tokio::time::timeout;
+
+pub struct GitService;
+
+type CommitStats = (Option<u32>, Option<u32>, Option<u32>);
+
+fn elapsed_ms_u64(started_at: Instant) -> u64 {
+    started_at.elapsed().as_millis() as u64
+}
+
+impl GitService {
+    /// Checks whether the path is a Git repository.
+    pub async fn is_repository<P: AsRef<Path>>(path: P) -> Result<bool, GitError> {
+        Ok(is_git_repository(path))
+    }
+
+    /// Gets repository information.
+    pub async fn get_repository<P: AsRef<Path>>(path: P) -> Result<GitRepository, GitError> {
+        let _start_time = Instant::now();
+
+        let repo =
+            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+        let current_branch = get_current_branch(&repo)?;
+        let is_bare = repo.is_bare();
+        let has_changes = !get_file_statuses(&repo)?.is_empty();
+
+        let remotes = repo
+            .remotes()
+            .map_err(|e| GitError::CommandFailed(e.to_string()))?
+            .iter()
+            .filter_map(|name| name.map(|s| s.to_string()))
+            .collect();
+
+        let path_str = path.as_ref().to_string_lossy().to_string();
+        let name = path
+            .as_ref()
+            .file_name()
+            .unwrap_or_default()
+            .to_string_lossy()
+            .to_string();
+
+        Ok(GitRepository {
+            path: path_str,
+            name,
+            current_branch,
+            is_bare,
+            has_changes,
+            remotes,
+        })
+    }
+
+    /// Gets repository status.
+    pub async fn get_status<P: AsRef<Path>>(path: P) -> Result<GitStatus, GitError> {
+        let repo =
+            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+        let current_branch = get_current_branch(&repo)?;
+        let file_statuses = get_file_statuses(&repo)?;
+
+        let mut staged = Vec::new();
+        let mut unstaged = Vec::new();
+        let mut untracked = Vec::new();
+
+        for status in file_statuses {
+            if status.status.contains('?') {
+                untracked.push(status.path);
+            } else {
+                if status.index_status.is_some() {
+                    staged.push(status.clone());
+                }
+                if status.workdir_status.is_some() {
+                    unstaged.push(status);
+                }
+            }
+        }
+
+        let (ahead, behind) =
+            Self::get_ahead_behind_count(&repo, &current_branch).unwrap_or((0, 0));
+
+        Ok(GitStatus {
+            staged,
+            unstaged,
+            untracked,
+            current_branch,
+            ahead,
+            behind,
+        })
+    }
+
+    /// Gets the branch list.
+    pub async fn get_branches<P: AsRef<Path>>(
+        path: P,
+        include_remote: bool,
+    ) -> Result<Vec<GitBranch>, GitError> {
+        let repo =
+            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+        let mut branches = Vec::new();
+        let current_branch = get_current_branch(&repo)?;
+
+        let local_branches = repo
+            .branches(Some(BranchType::Local))
+            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+        for branch_result in local_branches {
+            let (branch, _) = branch_result.map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+            if let Some(name) = branch
+                .name()
+                .map_err(|e| GitError::CommandFailed(e.to_string()))?
+            {
+                let is_current = name == current_branch;
+                let upstream = branch.upstream().ok().and_then(|upstream_branch| {
+                    upstream_branch.name().ok().flatten().map(|s| s.to_string())
+                });
+
+                let (last_commit, last_commit_date) =
+                    if let Ok(commit) = branch.get().peel_to_commit() {
+                        (
+                            Some(commit.id().to_string()),
+                            Some(format_timestamp(commit.time().seconds())),
+                        )
+                    } else {
+                        (None, None)
+                    };
+
+                let (ahead, behind) = if is_current {
+                    Self::get_ahead_behind_count(&repo, name).unwrap_or((0, 0))
+                } else {
+                    (0, 0)
+                };
+
+                branches.push(GitBranch {
+                    name: name.to_string(),
+                    current: is_current,
+                    remote: false,
+                    upstream,
+                    ahead,
+                    behind,
+                    last_commit,
+                    last_commit_date: last_commit_date.clone(),
+
+                    base_branch: None,
+                    child_branches: None,
+                    merged_branches: None,
+                    branch_type: Some(Self::determine_branch_type(name)),
+                    has_conflicts: None,
+                    can_merge: None,
+                    is_stale: None,
+                    merge_status: None,
+                    stats: None,
+                    created_at: None,
+                    last_activity_at: last_commit_date,
+                    tags: None,
+                    description: None,
+                    linked_issues: None,
+                });
+            }
+        }
+
+        if include_remote {
+            let remote_branches = repo
+                .branches(Some(BranchType::Remote))
+                .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+            for branch_result in remote_branches {
+                let (branch, _) =
+                    branch_result.map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+                if let Some(name) = branch
+                    .name()
+                    .map_err(|e| GitError::CommandFailed(e.to_string()))?
+                {
+                    let (last_commit, last_commit_date) =
+                        if let Ok(commit) = branch.get().peel_to_commit() {
+                            (
+                                Some(commit.id().to_string()),
+                                Some(format_timestamp(commit.time().seconds())),
+                            )
+                        } else {
+                            (None, None)
+                        };
+
+                    branches.push(GitBranch {
+                        name: name.to_string(),
+                        current: false,
+                        remote: true,
+                        upstream: None,
+                        ahead: 0,
+                        behind: 0,
+                        last_commit,
+                        last_commit_date: last_commit_date.clone(),
+
+                        base_branch: None,
+                        child_branches: None,
+                        merged_branches: None,
+                        branch_type: Some(Self::determine_branch_type(name)),
+                        has_conflicts: None,
+                        can_merge: None,
+                        is_stale: None,
+                        merge_status: None,
+                        stats: None,
+                        created_at: None,
+                        last_activity_at: last_commit_date,
+                        tags: None,
+                        description: None,
+                        linked_issues: None,
+                    });
+                }
+            }
+        }
+
+        Ok(branches)
+    }
+
+    /// Gets branches with detailed information.
+    pub async fn get_enhanced_branches<P: AsRef<Path>>(
+        path: P,
+        include_remote: bool,
+    ) -> Result<Vec<GitBranch>, GitError> {
+        let mut branches = Self::get_branches(&path, include_remote).await?;
+
+        Self::analyze_branch_relations(&mut branches)?;
+
+        let repo =
+            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+        for branch in &mut branches {
+            if !branch.remote {
+                branch.stats = Self::calculate_branch_stats(&repo, &branch.name).ok();
+                branch.is_stale = Some(Self::is_branch_stale(branch));
+                branch.can_merge = Self::can_merge_safely(&repo, &branch.name).ok();
+                branch.has_conflicts = branch.can_merge.map(|can| !can);
+            }
+        }
+
+        Ok(branches)
+    }
+
+    /// Determines the branch type.
+    fn determine_branch_type(branch_name: &str) -> String {
+        if branch_name.starts_with("feature/") || branch_name.starts_with("feat/") {
+            "feature".to_string()
+        } else if branch_name.starts_with("hotfix/") || branch_name.starts_with("fix/") {
+            "hotfix".to_string()
+        } else if branch_name.starts_with("release/") || branch_name.starts_with("rel/") {
+            "release".to_string()
+        } else if branch_name.starts_with("bugfix/") || branch_name.starts_with("bug/") {
+            "bugfix".to_string()
+        } else if branch_name.starts_with("chore/") {
+            "chore".to_string()
+        } else if branch_name.starts_with("docs/") {
+            "docs".to_string()
+        } else if branch_name.starts_with("test/") {
+            "test".to_string()
+        } else if ["main", "master", "develop", "development"].contains(&branch_name) {
+            "main".to_string()
+        } else {
+            "other".to_string()
+        }
+    }
+
+    /// Analyzes branch relationships.
+    fn analyze_branch_relations(branches: &mut [GitBranch]) -> Result<(), GitError> {
+        let main_branches = ["main", "master", "develop"];
+
+        let available_main_branches: Vec<String> = branches
+            .iter()
+            .filter(|b| !b.remote && main_branches.contains(&b.name.as_str()))
+            .map(|b| b.name.clone())
+            .collect();
+
+        for branch in branches.iter_mut() {
+            if !branch.remote && !main_branches.contains(&branch.name.as_str()) {
+                if let Some(main_branch) = available_main_branches.first() {
+                    branch.base_branch = Some(main_branch.clone());
+                }
+            }
+        }
+
+        let mut child_map: std::collections::HashMap<String, Vec<String>> =
+            std::collections::HashMap::new();
+
+        for branch in branches.iter() {
+            if let Some(base) = &branch.base_branch {
+                child_map
+                    .entry(base.clone())
+                    .or_default()
+                    .push(branch.name.clone());
+            }
+        }
+
+        for branch in branches.iter_mut() {
+            if let Some(children) = child_map.get(&branch.name) {
+                branch.child_branches = Some(children.clone());
+            }
+        }
+
+        Ok(())
+    }
+
+    /// Computes branch statistics.
+    fn calculate_branch_stats(
+        repo: &Repository,
+        branch_name: &str,
+    ) -> Result<GitBranchStats, GitError> {
+        let branch_ref = repo
+            .find_branch(branch_name, BranchType::Local)
+            .map_err(|e| GitError::BranchNotFound(e.to_string()))?;
+
+        let target = branch_ref
+            .get()
+            .target()
+            .ok_or_else(|| GitError::CommandFailed("Branch has no target".to_string()))?;
+
+        let mut revwalk = repo
+            .revwalk()
+            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+        revwalk
+            .push(target)
+            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+        let commit_count = revwalk.count() as i32;
+
+        Ok(GitBranchStats {
+            commit_count,
+            contributor_count: 1,
+            file_changes: 0,
+            lines_changed: GitLinesChanged {
+                additions: 0,
+                deletions: 0,
+            },
+            activity_score: std::cmp::min(commit_count * 2, 100),
+        })
+    }
+
+    /// Checks whether a branch is stale.
+    fn is_branch_stale(branch: &GitBranch) -> bool {
+        !matches!(&branch.last_commit_date, Some(_last_commit_date))
+    }
+
+    /// Checks whether a branch can be merged safely.
+    fn can_merge_safely(_repo: &Repository, _branch_name: &str) -> Result<bool, GitError> {
+        Ok(true)
+    }
+
+    /// Gets commit history.
+    pub async fn get_commits<P: AsRef<Path>>(
+        path: P,
+        params: GitLogParams,
+    ) -> Result<Vec<GitCommit>, GitError> {
+        let repo =
+            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+        let mut revwalk = repo
+            .revwalk()
+            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+        revwalk
+            .push_head()
+            .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+        let mut commits = Vec::new();
+        let mut count = 0;
+        let skip = params.skip.unwrap_or(0);
+        let max_count = params.max_count.unwrap_or(50);
+
+        for (index, oid_result) in revwalk.enumerate() {
+            if index < skip as usize {
+                continue;
+            }
+
+            if count >= max_count {
+                break;
+            }
+
+            let oid = oid_result.map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+            let commit = repo
+                .find_commit(oid)
+                .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+            let author = commit.author();
+            let message = commit.message().unwrap_or("").to_string();
+
+            if let Some(author_filter) = &params.author {
+                if !author.name().unwrap_or("").contains(author_filter) {
+                    continue;
+                }
+            }
+
+            if let Some(grep_filter) = &params.grep {
+                if !message.contains(grep_filter) {
+                    continue;
+                }
+            }
+
+            let parents: Vec<String> = commit.parent_ids().map(|id| id.to_string()).collect();
+
+            let (additions, deletions, files_changed) = if params.stat.unwrap_or(false) {
+                Self::get_commit_stats(&repo, &commit).unwrap_or((None, None, None))
+            } else {
+                (None, None, None)
+            };
+
+            commits.push(GitCommit {
+                hash: commit.id().to_string(),
+                short_hash: commit.id().to_string()[..7].to_string(),
+                message,
+                author: author.name().unwrap_or("Unknown").to_string(),
+                author_email: author.email().unwrap_or("").to_string(),
+                date: format_timestamp(commit.time().seconds()),
+                parents,
+                additions,
+                deletions,
+                files_changed,
+            });
+
+            count += 1;
+        }
+
+        Ok(commits)
+    }
+
+    /// Adds files to the staging area.
+    pub async fn add_files<P: AsRef<Path>>(
+        path: P,
+        params: GitAddParams,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mut args = vec!["add"];
+
+        if params.all.unwrap_or(false) {
+            args.push("-A");
+        } else if params.update.unwrap_or(false) {
+            args.push("-u");
+        } else {
+            for file in &params.files {
+                args.push(file);
+            }
+        }
+
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "files": params.files,
+                "all": params.all,
+                "update": params.update
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Commits changes.
+    pub async fn commit<P: AsRef<Path>>(
+        path: P,
+        params: GitCommitParams,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mut args = vec![
+            "commit".to_string(),
+            "-m".to_string(),
+            params.message.clone(),
+        ];
+
+        if params.amend.unwrap_or(false) {
+            args.push("--amend".to_string());
+        }
+
+        if params.all.unwrap_or(false) {
+            args.push("-a".to_string());
+        }
+
+        if params.no_verify.unwrap_or(false) {
+            args.push("--no-verify".to_string());
+        }
+
+        if let Some(author) = &params.author {
+            args.push("--author".to_string());
+            args.push(format!("{} <{}>", author.name, author.email));
+        }
+
+        let args_refs: Vec<&str> = args.iter().map(|s| s.as_str()).collect();
+        let output = execute_git_command(&repo_path, &args_refs).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "message": params.message,
+                "amend": params.amend,
+                "all": params.all,
+                "noVerify": params.no_verify,
+                "author": params.author
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Pushes changes.
+    pub async fn push<P: AsRef<Path>>(
+        path: P,
+        params: GitPushParams,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mut args = vec!["push"];
+
+        if params.force.unwrap_or(false) {
+            args.push("--force");
+        }
+
+        if params.set_upstream.unwrap_or(false) {
+            args.push("-u");
+        }
+
+        if let Some(remote) = &params.remote {
+            args.push(remote);
+        }
+
+        if let Some(branch) = &params.branch {
+            args.push(branch);
+        }
+
+        let output = timeout(
+            Duration::from_secs(30),
+            execute_git_command(&repo_path, &args),
+        )
+        .await
+        .map_err(|_| GitError::NetworkError("Push operation timed out".to_string()))??;
+
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "remote": params.remote,
+                "branch": params.branch,
+                "force": params.force,
+                "set_upstream": params.set_upstream
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Pulls changes.
+    pub async fn pull<P: AsRef<Path>>(
+        path: P,
+        params: GitPullParams,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mut args = vec!["pull"];
+
+        if params.rebase.unwrap_or(false) {
+            args.push("--rebase");
+        }
+
+        if let Some(remote) = &params.remote {
+            args.push(remote);
+        }
+
+        if let Some(branch) = &params.branch {
+            args.push(branch);
+        }
+
+        let output = timeout(
+            Duration::from_secs(30),
+            execute_git_command(&repo_path, &args),
+        )
+        .await
+        .map_err(|_| GitError::NetworkError("Pull operation timed out".to_string()))??;
+
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "remote": params.remote,
+                "branch": params.branch,
+                "rebase": params.rebase
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Checks out a branch.
+    pub async fn checkout_branch<P: AsRef<Path>>(
+        path: P,
+        branch_name: &str,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let args = vec!["checkout", branch_name];
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "branch": branch_name
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Creates a branch.
+    pub async fn create_branch<P: AsRef<Path>>(
+        path: P,
+        branch_name: &str,
+        start_point: Option<&str>,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mut args = vec!["checkout", "-b", branch_name];
+        let effective_start_point = start_point.filter(|s| !s.trim().is_empty());
+        if let Some(start) = effective_start_point {
+            args.push(start);
+        }
+
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "branch": branch_name,
+                "start_point": effective_start_point
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Deletes a branch.
+    pub async fn delete_branch<P: AsRef<Path>>(
+        path: P,
+        branch_name: &str,
+        force: bool,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let flag = if force { "-D" } else { "-d" };
+        let args = vec!["branch", flag, branch_name];
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "branch": branch_name,
+                "force": force
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Resets to a specific commit.
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    /// - `commit_hash`: Target commit hash
+    /// - `mode`: Reset mode (`soft`, `mixed`, `hard`)
+    pub async fn reset_to_commit<P: AsRef<Path>>(
+        path: P,
+        commit_hash: &str,
+        mode: &str,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mode_flag = match mode {
+            "soft" => "--soft",
+            "mixed" => "--mixed",
+            "hard" => "--hard",
+            _ => {
+                return Err(GitError::CommandFailed(format!(
+                    "Invalid reset mode: {}",
+                    mode
+                )))
+            }
+        };
+
+        let args = vec!["reset", mode_flag, commit_hash];
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "commit": commit_hash,
+                "mode": mode
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Gets the diff.
+    pub async fn get_diff<P: AsRef<Path>>(
+        path: P,
+        params: &GitDiffParams,
+    ) -> Result<String, GitError> {
+        let repo_path = path.as_ref().to_string_lossy();
+        let args = build_git_diff_args(params);
+        let arg_refs: Vec<&str> = args.iter().map(String::as_str).collect();
+
+        execute_git_command(&repo_path, &arg_refs).await
+    }
+
+    /// Gets changed files using `git diff --name-status`.
+    pub async fn get_changed_files<P: AsRef<Path>>(
+        path: P,
+        params: &GitChangedFilesParams,
+    ) -> Result<Vec<GitChangedFile>, GitError> {
+        let repo_path = path.as_ref().to_string_lossy();
+        let args = build_git_changed_files_args(params);
+        let arg_refs: Vec<&str> = args.iter().map(String::as_str).collect();
+
+        let output = execute_git_command(&repo_path, &arg_refs).await?;
+        Ok(parse_name_status_output(&output))
+    }
+
+    /// Gets file content.
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    /// - `file_path`: File relative path
+    /// - `commit`: Commit reference (optional, defaults to `HEAD`)
+    ///
+    /// # Returns
+    /// - File content string
+    pub async fn get_file_content<P: AsRef<Path>>(
+        path: P,
+        file_path: &str,
+        commit: Option<&str>,
+    ) -> Result<String, GitError> {
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let commit_ref = commit.unwrap_or("HEAD");
+        let object_spec = format!("{}:{}", commit_ref, file_path);
+
+        let args = vec!["show", &object_spec];
+
+        execute_git_command(&repo_path, &args).await
+    }
+
+    /// Resets file changes (discarding working tree changes).
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    /// - `files`: List of file paths
+    /// - `staged`: Whether to reset the index (`true`: reset staged, `false`: restore worktree)
+    ///
+    /// # Returns
+    /// - Operation result
+    pub async fn reset_files<P: AsRef<Path>>(
+        path: P,
+        files: &[String],
+        staged: bool,
+    ) -> Result<String, GitError> {
+        let repo_path = path.as_ref().to_string_lossy();
+
+        if staged {
+            let mut args = vec!["restore", "--staged"];
+            for file in files {
+                args.push(file);
+            }
+            execute_git_command(&repo_path, &args).await
+        } else {
+            let mut args = vec!["restore"];
+            for file in files {
+                args.push(file);
+            }
+            execute_git_command(&repo_path, &args).await
+        }
+    }
+
+    /// Gets ahead/behind counts.
+    fn get_ahead_behind_count(
+        repo: &Repository,
+        branch_name: &str,
+    ) -> Result<(i32, i32), GitError> {
+        let local_branch = repo
+            .find_branch(branch_name, BranchType::Local)
+            .map_err(|e| GitError::BranchNotFound(e.to_string()))?;
+
+        if let Ok(upstream) = local_branch.upstream() {
+            let local_oid = local_branch.get().target().ok_or_else(|| {
+                GitError::CommandFailed("Failed to get local branch target".to_string())
+            })?;
+            let upstream_oid = upstream.get().target().ok_or_else(|| {
+                GitError::CommandFailed("Failed to get upstream branch target".to_string())
+            })?;
+
+            let (ahead, behind) = repo
+                .graph_ahead_behind(local_oid, upstream_oid)
+                .map_err(|e| GitError::CommandFailed(e.to_string()))?;
+
+            Ok((ahead as i32, behind as i32))
+        } else {
+            Ok((0, 0))
+        }
+    }
+
+    /// Gets commit statistics.
+    fn get_commit_stats(
+        _repo: &Repository,
+        _commit: &Commit,
+    ) -> Result<(Option<i32>, Option<i32>, Option<i32>), GitError> {
+        Ok((None, None, None))
+    }
+
+    /// Gets Git commit graph data.
+    pub async fn get_git_graph<P: AsRef<Path>>(
+        path: P,
+        max_count: Option<usize>,
+    ) -> Result<GitGraph, GitError> {
+        let repo =
+            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+        build_git_graph(&repo, max_count).map_err(|e| GitError::CommandFailed(e.to_string()))
+    }
+
+    /// Gets Git commit graph data for a specific branch.
+    pub async fn get_git_graph_for_branch<P: AsRef<Path>>(
+        path: P,
+        max_count: Option<usize>,
+        branch_name: Option<&str>,
+    ) -> Result<GitGraph, GitError> {
+        let repo =
+            Repository::open(&path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+        build_git_graph_for_branch(&repo, max_count, branch_name)
+            .map_err(|e| GitError::CommandFailed(e.to_string()))
+    }
+
+    /// Cherry-picks a commit onto the current branch.
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    /// - `commit_hash`: Commit hash to cherry-pick
+    /// - `no_commit`: Apply changes without committing automatically (default `false`)
+    ///
+    /// # Returns
+    /// - Operation result
+    pub async fn cherry_pick<P: AsRef<Path>>(
+        path: P,
+        commit_hash: &str,
+        no_commit: bool,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mut args = vec!["cherry-pick"];
+
+        if no_commit {
+            args.push("-n");
+        }
+
+        args.push(commit_hash);
+
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "commit": commit_hash,
+                "no_commit": no_commit
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Aborts the cherry-pick operation.
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    ///
+    /// # Returns
+    /// - Operation result
+    pub async fn cherry_pick_abort<P: AsRef<Path>>(
+        path: P,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let args = vec!["cherry-pick", "--abort"];
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: None,
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Continues the cherry-pick operation (after resolving conflicts).
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    ///
+    /// # Returns
+    /// - Operation result
+    pub async fn cherry_pick_continue<P: AsRef<Path>>(
+        path: P,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let args = vec!["cherry-pick", "--continue"];
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: None,
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+
+    /// Lists all worktrees.
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    ///
+    /// # Returns
+    /// - Worktree list
+    pub async fn list_worktrees<P: AsRef<Path>>(path: P) -> Result<Vec<GitWorktreeInfo>, GitError> {
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let args = vec!["worktree", "list", "--porcelain"];
+        let output = execute_git_command(&repo_path, &args).await?;
+
+        Ok(parse_worktree_list(&output))
+    }
+
+    /// Adds a new worktree.
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    /// - `branch`: Branch name
+    /// - `create_branch`: Whether to create a new branch
+    ///
+    /// # Returns
+    /// - Newly created worktree information
+    pub async fn add_worktree<P: AsRef<Path>>(
+        path: P,
+        branch: &str,
+        create_branch: bool,
+    ) -> Result<GitWorktreeInfo, GitError> {
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let worktree_dir = path.as_ref().join(".worktrees");
+        let worktree_path = worktree_dir.join(branch);
+        let worktree_path_str = worktree_path.to_string_lossy().to_string();
+
+        if !worktree_dir.exists() {
+            std::fs::create_dir_all(&worktree_dir).map_err(GitError::IoError)?;
+        }
+
+        let args = if create_branch {
+            vec!["worktree", "add", "-b", branch, &worktree_path_str]
+        } else {
+            vec!["worktree", "add", &worktree_path_str, branch]
+        };
+
+        execute_git_command(&repo_path, &args).await?;
+
+        let worktrees = Self::list_worktrees(&path).await?;
+
+        let normalized_expected = worktree_path_str.replace("\\", "/");
+
+        worktrees
+            .into_iter()
+            .find(|wt| wt.path == normalized_expected)
+            .ok_or_else(|| {
+                GitError::CommandFailed("Failed to find newly created worktree".to_string())
+            })
+    }
+
+    /// Removes a worktree.
+    ///
+    /// # Parameters
+    /// - `path`: Repository path
+    /// - `worktree_path`: Worktree path to remove
+    /// - `force`: Whether to force removal
+    ///
+    /// # Returns
+    /// - Operation result
+    pub async fn remove_worktree<P: AsRef<Path>>(
+        path: P,
+        worktree_path: &str,
+        force: bool,
+    ) -> Result<GitOperationResult, GitError> {
+        let start_time = Instant::now();
+        let repo_path = path.as_ref().to_string_lossy();
+
+        let mut args = vec!["worktree", "remove"];
+        if force {
+            args.push("--force");
+        }
+        args.push(worktree_path);
+
+        let output = execute_git_command(&repo_path, &args).await?;
+        let duration = elapsed_ms_u64(start_time);
+
+        Ok(GitOperationResult {
+            success: true,
+            data: Some(serde_json::json!({
+                "worktree_path": worktree_path,
+                "force": force
+            })),
+            error: None,
+            output: Some(output),
+            duration: Some(duration),
+        })
+    }
+}
diff --git a/src/crates/services-integrations/src/git/text.rs b/src/crates/services-integrations/src/git/text.rs
new file mode 100644
index 000000000..95643aa9a
--- /dev/null
+++ b/src/crates/services-integrations/src/git/text.rs
@@ -0,0 +1,31 @@
+/// Parses a Git log line formatted as `hash|author|email|date|message`.
+pub fn parse_git_log_line(line: &str) -> Option<(String, String, String, String, String)> {
+    let parts: Vec<&str> = line.splitn(5, '|').collect();
+    if parts.len() == 5 {
+        Some((
+            parts[0].to_string(),
+            parts[1].to_string(),
+            parts[2].to_string(),
+            parts[3].to_string(),
+            parts[4].to_string(),
+        ))
+    } else {
+        None
+    }
+}
+
+/// Parses a Git branch list line, preserving the current-branch marker.
+pub fn parse_branch_line(line: &str) -> Option<(String, bool)> {
+    let trimmed = line.trim();
+    if trimmed.is_empty() {
+        return None;
+    }
+
+    if let Some(stripped) = trimmed.strip_prefix("* ") {
+        Some((stripped.to_string(), true))
+    } else if let Some(stripped) = trimmed.strip_prefix(" ") {
+        Some((stripped.to_string(), false))
+    } else {
+        Some((trimmed.to_string(), false))
+    }
+}
diff --git a/src/crates/services-integrations/src/git/types.rs b/src/crates/services-integrations/src/git/types.rs
new file mode 100644
index 000000000..987cdf2c1
--- /dev/null
+++ b/src/crates/services-integrations/src/git/types.rs
@@ -0,0 +1,308 @@
+/**
+ * Git-related type definitions
+ */
+use serde::{Deserialize, Serialize};
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitRepository {
+    pub path: String,
+    pub name: String,
+    pub current_branch: String,
+    pub is_bare: bool,
+    pub has_changes: bool,
+    pub remotes: Vec<String>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitStatus {
+    pub staged: Vec<GitFileStatus>,
+    pub unstaged: Vec<GitFileStatus>,
+    pub untracked: Vec<GitFileStatus>,
+    pub current_branch: String,
+    pub ahead: i32,
+    pub behind: i32,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitFileStatus {
+    pub path: String,
+    pub status: String,
+    pub index_status: Option<String>,
+    pub workdir_status: Option<String>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitBranch {
+    pub name: String,
+    pub current: bool,
+    pub remote: bool,
+    pub upstream: Option<String>,
+    pub ahead: i32,
+    pub behind: i32,
+    pub last_commit: Option<String>,
+    pub last_commit_date: Option<String>,
+
+    pub base_branch: Option<String>,
+    pub child_branches: Option<Vec<String>>,
+    pub merged_branches: Option<Vec<String>>,
+
+    pub branch_type: Option<String>,
+    pub has_conflicts: Option<bool>,
+    pub can_merge: Option<bool>,
+    pub is_stale: Option<bool>,
+    pub merge_status: Option<String>,
+
+    pub stats: Option<GitBranchStats>,
+    pub created_at: Option<String>,
+    pub last_activity_at: Option<String>,
+
+    pub tags: Option<Vec<String>>,
+    pub description: Option<String>,
+    pub linked_issues: Option<Vec<String>>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitBranchStats {
+    pub commit_count: i32,
+    pub contributor_count: i32,
+    pub file_changes: i32,
+    pub lines_changed: GitLinesChanged,
+    pub activity_score: i32,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitLinesChanged {
+    pub additions: i32,
+    pub deletions: i32,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitCommit {
+    pub hash: String,
+    pub short_hash: String,
+    pub message: String,
+    pub author: String,
+    pub author_email: String,
+    pub date: String,
+    pub parents: Vec<String>,
+    pub additions: Option<i32>,
+    pub deletions: Option<i32>,
+    pub files_changed: Option<i32>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct GitLogParams {
+    pub max_count: Option<i32>,
+    pub skip: Option<i32>,
+    pub author: Option<String>,
+    pub grep: Option<String>,
+    pub since: Option<String>,
+    pub until: Option<String>,
+    pub stat: Option<bool>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitAddParams {
+    pub files: Vec<String>,
+    pub all: Option<bool>,
+    pub update: Option<bool>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitCommitParams {
+    pub message: String,
+    pub amend: Option<bool>,
+    pub all: Option<bool>,
+    #[serde(rename = "noVerify")]
+    pub no_verify: Option<bool>,
+    pub author: Option<GitAuthor>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitAuthor {
+    pub name: String,
+    pub email: String,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct GitPushParams {
+    pub remote: Option<String>,
+    pub branch: Option<String>,
+    pub force: Option<bool>,
+    pub set_upstream: Option<bool>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct GitPullParams {
+    pub remote: Option<String>,
+    pub branch: Option<String>,
+    pub rebase: Option<bool>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitMergeParams {
+    pub branch: String,
+    pub strategy: Option<String>,
+    pub message: Option<String>,
+    pub no_ff: Option<bool>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitStashParams {
+    pub message: Option<String>,
+    pub include_untracked: Option<bool>,
+    pub keep_index: Option<bool>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct GitDiffParams {
+    pub source: Option<String>,
+    pub target: Option<String>,
+    pub files: Option<Vec<String>>,
+    pub staged: Option<bool>,
+    pub stat: Option<bool>,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct GitChangedFilesParams {
+    pub source: Option<String>,
+    pub target: Option<String>,
+    pub staged: Option<bool>,
+}
+
+#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(rename_all = "snake_case")]
+pub enum GitChangedFileStatus {
+    Added,
+    Modified,
+    Deleted,
+    Renamed,
+    Copied,
+    Unknown,
+}
+
+#[derive(Debug, Clone, PartialEq, Eq, Serialize, Deserialize)]
+pub struct GitChangedFile {
+    pub path: String,
+    pub old_path: Option<String>,
+    pub status: GitChangedFileStatus,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitOperationResult {
+    pub success: bool,
+    pub data: Option<serde_json::Value>,
+    pub error: Option<String>,
+    pub output: Option<String>,
+    pub duration: Option<u64>,
+}
+
+/// Raw result of executing a git command, preserving exit code and both streams.
+#[derive(Debug, Clone)]
+pub struct GitCommandOutput {
+    pub stdout: String,
+    pub stderr: String,
+    pub exit_code: i32,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitDiffResult {
+    pub files: Vec<GitDiffFile>,
+    pub total_additions: i32,
+    pub total_deletions: i32,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitDiffFile {
+    pub path: String,
+    pub old_path: Option<String>,
+    pub status: String,
+    pub additions: i32,
+    pub deletions: i32,
+    pub diff: String,
+}
+
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct GitStash {
+    pub index: i32,
+    pub message: String,
+    pub branch: String,
+    pub date: String,
+    pub hash: String,
+}
+
+/// Git worktree information
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct GitWorktreeInfo {
+    /// Worktree path
+    pub path: String,
+    /// Associated branch name
+    pub branch: Option<String>,
+    /// HEAD commit hash
+    pub head: String,
+    /// Whether this is the main worktree (the main directory of a bare repository)
+    pub is_main: bool,
+    /// Whether the worktree is locked
+    pub is_locked: bool,
+    /// Whether the worktree is prunable
+    pub is_prunable: bool,
+}
+
+/// Git graph node.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct GraphNode {
+    /// Commit hash.
+    pub hash: String,
+    /// Commit message (first line).
+    pub message: String,
+    /// Full commit message.
+    pub full_message: String,
+    /// Author name.
+    pub author_name: String,
+    /// Author email.
+    pub author_email: String,
+    /// Commit time (Unix timestamp).
+    pub timestamp: i64,
+    /// Parent commit hashes.
+    pub parents: Vec<String>,
+    /// Child commit hashes (filled when building the graph).
+    pub children: Vec<String>,
+    /// Associated refs (branches, tags, etc.).
+    pub refs: Vec<GraphRef>,
+    /// Lane position.
+    pub lane: i32,
+    /// Lanes that fork out.
+    pub forking_lanes: Vec<i32>,
+    /// Lanes that merge in.
+    pub merging_lanes: Vec<i32>,
+    /// Lanes that pass through.
+    pub passing_lanes: Vec<i32>,
+}
+
+/// Git ref information.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct GraphRef {
+    /// Ref name.
+    pub name: String,
+    /// Ref type: `branch`, `remote`, `tag`.
+    pub ref_type: String,
+    /// Whether this is the current branch.
+    pub is_current: bool,
+    /// Whether this is `HEAD`.
+    pub is_head: bool,
+}
+
+/// Git graph data.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct GitGraph {
+    /// Node list.
+    pub nodes: Vec<GraphNode>,
+    /// Maximum lane count.
+    pub max_lane: i32,
+    /// Current branch name.
+    pub current_branch: Option<String>,
+}
diff --git a/src/crates/services-integrations/src/git/utils.rs b/src/crates/services-integrations/src/git/utils.rs
new file mode 100644
index 000000000..1411507c6
--- /dev/null
+++ b/src/crates/services-integrations/src/git/utils.rs
@@ -0,0 +1,294 @@
+pub use super::{
+    build_git_changed_files_args, build_git_diff_args, parse_branch_line, parse_git_log_line,
+};
+/**
+ * Git utility functions
+ */
+use super::{GitCommandOutput, GitError, GitFileStatus};
+use bitfun_services_core::process_manager;
+use git2::{Repository, Status, StatusOptions};
+use std::path::Path;
+
+/// Returns whether the given path is a Git repository.
+pub fn is_git_repository<P: AsRef<Path>>(path: P) -> bool {
+    Repository::open(path).is_ok()
+}
+
+/// Returns the repository root directory.
+pub fn get_repository_root<P: AsRef<Path>>(path: P) -> Result<String, GitError> {
+    let repo =
+        Repository::discover(path).map_err(|e| GitError::RepositoryNotFound(e.to_string()))?;
+
+    let workdir = repo
+        .workdir()
+        .ok_or_else(|| GitError::InvalidPath("Repository has no working directory".to_string()))?;
+
+    Ok(workdir.to_string_lossy().to_string())
+}
+
+/// Returns the current branch name.
+pub fn get_current_branch(repo: &Repository) -> Result<String, GitError> {
+    match repo.head() {
+        Ok(head) => {
+            if let Some(branch_name) = head.shorthand() {
+                Ok(branch_name.to_string())
+            } else {
+                Ok("HEAD".to_string())
+            }
+        }
+        Err(e) => {
+            if e.code() == git2::ErrorCode::UnbornBranch {
+                if let Ok(config) = repo.config() {
+                    if let Ok(default_branch) = config.get_string("init.defaultBranch") {
+                        return Ok(default_branch);
+                    }
+                }
+                Ok("master".to_string())
+            } else {
+                Err(GitError::CommandFailed(format!(
+                    "Failed to get HEAD: {}",
+                    e
+                )))
+            }
+        }
+    }
+}
+
+/// Converts Git status flags to a short string.
+pub fn status_to_string(status: Status) -> String {
+    let mut result = Vec::new();
+
+    if status.contains(Status::INDEX_NEW) {
+        result.push("A");
+    }
+    if status.contains(Status::INDEX_MODIFIED) {
+        result.push("M");
+    }
+    if status.contains(Status::INDEX_DELETED) {
+        result.push("D");
+    }
+    if status.contains(Status::INDEX_RENAMED) {
+        result.push("R");
+    }
+    if status.contains(Status::INDEX_TYPECHANGE) {
+        result.push("T");
+    }
+
+    if status.contains(Status::WT_NEW) {
+        result.push("?");
+    }
+    if status.contains(Status::WT_MODIFIED) {
+        result.push("M");
+    }
+    if status.contains(Status::WT_DELETED) {
+        result.push("D");
+    }
+    if status.contains(Status::WT_RENAMED) {
+        result.push("R");
+    }
+    if status.contains(Status::WT_TYPECHANGE) {
+        result.push("T");
+    }
+
+    if result.is_empty() {
+        "U".to_string()
+    } else {
+        result.join("")
+    }
+}
+
+/// Maximum number of untracked entries before we stop recursing into untracked
+/// directories. When the non-recursive scan already reports many untracked
+/// top-level entries, recursing would return thousands of paths that bloat IPC
+/// payloads and DOM rendering, causing severe UI lag.
+const UNTRACKED_RECURSE_THRESHOLD: usize = 200;
+
+/// Collects file statuses from a `StatusOptions` scan.
+fn collect_statuses(
+    repo: &Repository,
+    recurse_untracked: bool,
+) -> Result<Vec<GitFileStatus>, GitError> {
+    let mut status_options = StatusOptions::new();
+    status_options.include_untracked(true);
+    status_options.include_ignored(false);
+    status_options.recurse_untracked_dirs(recurse_untracked);
+
+    let statuses = repo
+        .statuses(Some(&mut status_options))
+        .map_err(|e| GitError::CommandFailed(format!("Failed to get statuses: {}", e)))?;
+
+    let mut result = Vec::new();
+
+    for entry in statuses.iter() {
+        if let Some(path) = entry.path() {
+            let status = entry.status();
+            let status_str = status_to_string(status);
+
+            let index_status = if status.intersects(
+                Status::INDEX_NEW
+                    | Status::INDEX_MODIFIED
+                    | Status::INDEX_DELETED
+                    | Status::INDEX_RENAMED
+                    | Status::INDEX_TYPECHANGE,
+            ) {
+                Some(status_to_string(
+                    status
+                        & (Status::INDEX_NEW
+                            | Status::INDEX_MODIFIED
+                            | Status::INDEX_DELETED
+                            | Status::INDEX_RENAMED
+                            | Status::INDEX_TYPECHANGE),
+                ))
+            } else {
+                None
+            };
+
+            let workdir_status = if status.intersects(
+                Status::WT_NEW
+                    | Status::WT_MODIFIED
+                    | Status::WT_DELETED
+                    | Status::WT_RENAMED
+                    | Status::WT_TYPECHANGE,
+            ) {
+                Some(status_to_string(
+                    status
+                        & (Status::WT_NEW
+                            | Status::WT_MODIFIED
+                            | Status::WT_DELETED
+                            | Status::WT_RENAMED
+                            | Status::WT_TYPECHANGE),
+                ))
+            } else {
+                None
+            };
+
+            result.push(GitFileStatus {
+                path: path.to_string(),
+                status: status_str,
+                index_status,
+                workdir_status,
+            });
+        }
+    }
+
+    Ok(result)
+}
+
+/// Returns file statuses.
+///
+/// Uses a two-pass strategy to avoid expensive recursive scans when the
+/// repository contains many untracked files (e.g. missing .gitignore for
+/// build artifacts). First a non-recursive pass counts top-level untracked
+/// entries; only when that count is within `UNTRACKED_RECURSE_THRESHOLD` does
+/// a second recursive pass run.
+pub fn get_file_statuses(repo: &Repository) -> Result<Vec<GitFileStatus>, GitError> {
+    // Pass 1: fast non-recursive scan.
+    let shallow = collect_statuses(repo, false)?;
+
+    let untracked_count = shallow.iter().filter(|f| f.status.contains('?')).count();
+
+    if untracked_count <= UNTRACKED_RECURSE_THRESHOLD {
+        // Few untracked entries – safe to recurse for full detail.
+        collect_statuses(repo, true)
+    } else {
+        // Too many untracked entries – return the shallow result as-is.
+        // Untracked directories appear as a single entry (folder name with
+        // trailing slash) which is sufficient for the UI.
+        Ok(shallow)
+    }
+}
+
+/// Executes a Git command and returns the raw output including exit code.
+///
+/// Git diff returns exit code 1 when there are differences (not an error).
+/// Callers that need to distinguish this case should inspect `exit_code`.
+pub async fn execute_git_command_raw(
+    repo_path: &str,
+    args: &[&str],
+) -> Result<GitCommandOutput, GitError> {
+    let output = process_manager::create_tokio_command("git")
+        .current_dir(repo_path)
+        .args(args)
+        .output()
+        .await
+        .map_err(|e| GitError::CommandFailed(format!("Failed to execute git command: {}", e)))?;
+
+    Ok(GitCommandOutput {
+        stdout: String::from_utf8_lossy(&output.stdout).to_string(),
+        stderr: String::from_utf8_lossy(&output.stderr).to_string(),
+        exit_code: output.status.code().unwrap_or(-1),
+    })
+}
+
+/// Executes a Git command.
+///
+/// For most git commands, exit code 0 means success and anything else is an error.
+/// However, `git diff` returns exit code 1 when there are differences, which is
+/// not an error. Use [`execute_git_command_raw`] if you need to handle that case.
+pub async fn execute_git_command(repo_path: &str, args: &[&str]) -> Result<String, GitError> {
+    let result = execute_git_command_raw(repo_path, args).await?;
+
+    if result.exit_code == 0 {
+        Ok(result.stdout)
+    } else {
+        let error = if result.stderr.is_empty() {
+            result.stdout
+        } else {
+            result.stderr
+        };
+        Err(GitError::CommandFailed(error))
+    }
+}
+
+/// Executes a Git command synchronously and returns the raw output including exit code.
+pub fn execute_git_command_sync_raw(
+    repo_path: &str,
+    args: &[&str],
+) -> Result<GitCommandOutput, GitError> {
+    let output = process_manager::create_command("git")
+        .current_dir(repo_path)
+        .args(args)
+        .output()
+        .map_err(|e| GitError::CommandFailed(format!("Failed to execute git command: {}", e)))?;
+
+    Ok(GitCommandOutput {
+        stdout: String::from_utf8_lossy(&output.stdout).to_string(),
+        stderr: String::from_utf8_lossy(&output.stderr).to_string(),
+        exit_code: output.status.code().unwrap_or(-1),
+    })
+}
+
+/// Executes a Git command synchronously.
+pub fn execute_git_command_sync(repo_path: &str, args: &[&str]) -> Result<String, GitError> {
+    let result = execute_git_command_sync_raw(repo_path, args)?;
+
+    if result.exit_code == 0 {
+        Ok(result.stdout)
+    } else {
+        let error = if result.stderr.is_empty() {
+            result.stdout
+        } else {
+            result.stderr
+        };
+        Err(GitError::CommandFailed(error))
+    }
+}
+
+/// Formats a timestamp.
+pub fn format_timestamp(timestamp: i64) -> String {
+    use chrono::{TimeZone, Utc};
+
+    match Utc.timestamp_opt(timestamp, 0) {
+        chrono::LocalResult::Single(dt) => dt.format("%Y-%m-%d %H:%M:%S UTC").to_string(),
+        _ => "Invalid date".to_string(),
+    }
+}
+
+/// Checks whether Git is available.
+pub fn check_git_available() -> bool {
+    process_manager::create_command("git")
+        .arg("--version")
+        .output()
+        .map(|output| output.status.success())
+        .unwrap_or(false)
+}
diff --git a/src/crates/services-integrations/src/git/worktree.rs b/src/crates/services-integrations/src/git/worktree.rs
new file mode 100644
index 000000000..56b13601d
--- /dev/null
+++ b/src/crates/services-integrations/src/git/worktree.rs
@@ -0,0 +1,53 @@
+use super::types::GitWorktreeInfo;
+
+/// Parses `git worktree list --porcelain` output.
+pub fn parse_worktree_list(output: &str) -> Vec<GitWorktreeInfo> {
+    let mut worktrees = Vec::new();
+    let mut current_worktree: Option<GitWorktreeInfo> = None;
+
+    for line in output.lines() {
+        if line.starts_with("worktree ") {
+            if let Some(wt) = current_worktree.take() {
+                worktrees.push(wt);
+            }
+            let path = line.strip_prefix("worktree ").unwrap_or("").to_string();
+            current_worktree = Some(GitWorktreeInfo {
+                path,
+                branch: None,
+                head: String::new(),
+                is_main: false,
+                is_locked: false,
+                is_prunable: false,
+            });
+        } else if let Some(ref mut wt) = current_worktree {
+            if line.starts_with("HEAD ") {
+                wt.head = line.strip_prefix("HEAD ").unwrap_or("").to_string();
+            } else if line.starts_with("branch ") {
+                let branch_ref = line.strip_prefix("branch ").unwrap_or("");
+                let branch_name = branch_ref
+                    .strip_prefix("refs/heads/")
+                    .unwrap_or(branch_ref)
+                    .to_string();
+                wt.branch = Some(branch_name);
+            } else if line == "bare" {
+                wt.is_main = true;
+            } else if line == "locked" {
+                wt.is_locked = true;
+            } else if line == "prunable" {
+                wt.is_prunable = true;
+            }
+        }
+    }
+
+    if let Some(wt) = current_worktree {
+        worktrees.push(wt);
+    }
+
+    if let Some(first) = worktrees.first_mut() {
+        if !first.is_main {
+            first.is_main = true;
+        }
+    }
+
+    worktrees
+}
diff --git a/src/crates/services-integrations/src/lib.rs b/src/crates/services-integrations/src/lib.rs
index 707c6d48a..23b6598f9 100644
--- a/src/crates/services-integrations/src/lib.rs
+++ b/src/crates/services-integrations/src/lib.rs
@@ -3,5 +3,21 @@
 //! Heavy external integrations live here behind feature groups so local checks
 //! can opt into only the integration family they need.
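The `worktree list --porcelain` format that `parse_worktree_list` consumes is line-oriented: each entry starts with a `worktree <path>` line, followed by attribute lines such as `HEAD <sha>` and `branch <ref>`, with entries separated by blank lines. A minimal standalone sketch of that parsing shape (the `Worktree` struct and the sample output are illustrative stand-ins, not the crate's actual `GitWorktreeInfo`):

```rust
// Trimmed-down stand-in for `GitWorktreeInfo`; illustrative only.
#[derive(Debug, Default, PartialEq)]
struct Worktree {
    path: String,
    head: String,
    branch: Option<String>,
}

// Same accumulate-then-flush loop as the parser in the patch, minus the
// bare/locked/prunable flags.
fn parse(output: &str) -> Vec<Worktree> {
    let mut worktrees = Vec::new();
    let mut current: Option<Worktree> = None;
    for line in output.lines() {
        if let Some(path) = line.strip_prefix("worktree ") {
            if let Some(wt) = current.take() {
                worktrees.push(wt);
            }
            current = Some(Worktree { path: path.to_string(), ..Default::default() });
        } else if let Some(wt) = current.as_mut() {
            if let Some(head) = line.strip_prefix("HEAD ") {
                wt.head = head.to_string();
            } else if let Some(branch) = line.strip_prefix("branch ") {
                // Strip the refs/heads/ prefix to get the short branch name.
                wt.branch =
                    Some(branch.strip_prefix("refs/heads/").unwrap_or(branch).to_string());
            }
        }
    }
    if let Some(wt) = current.take() {
        worktrees.push(wt);
    }
    worktrees
}

fn main() {
    // Hypothetical porcelain output: main worktree first, then a linked one.
    let sample = "worktree /repo\nHEAD abc123\nbranch refs/heads/main\n\n\
                  worktree /repo-wt\nHEAD def456\nbranch refs/heads/feature\n";
    let wts = parse(sample);
    assert_eq!(wts.len(), 2);
    assert_eq!(wts[0].branch.as_deref(), Some("main"));
    assert_eq!(wts[1].path, "/repo-wt");
}
```

The blank separator lines need no special handling: they match no prefix, so the loop simply skips them.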
+#[cfg(feature = "announcement")]
+pub mod announcement;
+
 #[cfg(feature = "file-watch")]
 pub mod file_watch;
+
+#[cfg(feature = "git")]
+pub mod git;
+
+#[cfg(feature = "mcp")]
+pub mod mcp;
+
+#[cfg(feature = "remote-ssh")]
+pub mod remote_ssh;
+
+#[cfg(all(windows, feature = "git"))]
+#[link(name = "advapi32")]
+unsafe extern "system" {}
diff --git a/src/crates/services-integrations/src/mcp/config/location.rs b/src/crates/services-integrations/src/mcp/config/location.rs
new file mode 100644
index 000000000..79bf91d65
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/config/location.rs
@@ -0,0 +1,10 @@
+use serde::{Deserialize, Serialize};
+
+/// Configuration location.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
+#[serde(rename_all = "kebab-case")]
+pub enum ConfigLocation {
+    BuiltIn, // Built-in configuration
+    User,    // User-level configuration
+    Project, // Project-level configuration
+}
diff --git a/src/crates/services-integrations/src/mcp/config/mod.rs b/src/crates/services-integrations/src/mcp/config/mod.rs
new file mode 100644
index 000000000..486b795d1
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/config/mod.rs
@@ -0,0 +1,5 @@
+//! MCP configuration data contracts.
+
+mod location;
+
+pub use location::ConfigLocation;
diff --git a/src/crates/services-integrations/src/mcp/mod.rs b/src/crates/services-integrations/src/mcp/mod.rs
new file mode 100644
index 000000000..16524e61e
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/mod.rs
@@ -0,0 +1,19 @@
+//! MCP service contracts.
+//!
+//! `bitfun-core::service::mcp` remains as the compatibility facade for the
+//! legacy public path.
+
+mod tool_info;
+mod tool_name;
+
+pub mod config;
+pub mod protocol;
+pub mod server;
+
+pub use config::*;
+pub use protocol::*;
+pub use server::*;
+pub use tool_info::McpToolInfo;
+pub use tool_name::{
+    MCP_TOOL_DELIMITER, MCP_TOOL_PREFIX, build_mcp_tool_name, normalize_name_for_mcp,
+};
diff --git a/src/crates/services-integrations/src/mcp/protocol/mod.rs b/src/crates/services-integrations/src/mcp/protocol/mod.rs
new file mode 100644
index 000000000..8aca0d878
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/protocol/mod.rs
@@ -0,0 +1,5 @@
+//! MCP protocol data contracts.
+
+pub mod types;
+
+pub use types::*;
diff --git a/src/crates/services-integrations/src/mcp/protocol/types.rs b/src/crates/services-integrations/src/mcp/protocol/types.rs
new file mode 100644
index 000000000..9ec34d125
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/protocol/types.rs
@@ -0,0 +1,710 @@
+//! MCP protocol type definitions
+//!
+//! Core data structures that follow the Model Context Protocol specification.
+
+use serde::{Deserialize, Serialize};
+use serde_json::Value;
+use std::collections::HashMap;
+
+/// MCP protocol version (string format, follows the MCP spec).
+///
+/// Aligned with VSCode: "2025-11-25"
+/// Reference: https://spec.modelcontextprotocol.io/
+pub type MCPProtocolVersion = String;
+
+/// Returns the default MCP protocol version.
+pub fn default_protocol_version() -> MCPProtocolVersion {
+    "2025-11-25".to_string()
+}
+
+/// MCP resources capability.
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct ResourcesCapability {
+    #[serde(default)]
+    pub subscribe: bool,
+    #[serde(default)]
+    pub list_changed: bool,
+}
+
+/// MCP prompts capability.
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct PromptsCapability {
+    #[serde(default)]
+    pub list_changed: bool,
+}
+
+/// MCP tools capability.
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct ToolsCapability {
+    #[serde(default)]
+    pub list_changed: bool,
+}
+
+/// MCP capability declaration (follows the latest MCP spec).
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPCapability {
+    #[serde(default, skip_serializing_if = "Option::is_none")]
+    pub resources: Option<ResourcesCapability>,
+    #[serde(default, skip_serializing_if = "Option::is_none")]
+    pub prompts: Option<PromptsCapability>,
+    #[serde(default, skip_serializing_if = "Option::is_none")]
+    pub tools: Option<ToolsCapability>,
+    #[serde(default, skip_serializing_if = "Option::is_none")]
+    pub logging: Option<Value>,
+}
+
+impl Default for MCPCapability {
+    fn default() -> Self {
+        Self {
+            resources: Some(ResourcesCapability::default()),
+            prompts: Some(PromptsCapability::default()),
+            tools: Some(ToolsCapability::default()),
+            logging: None,
+        }
+    }
+}
+
+/// MCP server info.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPServerInfo {
+    pub name: String,
+    pub version: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub description: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub vendor: Option<String>,
+}
+
+/// Icon for display in UIs (2025-11-25 spec). sizes may be string or string[] for compatibility.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPResourceIcon {
+    pub src: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub mime_type: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub sizes: Option<Value>, // string or ["48x48"] per spec
+}
+
+/// Annotations for resources/templates (2025-11-25 spec).
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPAnnotations {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub audience: Option<Vec<String>>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub priority: Option<f64>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub last_modified: Option<String>,
+}
+
+/// MCP resource definition (2025-11-25 spec).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPResource {
+    pub uri: String,
+    pub name: String,
+    /// Human-readable title for display (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub title: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub description: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub mime_type: Option<String>,
+    /// Icons for UI display (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub icons: Option<Vec<MCPResourceIcon>>,
+    /// Size in bytes, if known (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub size: Option<u64>,
+    /// Annotations: audience, priority, lastModified (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub annotations: Option<MCPAnnotations>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub metadata: Option<HashMap<String, Value>>,
+}
+
+/// Content Security Policy configuration for MCP App UI (aligned with VSCode/MCP Apps spec).
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct McpUiResourceCsp {
+    /// Origins for network requests (fetch/XHR/WebSocket).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub connect_domains: Option<Vec<String>>,
+    /// Origins for static resources (scripts, images, styles, fonts).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub resource_domains: Option<Vec<String>>,
+    /// Origins for nested iframes (frame-src directive).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub frame_domains: Option<Vec<String>>,
+    /// Allowed base URIs for the document (base-uri directive).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub base_uri_domains: Option<Vec<String>>,
+}
+
+/// Sandbox permissions requested by the UI resource (aligned with VSCode/MCP Apps spec).
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct McpUiResourcePermissions {
+    /// Request camera access.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub camera: Option<bool>,
+    /// Request microphone access.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub microphone: Option<bool>,
+    /// Request geolocation access.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub geolocation: Option<bool>,
+    /// Request clipboard write access.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub clipboard_write: Option<bool>,
+}
+
+/// UI metadata within _meta (MCP Apps spec: _meta.ui.csp, _meta.ui.permissions).
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct McpUiMeta {
+    /// Content Security Policy configuration.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub csp: Option<McpUiResourceCsp>,
+    /// Sandbox permissions.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub permissions: Option<McpUiResourcePermissions>,
+}
+
+/// Resource content _meta field (MCP Apps spec).
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPResourceContentMeta {
+    /// UI metadata containing CSP and permissions.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub ui: Option<McpUiMeta>,
+}
+
+/// MCP resource content.
+/// MCP spec uses `text` for text content and `blob` for base64 binary; both are optional but at least one must be present.
+/// Serialization uses `text` per spec; we accept both `text` and `content` when deserializing for compatibility.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPResourceContent {
+    pub uri: String,
+    /// Text or HTML content. Serialized as `text` per MCP spec; accepts `text` or `content` when deserializing.
+    #[serde(
+        default,
+        alias = "content",
+        rename = "text",
+        skip_serializing_if = "Option::is_none"
+    )]
+    pub content: Option<String>,
+    /// Base64-encoded binary content (MCP spec). Used for video, images, etc.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub blob: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub mime_type: Option<String>,
+    /// Annotations for embedded resources (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub annotations: Option<MCPAnnotations>,
+    /// Resource metadata (MCP Apps: contains ui.csp and ui.permissions).
+    #[serde(skip_serializing_if = "Option::is_none", rename = "_meta")]
+    pub meta: Option<MCPResourceContentMeta>,
+}
+
+/// MCP prompt definition (2025-11-25 spec).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPPrompt {
+    pub name: String,
+    /// Human-readable title for display (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub title: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub description: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub arguments: Option<Vec<MCPPromptArgument>>,
+    /// Icons for UI display (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub icons: Option<Vec<MCPResourceIcon>>,
+}
+
+/// MCP prompt argument.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPPromptArgument {
+    pub name: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub title: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub description: Option<String>,
+    #[serde(default)]
+    pub required: bool,
+}
+
+/// MCP prompt content.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPPromptContent {
+    pub name: String,
+    pub messages: Vec<MCPPromptMessage>,
+}
+
+/// Content block in prompt message (2025-11-25 spec). Deserializes from plain string (legacy) or structured block.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(untagged)]
+pub enum MCPPromptMessageContent {
+    /// Legacy: plain string content from older servers.
+    Plain(String),
+    /// Structured content block.
+    Block(Box<MCPPromptMessageContentBlock>),
+}
+
+/// Structured content block types for prompt messages.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase", tag = "type")]
+pub enum MCPPromptMessageContentBlock {
+    #[serde(rename = "text")]
+    Text { text: String },
+    #[serde(rename = "image")]
+    Image { data: String, mime_type: String },
+    #[serde(rename = "audio")]
+    Audio { data: String, mime_type: String },
+    #[serde(rename = "resource_link")]
+    ResourceLink {
+        uri: String,
+        #[serde(skip_serializing_if = "Option::is_none")]
+        name: Option<String>,
+        #[serde(skip_serializing_if = "Option::is_none")]
+        description: Option<String>,
+        #[serde(skip_serializing_if = "Option::is_none")]
+        mime_type: Option<String>,
+    },
+    #[serde(rename = "resource")]
+    Resource { resource: Box<MCPResourceContent> },
+}
+
+impl MCPPromptMessageContent {
+    /// Extracts displayable text. For non-text types returns a placeholder.
+    pub fn text_or_placeholder(&self) -> String {
+        match self {
+            MCPPromptMessageContent::Plain(s) => s.clone(),
+            MCPPromptMessageContent::Block(block) => match block.as_ref() {
+                MCPPromptMessageContentBlock::Text { text } => text.clone(),
+                MCPPromptMessageContentBlock::Image { mime_type, .. } => {
+                    format!("[Image: {}]", mime_type)
+                }
+                MCPPromptMessageContentBlock::Audio { mime_type, .. } => {
+                    format!("[Audio: {}]", mime_type)
+                }
+                MCPPromptMessageContentBlock::ResourceLink { uri, name, ..
+                } => {
+                    name.as_ref().map_or_else(
+                        || format!("[Resource Link: {}]", uri),
+                        |n| format!("[Resource Link: {} ({})]", n, uri),
+                    )
+                }
+                MCPPromptMessageContentBlock::Resource { resource } => {
+                    format!("[Resource: {}]", resource.uri)
+                }
+            },
+        }
+    }
+
+    /// Substitutes placeholders like {{key}} with values. Only applies to text content.
+    pub fn substitute_placeholders(&mut self, arguments: &HashMap<String, String>) {
+        match self {
+            MCPPromptMessageContent::Plain(s) => {
+                for (key, value) in arguments {
+                    let placeholder = format!("{{{{{}}}}}", key);
+                    *s = s.replace(&placeholder, value);
+                }
+            }
+            MCPPromptMessageContent::Block(block) => {
+                if let MCPPromptMessageContentBlock::Text { text } = block.as_mut() {
+                    for (key, value) in arguments {
+                        let placeholder = format!("{{{{{}}}}}", key);
+                        *text = text.replace(&placeholder, value);
+                    }
+                }
+            }
+        }
+    }
+}
+
+/// MCP prompt message (2025-11-25 spec).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPPromptMessage {
+    pub role: String,
+    pub content: MCPPromptMessageContent,
+}
+
+/// MCP Apps UI metadata (tool declares interactive UI via _meta.ui.resourceUri).
+/// resourceUri is optional: some tools use _meta.ui only for visibility/csp/permissions.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPToolUIMeta {
+    /// URI pointing to UI resource, e.g. "ui://my-server/widget". Optional per MCP Apps spec.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub resource_uri: Option<String>,
+}
+
+/// MCP tool metadata (MCP Apps extension).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPToolMeta {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub ui: Option<MCPToolUIMeta>,
+}
+
+/// Tool annotations (2025-11-25 spec). Clients MUST treat as untrusted unless from trusted servers.
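The `{{key}}` placeholder substitution used by `substitute_placeholders` is a plain string replacement: each argument key is wrapped in double braces and replaced wherever it occurs in text content. A self-contained sketch of just that step (the free function `substitute` is illustrative; the patch implements this in-place on the content enum):

```rust
use std::collections::HashMap;

// Replace every `{{key}}` occurrence in `text` with the matching argument
// value; keys without a placeholder are simply ignored.
fn substitute(text: &str, arguments: &HashMap<String, String>) -> String {
    let mut out = text.to_string();
    for (key, value) in arguments {
        // format! braces are escaped by doubling, so this yields "{{key}}".
        let placeholder = format!("{{{{{}}}}}", key);
        out = out.replace(&placeholder, value);
    }
    out
}

fn main() {
    let mut args = HashMap::new();
    args.insert("name".to_string(), "bitfun".to_string());
    assert_eq!(substitute("Hello {{name}}!", &args), "Hello bitfun!");
}
```

Because the replacement is literal, a placeholder with no matching argument is left untouched, which mirrors the patch's behavior.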
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPToolAnnotations {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub title: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub read_only_hint: Option<bool>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub destructive_hint: Option<bool>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub idempotent_hint: Option<bool>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub open_world_hint: Option<bool>,
+}
+
+/// MCP tool definition (2025-11-25 spec).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPTool {
+    pub name: String,
+    /// Human-readable title for display (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub title: Option<String>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub description: Option<String>,
+    pub input_schema: Value,
+    /// Optional output schema for structured results (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub output_schema: Option<Value>,
+    /// Icons for UI display (2025-11-25).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub icons: Option<Vec<MCPResourceIcon>>,
+    /// Tool behavior hints (2025-11-25). Treat as untrusted.
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub annotations: Option<MCPToolAnnotations>,
+    /// MCP Apps extension: tool metadata including UI resource URI.
+    #[serde(skip_serializing_if = "Option::is_none", rename = "_meta")]
+    pub meta: Option<MCPToolMeta>,
+}
+
+/// MCP tool call result.
+/// MCP Apps extension: `structuredContent` is UI-optimized data (not for model context).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct MCPToolResult {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub content: Option<Vec<MCPToolResultContent>>,
+    #[serde(default)]
+    pub is_error: bool,
+    /// Structured data for MCP App UI (ext-apps ontoolresult expects this).
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub structured_content: Option<Value>,
+    /// Optional protocol-level metadata returned by the server.
+    #[serde(skip_serializing_if = "Option::is_none", rename = "_meta")]
+    pub meta: Option<Value>,
+}
+
+/// MCP tool result content (2025-11-25 spec).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase", tag = "type")]
+pub enum MCPToolResultContent {
+    #[serde(rename = "text")]
+    Text { text: String },
+    #[serde(rename = "image")]
+    Image {
+        data: String,
+        #[serde(rename = "mimeType", alias = "mime_type")]
+        mime_type: String,
+    },
+    #[serde(rename = "audio")]
+    Audio {
+        data: String,
+        #[serde(rename = "mimeType", alias = "mime_type")]
+        mime_type: String,
+    },
+    /// Link to resource (client may fetch via resources/read).
+    #[serde(rename = "resource_link")]
+    ResourceLink {
+        uri: String,
+        #[serde(skip_serializing_if = "Option::is_none")]
+        name: Option<String>,
+        #[serde(skip_serializing_if = "Option::is_none")]
+        description: Option<String>,
+        #[serde(skip_serializing_if = "Option::is_none")]
+        mime_type: Option<String>,
+    },
+    /// Embedded resource content.
+    #[serde(rename = "resource")]
+    Resource { resource: Box<MCPResourceContent> },
+}
+
+/// MCP message type (based on JSON-RPC 2.0).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(untagged)]
+pub enum MCPMessage {
+    Request(MCPRequest),
+    Response(MCPResponse),
+    Notification(MCPNotification),
+}
+
+/// MCP request message.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct MCPRequest {
+    pub jsonrpc: String,
+    pub id: Value,
+    pub method: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub params: Option<Value>,
+}
+
+impl MCPRequest {
+    pub fn new(id: Value, method: String, params: Option<Value>) -> Self {
+        Self {
+            jsonrpc: "2.0".to_string(),
+            id,
+            method,
+            params,
+        }
+    }
+}
+
+/// MCP response message.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct MCPResponse {
+    pub jsonrpc: String,
+    pub id: Value,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub result: Option<Value>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub error: Option<MCPError>,
+}
+
+impl MCPResponse {
+    pub fn success(id: Value, result: Value) -> Self {
+        Self {
+            jsonrpc: "2.0".to_string(),
+            id,
+            result: Some(result),
+            error: None,
+        }
+    }
+
+    pub fn error(id: Value, error: MCPError) -> Self {
+        Self {
+            jsonrpc: "2.0".to_string(),
+            id,
+            result: None,
+            error: Some(error),
+        }
+    }
+}
+
+/// MCP notification message (no response required).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct MCPNotification {
+    pub jsonrpc: String,
+    pub method: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub params: Option<Value>,
+}
+
+impl MCPNotification {
+    pub fn new(method: String, params: Option<Value>) -> Self {
+        Self {
+            jsonrpc: "2.0".to_string(),
+            method,
+            params,
+        }
+    }
+}
+
+/// MCP error definition.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+pub struct MCPError {
+    pub code: i32,
+    pub message: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub data: Option<Value>,
+}
+
+impl MCPError {
+    /// Standard JSON-RPC error codes.
+    pub const PARSE_ERROR: i32 = -32700;
+    pub const INVALID_REQUEST: i32 = -32600;
+    pub const METHOD_NOT_FOUND: i32 = -32601;
+    pub const INVALID_PARAMS: i32 = -32602;
+    pub const INTERNAL_ERROR: i32 = -32603;
+    /// Resource not found (2025-11-25 spec).
+    pub const RESOURCE_NOT_FOUND: i32 = -32002;
+
+    pub fn parse_error(message: impl Into<String>) -> Self {
+        Self {
+            code: Self::PARSE_ERROR,
+            message: message.into(),
+            data: None,
+        }
+    }
+
+    pub fn invalid_request(message: impl Into<String>) -> Self {
+        Self {
+            code: Self::INVALID_REQUEST,
+            message: message.into(),
+            data: None,
+        }
+    }
+
+    pub fn method_not_found(method: impl Into<String>) -> Self {
+        Self {
+            code: Self::METHOD_NOT_FOUND,
+            message: format!("Method not found: {}", method.into()),
+            data: None,
+        }
+    }
+
+    pub fn invalid_params(message: impl Into<String>) -> Self {
+        Self {
+            code: Self::INVALID_PARAMS,
+            message: message.into(),
+            data: None,
+        }
+    }
+
+    pub fn internal_error(message: impl Into<String>) -> Self {
+        Self {
+            code: Self::INTERNAL_ERROR,
+            message: message.into(),
+            data: None,
+        }
+    }
+}
+
+/// Initialize request parameters.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct InitializeParams {
+    pub protocol_version: MCPProtocolVersion,
+    pub capabilities: MCPCapability,
+    pub client_info: MCPServerInfo,
+}
+
+/// Initialize response result.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct InitializeResult {
+    pub protocol_version: MCPProtocolVersion,
+    pub capabilities: MCPCapability,
+    pub server_info: MCPServerInfo,
+}
+
+/// Resources/List request parameters.
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct ResourcesListParams {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub cursor: Option<String>,
+}
+
+/// Resources/List response result.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ResourcesListResult {
+    pub resources: Vec<MCPResource>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub next_cursor: Option<String>,
+}
+
+/// Resources/Read request parameters.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ResourcesReadParams {
+    pub uri: String,
+}
+
+/// Resources/Read response result.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ResourcesReadResult {
+    pub contents: Vec<MCPResourceContent>,
+}
+
+/// Prompts/List request parameters.
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct PromptsListParams {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub cursor: Option<String>,
+}
+
+/// Prompts/List response result.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct PromptsListResult {
+    pub prompts: Vec<MCPPrompt>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub next_cursor: Option<String>,
+}
+
+/// Prompts/Get request parameters.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct PromptsGetParams {
+    pub name: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub arguments: Option<HashMap<String, String>>,
+}
+
+/// Prompts/Get response result (2025-11-25 spec).
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct PromptsGetResult {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub description: Option<String>,
+    pub messages: Vec<MCPPromptMessage>,
+}
+
+/// Tools/List request parameters.
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+#[serde(rename_all = "camelCase")]
+pub struct ToolsListParams {
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub cursor: Option<String>,
+}
+
+/// Tools/List response result.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ToolsListResult {
+    pub tools: Vec<MCPTool>,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub next_cursor: Option<String>,
+}
+
+/// Tools/Call request parameters.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ToolsCallParams {
+    pub name: String,
+    #[serde(skip_serializing_if = "Option::is_none")]
+    pub arguments: Option<Value>,
+}
+
+/// Ping request (heartbeat).
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct PingParams {}
+
+/// Ping response.
+#[derive(Debug, Clone, Serialize, Deserialize, Default)]
+pub struct PingResult {}
diff --git a/src/crates/services-integrations/src/mcp/server/mod.rs b/src/crates/services-integrations/src/mcp/server/mod.rs
new file mode 100644
index 000000000..bfb6dbbea
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/server/mod.rs
@@ -0,0 +1,24 @@
+//! MCP server data contracts.
+
+/// MCP server type.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, serde::Serialize, serde::Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum MCPServerType {
+    Local,
+    Remote,
+}
+
+/// MCP server status.
+#[derive(Debug, Clone, Copy, PartialEq, Eq, serde::Serialize, serde::Deserialize)]
+#[serde(rename_all = "lowercase")]
+pub enum MCPServerStatus {
+    Uninitialized,
+    Starting,
+    Connected,
+    Healthy,
+    NeedsAuth,
+    Reconnecting,
+    Failed,
+    Stopping,
+    Stopped,
+}
diff --git a/src/crates/services-integrations/src/mcp/tool_info.rs b/src/crates/services-integrations/src/mcp/tool_info.rs
new file mode 100644
index 000000000..35f73b2e7
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/tool_info.rs
@@ -0,0 +1,8 @@
+use serde::{Deserialize, Serialize};
+
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
+pub struct McpToolInfo {
+    pub server_id: String,
+    pub server_name: String,
+    pub tool_name: String,
+}
diff --git a/src/crates/services-integrations/src/mcp/tool_name.rs b/src/crates/services-integrations/src/mcp/tool_name.rs
new file mode 100644
index 000000000..a626e189b
--- /dev/null
+++ b/src/crates/services-integrations/src/mcp/tool_name.rs
@@ -0,0 +1,56 @@
+//! Shared MCP tool-name helpers.
+
+pub const MCP_TOOL_PREFIX: &str = "mcp__";
+pub const MCP_TOOL_DELIMITER: &str = "__";
+
+/// Normalize MCP server/tool names to a wire-safe format aligned with claude-code.
+pub fn normalize_name_for_mcp(name: &str) -> String {
+    name.chars()
+        .map(|ch| {
+            if ch.is_ascii_alphanumeric() || ch == '_' || ch == '-' {
+                ch
+            } else {
+                '_'
+            }
+        })
+        .collect()
+}
+
+pub fn build_mcp_tool_name(server_id: &str, tool_name: &str) -> String {
+    format!(
+        "{}{}{}{}",
+        MCP_TOOL_PREFIX,
+        normalize_name_for_mcp(server_id),
+        MCP_TOOL_DELIMITER,
+        normalize_name_for_mcp(tool_name)
+    )
+}
+
+#[cfg(test)]
+mod tests {
+    use super::{build_mcp_tool_name, normalize_name_for_mcp};
+
+    #[test]
+    fn normalize_name_for_mcp_replaces_spaces_and_symbols() {
+        assert_eq!(
+            normalize_name_for_mcp("Acme Search / Primary"),
+            "Acme_Search___Primary"
+        );
+    }
+
+    #[test]
+    fn normalize_name_for_mcp_keeps_ascii_word_chars_and_hyphen() {
+        assert_eq!(
+            normalize_name_for_mcp("github-enterprise_v2"),
+            "github-enterprise_v2"
+        );
+    }
+
+    #[test]
+    fn build_mcp_tool_name_normalizes_both_segments() {
+        assert_eq!(
+            build_mcp_tool_name("Claude Code", "search repos"),
+            "mcp__Claude_Code__search_repos"
+        );
+    }
+}
diff --git a/src/crates/services-integrations/src/remote_ssh/mod.rs b/src/crates/services-integrations/src/remote_ssh/mod.rs
new file mode 100644
index 000000000..f11c91fd7
--- /dev/null
+++ b/src/crates/services-integrations/src/remote_ssh/mod.rs
@@ -0,0 +1,8 @@
+//! Remote SSH service contracts.
+//!
+//! `bitfun-core::service::remote_ssh` remains as the compatibility facade for
+//! the legacy public path.
+
+pub mod types;
+
+pub use types::*;
diff --git a/src/crates/services-integrations/src/remote_ssh/types.rs b/src/crates/services-integrations/src/remote_ssh/types.rs
new file mode 100644
index 000000000..a71f1dab7
--- /dev/null
+++ b/src/crates/services-integrations/src/remote_ssh/types.rs
@@ -0,0 +1,317 @@
+//! Type definitions for Remote SSH service
+
+use serde::{Deserialize, Deserializer, Serialize};
+use tokio_util::sync::CancellationToken;
+
+/// Workspace backend type
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
+#[serde(tag = "type", content = "data")]
+pub enum WorkspaceBackend {
+    /// Local workspace (default)
+    Local,
+    /// Remote SSH workspace
+    Remote(RemoteWorkspaceInfo),
+}
+
+/// Remote workspace information
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq, Eq)]
+#[serde(rename_all = "camelCase")]
+pub struct RemoteWorkspaceInfo {
+    /// SSH connection ID
+    pub connection_id: String,
+    /// Connection name (display name)
+    pub connection_name: String,
+    /// Remote path on the server
+    pub remote_path: String,
+}
+
+/// SSH connection configuration
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct SSHConnectionConfig {
+    /// Unique identifier for this connection
+    pub id: String,
+    /// Display name for the connection
+    pub name: String,
+    /// Remote host address (hostname or IP)
+    pub host: String,
+    /// SSH port (default: 22)
+    pub port: u16,
+    /// SSH username
+    pub username: String,
+    /// Authentication method
+    #[serde(deserialize_with = "deserialize_ssh_auth_method")]
+    pub auth: SSHAuthMethod,
+    /// Default remote working directory
+    #[serde(rename = "defaultWorkspace")]
+    pub default_workspace: Option<String>,
+}
+
+/// SSH authentication method
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(tag = "type")]
+pub enum SSHAuthMethod {
+    /// Password authentication
+    Password { password: String },
+    /// Private key authentication
+    PrivateKey {
+        /// Path to private key file on local machine
+        #[serde(rename = "keyPath")]
+        key_path: String,
+        /// Optional passphrase for encrypted private key
+        passphrase: Option<String>,
+    },
+}
+
+/// Legacy `{"type":"Agent"}` in saved config maps to default private key path.
+fn deserialize_ssh_auth_method<'de, D>(deserializer: D) -> Result<SSHAuthMethod, D::Error>
+where
+    D: Deserializer<'de>,
+{
+    #[derive(Deserialize)]
+    #[serde(tag = "type")]
+    enum Helper {
+        Password {
+            password: String,
+        },
+        PrivateKey {
+            #[serde(rename = "keyPath")]
+            key_path: String,
+            passphrase: Option<String>,
+        },
+        Agent,
+    }
+    match Helper::deserialize(deserializer)? {
+        Helper::Password { password } => Ok(SSHAuthMethod::Password { password }),
+        Helper::PrivateKey {
+            key_path,
+            passphrase,
+        } => Ok(SSHAuthMethod::PrivateKey {
+            key_path,
+            passphrase,
+        }),
+        Helper::Agent => Ok(SSHAuthMethod::PrivateKey {
+            key_path: "~/.ssh/id_rsa".to_string(),
+            passphrase: None,
+        }),
+    }
+}
+
+/// Connection state
+#[derive(Debug, Clone, Serialize, Deserialize, PartialEq)]
+pub enum ConnectionState {
+    /// Not connected
+    Disconnected,
+    /// Connection in progress
+    Connecting,
+    /// Successfully connected
+    Connected,
+    /// Connection failed with error
+    Failed { error: String },
+}
+
+/// Saved connection (without sensitive data like passwords)
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct SavedConnection {
+    pub id: String,
+    pub name: String,
+    pub host: String,
+    pub port: u16,
+    pub username: String,
+    #[serde(rename = "authType", deserialize_with = "deserialize_saved_auth_type")]
+    pub auth_type: SavedAuthType,
+    #[serde(rename = "defaultWorkspace")]
+    pub default_workspace: Option<String>,
+    #[serde(rename = "lastConnected")]
+    pub last_connected: Option<String>,
+}
+
+/// Saved auth type (excludes sensitive credentials; password ciphertext is in `ssh_password_vault.json`)
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(tag = "type")]
+pub enum SavedAuthType {
+    Password,
+    PrivateKey {
+        #[serde(rename = "keyPath")]
+        key_path: String,
+    },
+}
+
+fn deserialize_saved_auth_type<'de, D>(deserializer: D) -> Result<SavedAuthType, D::Error>
+where
+    D: Deserializer<'de>,
+{
+    #[derive(Deserialize)]
+    #[serde(tag = "type")]
+    enum Helper {
+        Password,
+        PrivateKey {
+ #[serde(rename = "keyPath")] + key_path: String, + }, + Agent, + } + match Helper::deserialize(deserializer)? { + Helper::Password => Ok(SavedAuthType::Password), + Helper::PrivateKey { key_path } => Ok(SavedAuthType::PrivateKey { key_path }), + Helper::Agent => Ok(SavedAuthType::PrivateKey { + key_path: "~/.ssh/id_rsa".to_string(), + }), + } +} + +/// Remote file entry information +#[derive(Debug, Clone, Serialize, Deserialize)] +#[serde(rename_all = "camelCase")] +pub struct RemoteFileEntry { + pub name: String, + pub path: String, + #[serde(rename = "isDir")] + pub is_dir: bool, + #[serde(rename = "isFile")] + pub is_file: bool, + #[serde(rename = "isSymlink")] + pub is_symlink: bool, + pub size: Option, + pub modified: Option, + pub permissions: Option, +} + +/// Remote file tree node +#[derive(Debug, Clone, Serialize, Deserialize)] +#[serde(rename_all = "camelCase")] +pub struct RemoteTreeNode { + pub name: String, + pub path: String, + #[serde(rename = "isDir")] + pub is_dir: bool, + pub children: Option>, +} + +/// Remote directory entry (for read_dir operations) +#[derive(Debug, Clone, Serialize, Deserialize)] +#[serde(rename_all = "camelCase")] +pub struct RemoteDirEntry { + pub name: String, + pub path: String, + #[serde(rename = "isDir")] + pub is_dir: bool, + #[serde(rename = "isFile")] + pub is_file: bool, + #[serde(rename = "isSymlink")] + pub is_symlink: bool, + pub size: Option, + pub modified: Option, + pub permissions: Option, +} + +/// Result of SSH connection attempt +#[derive(Debug, Clone, Serialize, Deserialize)] +#[serde(rename_all = "camelCase")] +pub struct SSHConnectionResult { + pub success: bool, + #[serde(rename = "connectionId")] + pub connection_id: Option, + pub error: Option, + #[serde(rename = "serverInfo")] + pub server_info: Option, +} + +/// Options for executing a remote SSH command. 
+#[derive(Debug, Clone, Default)]
+pub struct SSHCommandOptions {
+    pub timeout_ms: Option<u64>,
+    pub cancellation_token: Option<CancellationToken>,
+}
+
+/// Result of executing a remote SSH command.
+#[derive(Debug, Clone)]
+pub struct SSHCommandResult {
+    pub stdout: String,
+    pub stderr: String,
+    pub exit_code: i32,
+    pub interrupted: bool,
+    pub timed_out: bool,
+}
+
+/// Remote server information
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct ServerInfo {
+    #[serde(rename = "osType")]
+    pub os_type: String,
+    pub hostname: String,
+    #[serde(rename = "homeDir")]
+    pub home_dir: String,
+}
+
+/// Result of remote file operation
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct RemoteFileResult {
+    pub success: bool,
+    pub error: Option<String>,
+}
+
+/// Result of remote directory listing
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct RemoteListResult {
+    pub entries: Vec<RemoteDirEntry>,
+    pub error: Option<String>,
+}
+
+/// Request to open a remote workspace
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct RemoteWorkspaceRequest {
+    #[serde(rename = "connectionId")]
+    pub connection_id: String,
+    #[serde(rename = "remotePath")]
+    pub remote_path: String,
+}
+
+/// Remote workspace info (persisted in `remote_workspace.json`).
+/// `#[serde(default)]` keeps older files loadable if a field was absent.
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct RemoteWorkspace {
+    #[serde(default)]
+    pub connection_id: String,
+    #[serde(default)]
+    pub remote_path: String,
+    #[serde(default)]
+    pub connection_name: String,
+    /// SSH config `host`; used for `~/.bitfun/remote_ssh/{host}/...` session storage.
+    #[serde(default)]
+    pub ssh_host: String,
+}
+
+/// SSH config entry parsed from ~/.ssh/config
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct SSHConfigEntry {
+    /// Host name (alias from SSH config)
+    pub host: String,
+    /// Actual hostname or IP
+    pub hostname: Option<String>,
+    /// SSH port
+    pub port: Option<u16>,
+    /// Username
+    pub user: Option<String>,
+    /// Path to identity file (private key)
+    pub identity_file: Option<String>,
+    /// Whether to use SSH agent
+    pub agent: Option<bool>,
+}
+
+/// Result of looking up SSH config for a host
+#[derive(Debug, Clone, Serialize, Deserialize)]
+#[serde(rename_all = "camelCase")]
+pub struct SSHConfigLookupResult {
+    /// Whether a config entry was found
+    pub found: bool,
+    /// Config entry if found
+    pub config: Option<SSHConfigEntry>,
+}
diff --git a/src/crates/services-integrations/tests/announcement_contracts.rs b/src/crates/services-integrations/tests/announcement_contracts.rs
new file mode 100644
index 000000000..10798e380
--- /dev/null
+++ b/src/crates/services-integrations/tests/announcement_contracts.rs
@@ -0,0 +1,103 @@
+#![cfg(feature = "announcement")]
+
+use bitfun_services_integrations::announcement::{
+    AnnouncementCard, AnnouncementState, CardSource, CardType, CompletionAction, ModalConfig,
+    ModalPage, ModalSize, PageLayout, ToastConfig, TriggerCondition, TriggerRule,
+};
+
+#[test]
+fn announcement_card_deserialization_preserves_default_contract() {
+    let card: AnnouncementCard = serde_json::from_value(serde_json::json!({
+        "id": "feature_v1",
+        "card_type": "feature",
+        "source": "local",
+        "trigger": {
+            "condition": {
+                "type": "app_nth_open",
+                "n": 3
+            }
+        },
+        "toast": {
+            "icon": "sparkles",
+            "title": "Feature",
+            "description": "Try it"
+        }
+    }))
+    .unwrap();
+
+    assert_eq!(card.id, "feature_v1");
+    assert_eq!(card.card_type, CardType::Feature);
+    assert_eq!(card.source, CardSource::Local);
+    assert_eq!(card.priority, 0);
+    assert_eq!(card.app_version, None);
+    assert!(card.modal.is_none());
+    assert_eq!(card.expires_at, None);
+    assert!(matches!(
+        card.trigger.condition,
+        TriggerCondition::AppNthOpen { n: 3 }
+    ));
+    assert_eq!(card.trigger.delay_ms, 0);
+    assert!(card.trigger.once_per_version);
+    assert_eq!(card.toast.action_label, "");
+    assert!(card.toast.dismissible);
+    assert_eq!(card.toast.auto_dismiss_ms, None);
+}
+
+#[test]
+fn announcement_modal_serialization_preserves_snake_case_contract() {
+    let modal = ModalConfig {
+        size: ModalSize::Xl,
+        closable: true,
+        pages: vec![ModalPage {
+            layout: PageLayout::FullscreenMedia,
+            title: "Showcase".to_string(),
+            body: "Details".to_string(),
+            media: None,
+        }],
+        completion_action: CompletionAction::NeverShowAgain,
+    };
+
+    assert_eq!(
+        serde_json::to_value(modal).unwrap(),
+        serde_json::json!({
+            "size": "xl",
+            "closable": true,
+            "pages": [{
+                "layout": "fullscreen_media",
+                "title": "Showcase",
+                "body": "Details",
+                "media": null
+            }],
+            "completion_action": "never_show_again"
+        })
+    );
+}
+
+#[test]
+fn announcement_state_and_trigger_defaults_preserve_runtime_assumptions() {
+    let trigger = TriggerRule::default();
+    assert!(matches!(
+        trigger.condition,
+        TriggerCondition::VersionFirstOpen
+    ));
+    assert_eq!(trigger.delay_ms, 2000);
+    assert!(trigger.once_per_version);
+
+    let state = AnnouncementState::default();
+    assert_eq!(state.last_seen_version, "");
+    assert_eq!(state.app_open_count, 0);
+    assert!(state.seen_ids.is_empty());
+    assert!(state.dismissed_ids.is_empty());
+    assert!(state.never_show_ids.is_empty());
+    assert_eq!(state.last_remote_fetch_at, None);
+
+    let toast = ToastConfig {
+        icon: "tip".to_string(),
+        title: "Tip".to_string(),
+        description: "Use shortcuts".to_string(),
+        action_label: String::new(),
+        dismissible: true,
+        auto_dismiss_ms: None,
+    };
+    assert!(toast.dismissible);
+}
diff --git a/src/crates/services-integrations/tests/git_contracts.rs b/src/crates/services-integrations/tests/git_contracts.rs
new file mode 100644
index 000000000..072877a31
--- /dev/null
+++ b/src/crates/services-integrations/tests/git_contracts.rs
@@ -0,0 +1,305 @@
+#![cfg(feature = "git")]
+
+use bitfun_services_integrations::git::{
+    build_git_changed_files_args, build_git_diff_args, parse_branch_line, parse_git_log_line,
+    parse_name_status_output, parse_worktree_list, GitAuthor, GitChangedFile, GitChangedFileStatus,
+    GitChangedFilesParams, GitCommandOutput, GitCommitParams, GitDiffParams, GitGraph, GitService,
+    GitWorktreeInfo, GraphNode, GraphRef,
+};
+use std::fs;
+use std::process::Command;
+use std::time::{SystemTime, UNIX_EPOCH};
+
+#[test]
+fn git_changed_file_status_preserves_snake_case_contract() {
+    let status = serde_json::to_value(GitChangedFileStatus::Renamed).unwrap();
+    assert_eq!(status, serde_json::json!("renamed"));
+
+    let changed_file = GitChangedFile {
+        path: "src/new.rs".to_string(),
+        old_path: Some("src/old.rs".to_string()),
+        status: GitChangedFileStatus::Renamed,
+    };
+
+    let value = serde_json::to_value(changed_file).unwrap();
+    assert_eq!(value["old_path"], "src/old.rs");
+    assert_eq!(value["status"], "renamed");
+}
+
+#[test]
+fn git_name_status_parser_preserves_common_status_contract() {
+    let files = parse_name_status_output(
+        "M\tsrc/main.rs\nA\tsrc/new.rs\nD\tsrc/old.rs\nR100\tsrc/old_name.rs\tsrc/new_name.rs\nC087\tsrc/source.rs\tsrc/copy.rs\n",
+    );
+
+    assert_eq!(
+        files,
+        vec![
+            GitChangedFile {
+                path: "src/main.rs".to_string(),
+                old_path: None,
+                status: GitChangedFileStatus::Modified,
+            },
+            GitChangedFile {
+                path: "src/new.rs".to_string(),
+                old_path: None,
+                status: GitChangedFileStatus::Added,
+            },
+            GitChangedFile {
+                path: "src/old.rs".to_string(),
+                old_path: None,
+                status: GitChangedFileStatus::Deleted,
+            },
+            GitChangedFile {
+                path: "src/new_name.rs".to_string(),
+                old_path: Some("src/old_name.rs".to_string()),
+                status: GitChangedFileStatus::Renamed,
+            },
+            GitChangedFile {
+                path: "src/copy.rs".to_string(),
+                old_path: Some("src/source.rs".to_string()),
+                status: GitChangedFileStatus::Copied,
+            },
+        ],
+    );
+}
+
+#[test]
+fn git_command_output_preserves_raw_stream_contract() {
+    let output = GitCommandOutput {
+        stdout: "ok".to_string(),
+        stderr: "warning".to_string(),
+        exit_code: 1,
+    };
+
+    assert_eq!(output.stdout, "ok");
+    assert_eq!(output.stderr, "warning");
+    assert_eq!(output.exit_code, 1);
+}
+
+#[test]
+fn git_text_parsers_preserve_branch_and_log_contracts() {
+    assert_eq!(
+        parse_git_log_line("abc123|BitFun|bitfun@example.com|2026-05-12|subject|body"),
+        Some((
+            "abc123".to_string(),
+            "BitFun".to_string(),
+            "bitfun@example.com".to_string(),
+            "2026-05-12".to_string(),
+            "subject|body".to_string(),
+        ))
+    );
+    assert_eq!(parse_git_log_line("abc123|missing"), None);
+
+    assert_eq!(
+        parse_branch_line("* main"),
+        Some(("main".to_string(), true))
+    );
+    assert_eq!(
+        parse_branch_line(" feature/test"),
+        Some(("feature/test".to_string(), false))
+    );
+    assert_eq!(
+        parse_branch_line("detached"),
+        Some(("detached".to_string(), false))
+    );
+    assert_eq!(parse_branch_line(" "), None);
+}
+
+#[test]
+fn git_diff_arg_builders_preserve_existing_command_contract() {
+    let args = build_git_diff_args(&GitDiffParams {
+        source: Some("main".to_string()),
+        target: Some("feature".to_string()),
+        files: Some(vec!["src/lib.rs".to_string(), "README.md".to_string()]),
+        staged: Some(true),
+        stat: Some(true),
+    });
+    assert_eq!(
+        args,
+        vec![
+            "diff",
+            "--cached",
+            "main..feature",
+            "--stat",
+            "--",
+            "src/lib.rs",
+            "README.md",
+        ]
+    );
+
+    let target_only_args = build_git_diff_args(&GitDiffParams {
+        source: None,
+        target: Some("feature".to_string()),
+        files: None,
+        staged: None,
+        stat: None,
+    });
+    assert_eq!(target_only_args, vec!["diff"]);
+
+    let changed_args = build_git_changed_files_args(&GitChangedFilesParams {
+        source: None,
+        target: Some("feature".to_string()),
+        staged: Some(true),
+    });
+    assert_eq!(
+        changed_args,
+        vec!["diff", "--name-status", "--cached", "feature"]
+    );
+}
+
+#[tokio::test]
+async fn git_service_preserves_repository_status_contract() {
+    let repo_dir = TempRepoDir::new("git-service-status");
+    assert_eq!(
+        GitService::is_repository(repo_dir.path()).await.unwrap(),
+        false
+    );
+
+    run_git(repo_dir.path(), &["init"]);
+    fs::write(repo_dir.path().join("new-file.txt"), "hello\n").unwrap();
+
+    assert_eq!(
+        GitService::is_repository(repo_dir.path()).await.unwrap(),
+        true
+    );
+
+    let status = GitService::get_status(repo_dir.path()).await.unwrap();
+    assert!(status
+        .untracked
+        .iter()
+        .any(|path| path == "new-file.txt" || path == "new-file.txt/"));
+}
+
+struct TempRepoDir {
+    path: std::path::PathBuf,
+}
+
+impl TempRepoDir {
+    fn new(name: &str) -> Self {
+        let nanos = SystemTime::now()
+            .duration_since(UNIX_EPOCH)
+            .unwrap()
+            .as_nanos();
+        let path = std::env::temp_dir().join(format!(
+            "bitfun-services-integrations-{}-{}-{}",
+            name,
+            std::process::id(),
+            nanos
+        ));
+        fs::create_dir_all(&path).unwrap();
+        Self { path }
+    }
+
+    fn path(&self) -> &std::path::Path {
+        &self.path
+    }
+}
+
+impl Drop for TempRepoDir {
+    fn drop(&mut self) {
+        let _ = fs::remove_dir_all(&self.path);
+    }
+}
+
+fn run_git(repo_dir: &std::path::Path, args: &[&str]) {
+    let output = Command::new("git")
+        .current_dir(repo_dir)
+        .args(args)
+        .output()
+        .unwrap();
+    assert!(
+        output.status.success(),
+        "git {:?} failed: {}{}",
+        args,
+        String::from_utf8_lossy(&output.stdout),
+        String::from_utf8_lossy(&output.stderr)
+    );
+}
+
+#[test]
+fn git_worktree_info_preserves_camel_case_contract() {
+    let worktree = GitWorktreeInfo {
+        path: "D:/workspace/BitFun-worktree".to_string(),
+        branch: Some("feature/test".to_string()),
+        head: "abc123".to_string(),
+        is_main: false,
+        is_locked: true,
+        is_prunable: false,
+    };
+
+    let value = serde_json::to_value(worktree).unwrap();
+    assert_eq!(value["isMain"], false);
+    assert_eq!(value["isLocked"], true);
+    assert_eq!(value["isPrunable"], false);
+}
+
+#[test]
+fn git_worktree_parser_preserves_porcelain_contract() {
+    let worktrees = parse_worktree_list(
+        "worktree D:/workspace/BitFun\nHEAD abc123\nbranch refs/heads/main\n\nworktree D:/workspace/BitFun-feature\nHEAD def456\nbranch refs/heads/feature/test\nlocked\nprunable\n",
+    );
+
+    assert_eq!(worktrees.len(), 2);
+    assert_eq!(worktrees[0].path, "D:/workspace/BitFun");
+    assert_eq!(worktrees[0].branch.as_deref(), Some("main"));
+    assert_eq!(worktrees[0].head, "abc123");
+    assert!(worktrees[0].is_main);
+    assert!(!worktrees[0].is_locked);
+    assert_eq!(worktrees[1].branch.as_deref(), Some("feature/test"));
+    assert!(worktrees[1].is_locked);
+    assert!(worktrees[1].is_prunable);
+}
+
+#[test]
+fn git_commit_params_preserves_no_verify_rename_contract() {
+    let params = GitCommitParams {
+        message: "test commit".to_string(),
+        amend: Some(false),
+        all: Some(true),
+        no_verify: Some(true),
+        author: Some(GitAuthor {
+            name: "BitFun".to_string(),
+            email: "bitfun@example.com".to_string(),
+        }),
+    };
+
+    let value = serde_json::to_value(params).unwrap();
+    assert_eq!(value["noVerify"], true);
+    assert!(value.get("no_verify").is_none());
+}
+
+#[test]
+fn git_graph_contract_preserves_camel_case_contract() {
+    let graph = GitGraph {
+        nodes: vec![GraphNode {
+            hash: "abc123".to_string(),
+            message: "initial".to_string(),
+            full_message: "initial commit".to_string(),
+            author_name: "BitFun".to_string(),
+            author_email: "bitfun@example.com".to_string(),
+            timestamp: 1_700_000_000,
+            parents: Vec::new(),
+            children: vec!["def456".to_string()],
+            refs: vec![GraphRef {
+                name: "main".to_string(),
+                ref_type: "branch".to_string(),
+                is_current: true,
+                is_head: true,
+            }],
+            lane: 0,
+            forking_lanes: Vec::new(),
+            merging_lanes: Vec::new(),
+            passing_lanes: Vec::new(),
+        }],
+        max_lane: 1,
+        current_branch: Some("main".to_string()),
+    };
+
+    let value = serde_json::to_value(graph).unwrap();
+    assert_eq!(value["maxLane"], 1);
+    assert_eq!(value["currentBranch"], "main");
+    assert_eq!(value["nodes"][0]["fullMessage"], "initial commit");
+    assert_eq!(value["nodes"][0]["refs"][0]["refType"], "branch");
+    assert_eq!(value["nodes"][0]["refs"][0]["isCurrent"], true);
+}
diff --git a/src/crates/services-integrations/tests/mcp_contracts.rs b/src/crates/services-integrations/tests/mcp_contracts.rs
new file mode 100644
index 000000000..4cdc27d84
--- /dev/null
+++ b/src/crates/services-integrations/tests/mcp_contracts.rs
@@ -0,0 +1,145 @@
+#![cfg(feature = "mcp")]
+
+use bitfun_services_integrations::mcp::config::ConfigLocation;
+use bitfun_services_integrations::mcp::protocol::{
+    MCPCapability, MCPError, MCPPromptMessageContent, MCPPromptMessageContentBlock, MCPRequest,
+    default_protocol_version,
+};
+use bitfun_services_integrations::mcp::server::{MCPServerStatus, MCPServerType};
+use bitfun_services_integrations::mcp::{
+    MCP_TOOL_DELIMITER, MCP_TOOL_PREFIX, McpToolInfo, build_mcp_tool_name, normalize_name_for_mcp,
+};
+
+#[test]
+fn mcp_tool_name_contract_matches_existing_wire_format() {
+    assert_eq!(MCP_TOOL_PREFIX, "mcp__");
+    assert_eq!(MCP_TOOL_DELIMITER, "__");
+    assert_eq!(
+        normalize_name_for_mcp("Acme Search / Primary"),
+        "Acme_Search___Primary"
+    );
+    assert_eq!(
+        build_mcp_tool_name("Claude Code", "search repos"),
+        "mcp__Claude_Code__search_repos"
+    );
+}
+
+#[test]
+fn mcp_tool_info_preserves_json_shape() {
+    let info = McpToolInfo {
+        server_id: "server-1".to_string(),
+        server_name: "Docs".to_string(),
+        tool_name: "search".to_string(),
+    };
+
+    assert_eq!(
+        serde_json::to_value(info).unwrap(),
+        serde_json::json!({
+            "server_id": "server-1",
+            "server_name": "Docs",
+            "tool_name": "search"
+        })
+    );
+}
+
+#[test]
+fn mcp_protocol_capability_contract_matches_existing_default() {
+    assert_eq!(default_protocol_version(), "2025-11-25");
+    assert_eq!(
+        serde_json::to_value(MCPCapability::default()).unwrap(),
+        serde_json::json!({
+            "resources": {
+                "subscribe": false,
+                "listChanged": false
+            },
+            "prompts": {
+                "listChanged": false
+            },
+            "tools": {
+                "listChanged": false
+            }
+        })
+    );
+}
+
+#[test]
+fn mcp_protocol_jsonrpc_helpers_preserve_wire_shape() {
+    let request = MCPRequest::new(
+        serde_json::json!(7),
+        "tools/list".to_string(),
+        Some(serde_json::json!({ "cursor": "next" })),
+    );
+
+    assert_eq!(
+        serde_json::to_value(request).unwrap(),
+        serde_json::json!({
+            "jsonrpc": "2.0",
+            "id": 7,
+            "method": "tools/list",
+            "params": {
+                "cursor": "next"
+            }
+        })
+    );
+
+    assert_eq!(
+        serde_json::to_value(MCPError::method_not_found("tools/call")).unwrap(),
+        serde_json::json!({
+            "code": -32601,
+            "message": "Method not found: tools/call"
+        })
+    );
+}
+
+#[test]
+fn mcp_protocol_prompt_content_helpers_preserve_legacy_text_behavior() {
+    let mut content = MCPPromptMessageContent::Plain("Review {{target}}".to_string());
+    content.substitute_placeholders(&std::collections::HashMap::from([(
+        "target".to_string(),
+        "src/main.rs".to_string(),
+    )]));
+
+    assert_eq!(content.text_or_placeholder(), "Review src/main.rs");
+
+    let image = MCPPromptMessageContent::Block(Box::new(MCPPromptMessageContentBlock::Image {
+        data: "base64".to_string(),
+        mime_type: "image/png".to_string(),
+    }));
+    assert_eq!(image.text_or_placeholder(), "[Image: image/png]");
+}
+
+#[test]
+fn mcp_config_location_preserves_kebab_case_wire_contract() {
+    assert_eq!(
+        serde_json::to_value(ConfigLocation::BuiltIn).unwrap(),
+        serde_json::json!("built-in")
+    );
+    assert_eq!(
+        serde_json::from_value::<ConfigLocation>(serde_json::json!("user")).unwrap(),
+        ConfigLocation::User
+    );
+    assert_eq!(
+        serde_json::from_value::<ConfigLocation>(serde_json::json!("project")).unwrap(),
+        ConfigLocation::Project
+    );
+}
+
+#[test]
+fn mcp_server_type_and_status_preserve_lowercase_wire_contract() {
+    assert_eq!(
+        serde_json::to_value(MCPServerType::Local).unwrap(),
+        serde_json::json!("local")
+    );
+    assert_eq!(
+        serde_json::from_value::<MCPServerType>(serde_json::json!("remote")).unwrap(),
+        MCPServerType::Remote
+    );
+    assert_eq!(
+        serde_json::to_value(MCPServerStatus::NeedsAuth).unwrap(),
+        serde_json::json!("needsauth")
+    );
+    assert_eq!(
+        serde_json::from_value::<MCPServerStatus>(serde_json::json!("reconnecting")).unwrap(),
+        MCPServerStatus::Reconnecting
+    );
+}
diff --git a/src/crates/services-integrations/tests/remote_ssh_contracts.rs b/src/crates/services-integrations/tests/remote_ssh_contracts.rs
new file mode 100644
index 000000000..4b49562a8
--- /dev/null
+++ b/src/crates/services-integrations/tests/remote_ssh_contracts.rs
@@ -0,0 +1,60 @@
+#![cfg(feature = "remote-ssh")]
+
+use bitfun_services_integrations::remote_ssh::{
+    RemoteWorkspace, SSHAuthMethod, SSHConnectionConfig, SavedAuthType, SavedConnection,
+};
+
+#[test]
+fn remote_ssh_legacy_agent_auth_maps_to_default_private_key() {
+    let config: SSHConnectionConfig = serde_json::from_value(serde_json::json!({
+        "id": "conn-1",
+        "name": "dev",
+        "host": "example.com",
+        "port": 22,
+        "username": "alice",
+        "auth": { "type": "Agent" },
+        "defaultWorkspace": "/repo"
+    }))
+    .unwrap();
+
+    match config.auth {
+        SSHAuthMethod::PrivateKey {
+            key_path,
+            passphrase,
+        } => {
+            assert_eq!(key_path, "~/.ssh/id_rsa");
+            assert_eq!(passphrase, None);
+        }
+        SSHAuthMethod::Password { .. } => panic!("legacy agent auth must map to private key"),
+    }
+
+    let saved: SavedConnection = serde_json::from_value(serde_json::json!({
+        "id": "conn-1",
+        "name": "dev",
+        "host": "example.com",
+        "port": 22,
+        "username": "alice",
+        "authType": { "type": "Agent" },
+        "defaultWorkspace": "/repo",
+        "lastConnected": 1
+    }))
+    .unwrap();
+
+    match saved.auth_type {
+        SavedAuthType::PrivateKey { key_path } => assert_eq!(key_path, "~/.ssh/id_rsa"),
+        SavedAuthType::Password => panic!("legacy agent auth type must map to private key"),
+    }
+}
+
+#[test]
+fn remote_workspace_defaults_keep_older_files_loadable() {
+    let workspace: RemoteWorkspace = serde_json::from_value(serde_json::json!({
+        "connectionId": "conn-1"
+    }))
+    .unwrap();
+
+    assert_eq!(workspace.connection_id, "conn-1");
+    assert_eq!(workspace.remote_path, "");
+    assert_eq!(workspace.connection_name, "");
+    assert_eq!(workspace.ssh_host, "");
+}