Merged

Changes from all commits (21 commits)
b850973
refactor(ai): refactor prompt build to use system messages and indepe…
Mar 25, 2026
cdcf75f
fix(DataGrid): fix the problem that the selected status is not reset …
Mar 25, 2026
3e1e8fc
fix:fix the processing of empty passwords during the generation and n…
Mar 25, 2026
e1293e4
feat(db/ssh): Improve connection error handling and SSH tunnel defaul…
Mar 26, 2026
90dd2da
feat(transfer): supports importing SQL files to MySQL/PostgreSQL and …
Mar 27, 2026
d2867cb
style: Format code to conform to the project code style specification
Mar 27, 2026
ed5fba1
test:API key encryption and decryption test using random key
Mar 27, 2026
4778c62
fix(models): hide sensitive fields in the Debug implementation of Con…
Mar 27, 2026
825d68e
feat(clickhouse): supports the creation, update and deletion of Click…
Mar 27, 2026
2dda3ff
fix:Add ssl mode=DISABLED parameter in MySQL test DSN
Mar 28, 2026
da0f7de
feat(import): extend SQL import support to more databases and improve…
Mar 28, 2026
e8e3d98
ci:Add integration test automation and CI pipeline
Mar 28, 2026
594e207
feat(test): extend integration test to support MariaDB, SQL Server an…
Mar 28, 2026
629d681
feat(tests): Added ClickHouse and DuckDB integration test support
Mar 28, 2026
5e55027
fix(mysql): Correctly handle the number of affected rows of INSERT/UP…
Mar 28, 2026
88641f4
test:Add integration test for all database drivers
Mar 29, 2026
c1be17a
feat(mssql): introduce connection pool to improve concurrency perform…
Mar 29, 2026
a6998de
feat(lib): add direct call command function and extend integration test
Mar 29, 2026
308c3d6
feat:add a new database integration script and update the ignore conf…
Mar 29, 2026
59920a0
chore: Remove the old change log file to prepare for the new release …
Mar 29, 2026
22af678
chore: Update the project version number to 0.3.0
Mar 29, 2026
66 changes: 66 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,66 @@
name: CI

on:
  pull_request:
    branches:
      - main

jobs:
  test:
    runs-on: ubuntu-22.04
    timeout-minutes: 40
Comment on lines +8 to +11
⚠️ Potential issue | 🟡 Minor

Add explicit permissions block to limit GITHUB_TOKEN scope.

The workflow does not limit the permissions of the GITHUB_TOKEN. For security best practices, add a minimal permissions block.

🔒 Proposed fix
 jobs:
   test:
     runs-on: ubuntu-22.04
     timeout-minutes: 40
+
+permissions:
+  contents: read

Or at the workflow level:

 on:
   pull_request:
     branches:
       - main

+permissions:
+  contents: read
+
 jobs:

    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest

      - name: Cache Bun dependencies
        uses: actions/cache@v4
        with:
          path: |
            node_modules
            ~/.bun/install/cache
          key: ${{ runner.os }}-bun-${{ hashFiles('bun.lock') }}
          restore-keys: |
            ${{ runner.os }}-bun-

      - name: Install frontend dependencies
        run: bun install --frozen-lockfile

      - name: Setup Rust toolchain
        uses: dtolnay/rust-toolchain@stable

      - name: Cache Rust build artifacts
        uses: Swatinem/rust-cache@v2
        with:
          workspaces: src-tauri -> src-tauri/target

      - name: Run unit tests
        run: bun run test:unit

      - name: Run service tests
        run: bun run test:service

      - name: Run rust unit tests
        run: bun run test:rust:unit

      - name: Run integration tests (MySQL + Postgres with testcontainers)
        run: IT_DB=all bun run test:integration

      - name: Docker diagnostics on failure
        if: failure()
        run: |
          echo "==== docker ps -a ===="
          docker ps -a || true
          echo "==== recent mysql/postgres logs ===="
          for image in mysql:8.0 postgres:16-alpine; do
            for id in $(docker ps -aq --filter "ancestor=${image}"); do
              echo "--- logs for $id (${image}) ---"
              docker logs "$id" || true
            done
          done
Comment on lines +10 to +66

Code scanning / CodeQL — Check warning

Workflow does not contain permissions (Medium)

Actions job or workflow does not limit the permissions of the GITHUB_TOKEN. Consider setting an explicit permissions block, using the following as a minimal starting point: {contents: read}

Copilot Autofix

AI 1 day ago

In general, to fix this class of issue you add an explicit permissions block at the workflow root or per job, restricting the GITHUB_TOKEN to the minimal rights needed (typically contents: read for a pure CI/test workflow). This both documents the intended access and ensures the workflow does not inherit broader defaults.

For this workflow, no step modifies repository contents, creates releases, comments on pull requests, or otherwise needs write access. The only privileged action is actions/checkout@v4, which works with contents: read. Caching with actions/cache, installing dependencies, running tests, and running Docker commands do not require GITHUB_TOKEN write scopes. Therefore the safest minimal fix is to add permissions: contents: read at the workflow root so it applies to all jobs.

Concretely:

  • Edit .github/workflows/ci.yml.
  • Add a root-level permissions: block after the name: CI line (before on:), with contents: read.
  • No additional imports, definitions, or changes to steps are required; this merely limits the default GITHUB_TOKEN capabilities.

Suggested changeset 1
.github/workflows/ci.yml

Autofix patch

Run the following command in your local git repository to apply this patch:
cat << 'EOF' | git apply
diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml
--- a/.github/workflows/ci.yml
+++ b/.github/workflows/ci.yml
@@ -1,5 +1,8 @@
 name: CI
 
+permissions:
+  contents: read
+
 on:
   pull_request:
     branches:
EOF
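Before pushing, the same checks this workflow runs can be reproduced locally with the project's package scripts. A sketch — it assumes `bun` is installed and Docker is running for the testcontainers step:

```bash
# Mirror the CI job locally, in the same order the workflow runs it.
bun install --frozen-lockfile
bun run test:unit
bun run test:service
bun run test:rust:unit
IT_DB=all bun run test:integration
```

Running the integration step last matches CI and keeps the fast unit/service feedback loop in front of the slow container startup.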
6 changes: 6 additions & 0 deletions .gitignore
@@ -33,3 +33,9 @@ reference
# plan
.trae/documents/
.cursor/plans/

# skills
.trae/skills/*

# example
githubworkflowexample/*
86 changes: 0 additions & 86 deletions CHANGELOG.md

This file was deleted.

1 change: 1 addition & 0 deletions README.md
@@ -36,6 +36,7 @@ English | [简体中文](README_CN.md) | [日本語](README_JA.md)
- Connect to PostgreSQL, MySQL, MariaDB (MySQL-compatible), TiDB (MySQL-compatible), SQLite, SQL Server, and ClickHouse (preview, currently read-only)
- Write and run SQL with syntax highlighting, auto-completion, and one-click formatting
- Browse query results in a data grid with filtering, sorting, pagination, and export
- Import `.sql` files into MySQL/MariaDB/TiDB/PostgreSQL/SQLite/DuckDB/SQL Server with all-or-nothing rollback
- Save and reuse frequently used SQL scripts with Saved Queries
- Use the AI sidebar to draft SQL and explain queries (optional)
- Access remote databases through SSH tunneling
1 change: 1 addition & 0 deletions README_CN.md
@@ -36,6 +36,7 @@
- Connect to PostgreSQL, MySQL, MariaDB (MySQL-compatible), TiDB (MySQL-compatible), SQLite, SQL Server, and ClickHouse (preview, currently read-only)
- Write and run SQL with syntax highlighting, auto-completion, and one-click formatting
- Browse results in a data grid with filtering, sorting, pagination, and export
- Import `.sql` files into MySQL/MariaDB/TiDB/PostgreSQL/SQLite/DuckDB/SQL Server, with full rollback on failure
- Save and reuse frequently used SQL scripts with Saved Queries
- Use the AI sidebar to help write SQL and explain queries (optional)
- Access remote databases through SSH tunneling
97 changes: 97 additions & 0 deletions docs/zh/Development/DEVELOPMENT.md
@@ -51,6 +51,103 @@ bun run test:rust:unit
bun run test:integration
```

### Integration test automation (MySQL + MariaDB + Postgres + ClickHouse + SQL Server + DuckDB)

- By default, `bun run test:integration` automatically starts and tears down MySQL, MariaDB, Postgres, ClickHouse, and SQL Server containers (DuckDB uses a local temporary file and does not depend on a container).
- Use `IT_DB` to target a specific database:
```bash
IT_DB=mysql bun run test:integration
IT_DB=mariadb bun run test:integration
IT_DB=postgres bun run test:integration
IT_DB=clickhouse bun run test:integration
IT_DB=mssql bun run test:integration
IT_DB=duckdb bun run test:integration
IT_DB=all bun run test:integration
```
- To reuse databases you have already started locally (compatible with the old workflow), set:
```bash
IT_REUSE_LOCAL_DB=1 bun run test:integration
```

### Common integration-test environment variables (optional overrides)

- MySQL: `MYSQL_HOST` `MYSQL_PORT` `MYSQL_USER` `MYSQL_PASSWORD` `MYSQL_DB`
- MariaDB: `MARIADB_HOST` `MARIADB_PORT` `MARIADB_USER` `MARIADB_PASSWORD` `MARIADB_DB`
- Postgres: `POSTGRES_HOST` `POSTGRES_PORT` `POSTGRES_USER` `POSTGRES_PASSWORD` `POSTGRES_DB`
- ClickHouse: `CLICKHOUSE_HOST` `CLICKHOUSE_PORT` `CLICKHOUSE_USER` `CLICKHOUSE_PASSWORD` `CLICKHOUSE_DB`
- SQL Server: `MSSQL_HOST` `MSSQL_PORT` `MSSQL_USER` `MSSQL_PASSWORD` `MSSQL_DB`
- DuckDB: `DUCKDB_IT_DB_PATH` (optional), `DUCKDB_DB_PATH` (optional)
- Common Postgres aliases are also accepted: `PG_HOST` `PG_PORT` `PGUSER` `PGPASSWORD` `PGDATABASE`
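These overrides combine naturally with local-reuse mode to point the suite at an existing instance. A sketch — the host, port, and credentials below are placeholder values, not project defaults:

```bash
# Run only the Postgres suite against a locally running server
# instead of a testcontainer. All values here are illustrative.
IT_REUSE_LOCAL_DB=1 \
POSTGRES_HOST=127.0.0.1 POSTGRES_PORT=5432 \
POSTGRES_USER=postgres POSTGRES_PASSWORD=postgres POSTGRES_DB=postgres \
IT_DB=postgres bun run test:integration
```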

### Troubleshooting tips

- Slow image pulls: warm the cache first with `docker pull mysql:8.0`, `docker pull mariadb:11`, `docker pull postgres:16-alpine`, `docker pull clickhouse/clickhouse-server:24.3`, and `docker pull mcr.microsoft.com/mssql/server:2022-latest`.
- Port conflicts: integration tests use Docker's dynamic port mapping by default, so conflicts are rare; if local-reuse mode conflicts with something, adjust the corresponding `*_PORT`.
- Apple silicon: if the first pull is slow, pre-pull the images and wait for Docker Desktop to finish its architecture-layer initialization.

### Recommended workflow

- Day-to-day development: run `test:unit` + `test:service` first.
- Before committing: run `test:integration` as needed for database regressions.
- PR: CI always runs the integration tests as the final quality gate.

### Which tests to run after developing a feature (in practice)

1. During development (high frequency, fast feedback)

- Run first:
```bash
bun run test:unit
bun run test:service
```
- Use for: quick verification of frontend logic, business logic, and small changes.

2. When a change touches database behavior (medium frequency)

- Run:
```bash
IT_DB=all bun run test:integration
```
- Or target a single database as needed:
```bash
IT_DB=mysql bun run test:integration
IT_DB=mariadb bun run test:integration
IT_DB=postgres bun run test:integration
IT_DB=clickhouse bun run test:integration
IT_DB=mssql bun run test:integration
IT_DB=duckdb bun run test:integration
```
- Use for: changes to connection parameters, driver logic, SQL execution, table/schema metadata, DDL/DML, or type mapping.

3. Before committing (low frequency, but recommended)

- Run at least once:
```bash
IT_DB=all bun run test:integration
```
- The PR pipeline runs it again automatically as the final safety net.

### What this integration suite does and does not cover

- Covers:
  - Real connectivity of the Rust database layer
  - Common database operation paths (connect, create tables, query, metadata, DDL)
  - Driver compatibility and type-mapping issues
- Does not cover:
  - Click-through frontend UI interaction flows (that belongs to E2E/UI automation)
  - Purely visual styling issues

### When it is OK to skip the integration tests

- Changes that only touch copy, styling, or pure frontend presentation and do not affect database interaction.
- Changes completely unrelated to the database.
- Intermediate iterations during rapid development; run them once before merging.

### Container cleanup

- In default mode (without `IT_REUSE_LOCAL_DB=1`), tests spin up temporary containers via testcontainers and destroy them automatically when finished.
- With `IT_REUSE_LOCAL_DB=1`, tests connect to database instances you prepared manually and never delete your own containers.
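If a run is interrupted and a temporary container survives, it can be removed by hand. testcontainers labels the containers it manages, so a filter like the one below usually works — a sketch; verify the `org.testcontainers` label against the testcontainers version in use:

```bash
# Force-remove leftover testcontainers-managed containers.
leftover="$(docker ps -aq --filter "label=org.testcontainers=true")"
if [[ -n "${leftover}" ]]; then
  echo "${leftover}" | xargs docker rm -f
fi
```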

## Code formatting

```bash
2 changes: 1 addition & 1 deletion package.json
@@ -5,7 +5,7 @@
"url": "git+https://github.com/codeErrorSleep/dbpaw.git"
},
"private": true,
"version": "0.2.9",
"version": "0.3.0",
"type": "module",
"scripts": {
"dev": "vite",
97 changes: 97 additions & 0 deletions scripts/db-onboard.sh
@@ -0,0 +1,97 @@
#!/usr/bin/env bash
set -euo pipefail

if [[ $# -lt 1 ]]; then
  echo "Usage: scripts/db-onboard.sh <db> [--skip-gate] [--skip-matrix]"
  exit 1
fi

db="$1"
shift || true

skip_gate=0
skip_matrix=0

for arg in "$@"; do
  case "$arg" in
    --skip-gate)
      skip_gate=1
      ;;
    --skip-matrix)
      skip_matrix=1
      ;;
    *)
      echo "[error] unknown option: $arg"
      echo "Usage: scripts/db-onboard.sh <db> [--skip-gate] [--skip-matrix]"
      exit 1
      ;;
  esac
done

root_dir="$(cd "$(dirname "${BASH_SOURCE[0]}")/.." && pwd)"
cd "${root_dir}"

context_file="src-tauri/tests/common/${db}_context.rs"
integration_file="src-tauri/tests/${db}_integration.rs"
command_file="src-tauri/tests/${db}_command_integration.rs"
stateful_file="src-tauri/tests/${db}_stateful_command_integration.rs"
tracker_file="docs/zh/Development/MYSQL_TEST_COVERAGE_GAP_TRACKER.md"

echo "[step] scaffold check: ${db}"
missing=0
for file in "${context_file}" "${integration_file}" "${command_file}" "${stateful_file}"; do
  if [[ ! -f "${file}" ]]; then
    echo "[missing] ${file}"
    missing=1
  else
    echo "[ok] ${file}"
  fi
done

if [[ ${missing} -ne 0 ]]; then
  echo "[error] scaffold is incomplete for '${db}'."
  echo "[hint] finish scaffold first, then rerun scripts/db-onboard.sh ${db}"
  exit 1
fi

if [[ ${skip_gate} -eq 0 ]]; then
  echo "[step] gate syntax check"
  bash -n scripts/test-integration.sh

  echo "[step] compile smoke: ${db}_integration"
  cargo test --manifest-path src-tauri/Cargo.toml --test "${db}_integration" --no-run

  echo "[step] compile smoke: ${db}_command_integration"
  cargo test --manifest-path src-tauri/Cargo.toml --test "${db}_command_integration" --no-run

  echo "[step] compile smoke: ${db}_stateful_command_integration"
  cargo test --manifest-path src-tauri/Cargo.toml --test "${db}_stateful_command_integration" --no-run

  echo "[step] integration gate run: IT_DB=${db}"
  IT_DB="${db}" bash scripts/test-integration.sh
else
  echo "[skip] gate run skipped by --skip-gate"
fi

if [[ ${skip_matrix} -eq 0 ]]; then
  echo "[step] matrix sync check"
  test_count="$(rg -n "async fn test_${db}_" src-tauri/tests --glob "*.rs" || true)"
  test_count="$(printf "%s\n" "${test_count}" | sed '/^$/d' | wc -l | tr -d ' ')"
  echo "[info] detected test functions for ${db}: ${test_count}"
  if [[ -f "${tracker_file}" ]]; then
    tracker_hits="$(rg -n "test_${db}_" "${tracker_file}" || true)"
    tracker_hits="$(printf "%s\n" "${tracker_hits}" | sed '/^$/d' | wc -l | tr -d ' ')"
    if [[ "${tracker_hits}" -eq 0 ]]; then
      echo "[warn] tracker has no '${db}' test entries yet: ${tracker_file}"
      echo "[next] sync capability matrix and command coverage sections for '${db}'"
    else
      echo "[ok] tracker already contains ${tracker_hits} '${db}' test entries"
    fi
  else
    echo "[warn] tracker file not found: ${tracker_file}"
  fi
else
  echo "[skip] matrix sync check skipped by --skip-matrix"
fi

echo "[done] db onboarding pipeline finished for '${db}'"
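Typical invocations of the script above might look like this — `duckdb` is just an example database name, substitute the driver being onboarded:

```bash
scripts/db-onboard.sh duckdb                # full pipeline: scaffold check, gate, matrix sync
scripts/db-onboard.sh duckdb --skip-gate    # skip the compile-smoke and integration gate
scripts/db-onboard.sh duckdb --skip-matrix  # skip the tracker sync check
```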