From 01f1a6339ff2a4fa96b6707d1e811394012cab0c Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Wed, 4 Feb 2026 19:46:16 -0600 Subject: [PATCH 01/15] noxfile, docs: Add detailed coverage report generation and documentation updates - Updated `noxfile.py` to include coverage reports in HTML, XML, and Markdown, stored under `coverage/py<version>/`. - Extended `TESTING_GUIDELINES.md` and `AGENTS.md` to document new coverage report formats and locations. - Reinforced guidance on maintaining high client coverage for key methods and on testing serialization of optional payload branches. Assisted-by: Codex --- .gitignore | 1 + AGENTS.md | 11 +++++++++++ TESTING_GUIDELINES.md | 10 ++++++++++ noxfile.py | 8 ++++++++ 4 files changed, 30 insertions(+) diff --git a/.gitignore b/.gitignore index c104834d..78e07f63 100644 --- a/.gitignore +++ b/.gitignore @@ -453,3 +453,4 @@ $RECYCLE.BIN/ *.speedscope.json !pyrightconfig.json +/coverage/ diff --git a/AGENTS.md b/AGENTS.md index 1f55440a..21b60517 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -28,6 +28,10 @@ - `nox` executes pytest sessions with built-in parallelism; when invoking pytest directly use `pytest -n 8 --maxschedchunk 2` to mirror the parallel test scheduling and keep runtimes predictable. +- Coverage reports (XML/Markdown/HTML) are produced by the nox `tests` session + and stored under `coverage/py<version>/` (for example, + `coverage/py3.12/coverage.xml`, `coverage/py3.12/coverage.md`, + `coverage/py3.12/html/`). ## Coding Style & Naming Conventions @@ -138,6 +142,13 @@ description with the reason and the follow-up plan; otherwise reviewers should block the change. Treat this as a release gate on par with unit tests. +- **Client coverage criteria:** `PdfRestClient` and `AsyncPdfRestClient` are + customer-facing entry points and must retain high coverage. Every public + client method must have at least one unit test that exercises the REST call + path (MockTransport + request assertions), with distinct sync and async tests. 
+ Optional payload branches (`pages`, `output`, `rgb_color`, etc.) need explicit + coverage so serialization regressions are caught. + - Write pytest tests: files named `test_*.py`, test functions `test_*`, fixtures in `conftest.py` where shared. diff --git a/TESTING_GUIDELINES.md b/TESTING_GUIDELINES.md index f2603b20..473dc63d 100644 --- a/TESTING_GUIDELINES.md +++ b/TESTING_GUIDELINES.md @@ -13,6 +13,12 @@ iteration required. request customization, validation failures, file helpers, and live calls. Do not hide the transport behind a parameter; the test name itself should reveal which client is under test. +- **Maintain high client coverage.** `PdfRestClient` and `AsyncPdfRestClient` + are the primary customer-facing entry points. Every public client method must + have at least one unit test that exercises the REST call path (MockTransport + asserting method/path/headers/body). Optional payload branches (for example, + `pages`, `output`, `rgb_color`, and output-prefix fields) require explicit + tests so serialization differences are caught early. - **Check parity regularly.** Run `scripts/check_test_parity.sh` (defaults to `upstream/main..HEAD`) to spot missing sync/async counterparts, keeping parameterized test IDs aligned between transports. @@ -20,6 +26,10 @@ iteration required. `httpx.MockTransport`) validate serialization and local validation. Live suites prove the server behaves the same way, including invalid literal handling. +- **Know where coverage lands.** The nox `tests` session writes coverage reports + to `coverage/py<version>/` (XML, Markdown, and HTML). Example: + `coverage/py3.12/coverage.xml`, `coverage/py3.12/coverage.md`, + `coverage/py3.12/html/`. - **Reset global state per test.** Use `monkeypatch.delenv("PDFREST_API_KEY", raising=False)` (or `setenv`) so clients never inherit accidental API keys. 
Patch `importlib.metadata.version` diff --git a/noxfile.py b/noxfile.py index 261fbefb..48cd14ad 100644 --- a/noxfile.py +++ b/noxfile.py @@ -189,10 +189,18 @@ def tests(session: nox.Session) -> None: f"--python={session.virtualenv.location}", env={"UV_PROJECT_ENVIRONMENT": session.virtualenv.location}, ) + coverage_dir = PROJECT_ROOT / "coverage" / f"py{session.python}" + coverage_dir.mkdir(parents=True, exist_ok=True) + htmlcov_dir = coverage_dir / "html" + xml_report = coverage_dir / "coverage.xml" + md_report = coverage_dir / "coverage.md" _ = session.run( "pytest", "--cov=pdfrest", "--cov-report=term-missing", + f"--cov-report=html:{htmlcov_dir}", + f"--cov-report=xml:{xml_report}", + f"--cov-report=markdown:{md_report}", *pytest_args, ) From 80b85dedb864094ec01009f0243ffc130f1a966d Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Wed, 4 Feb 2026 19:46:32 -0600 Subject: [PATCH 02/15] github/workflows: Add step to upload coverage reports - Updated `test-and-publish.yml` to include coverage report uploads for all tested Python versions. - Stored coverage reports under `coverage/py<version>` for organized access and tracking. Assisted-by: Codex --- .github/workflows/test-and-publish.yml | 6 ++++++ 1 file changed, 6 insertions(+) diff --git a/.github/workflows/test-and-publish.yml b/.github/workflows/test-and-publish.yml index 4775d8cb..a34a98c8 100644 --- a/.github/workflows/test-and-publish.yml +++ b/.github/workflows/test-and-publish.yml @@ -38,6 +38,12 @@ jobs: run: uvx nox --python ${{ matrix.python-version }} --session tests -- --no-parallel env: PDFREST_API_KEY: ${{ secrets.PDFREST_API_KEY }} + - name: Upload coverage reports + if: always() + uses: actions/upload-artifact@v4 + with: + name: coverage-${{ matrix.python-version }} + path: coverage/py${{ matrix.python-version }} examples: name: Examples (Python ${{ matrix.python-version }}) From a64568c93f400b5ed2cb5c6ae5f81237662525ca Mon Sep 17 00:00:00 2001 From: "Kevin A. 
Mitchell" Date: Thu, 5 Feb 2026 10:04:57 -0600 Subject: [PATCH 03/15] pyproject, github/workflows: Add diff-cover for CI checks and dependencies - Added `diff-cover` (>=10.2.0) as a development dependency in `pyproject.toml`. - Modified `test-and-publish.yml` to add a new step for running `diff-cover` during pull request workflows. - Ensures new code meets a minimum 90% coverage threshold. - Generates a Markdown report for coverage in modified code sections. Assisted-by: Codex --- .github/workflows/test-and-publish.yml | 10 ++ pyproject.toml | 1 + uv.lock | 123 +++++++++++++++++++++++++ 3 files changed, 134 insertions(+) diff --git a/.github/workflows/test-and-publish.yml b/.github/workflows/test-and-publish.yml index a34a98c8..9a3f0f24 100644 --- a/.github/workflows/test-and-publish.yml +++ b/.github/workflows/test-and-publish.yml @@ -38,6 +38,16 @@ jobs: run: uvx nox --python ${{ matrix.python-version }} --session tests -- --no-parallel env: PDFREST_API_KEY: ${{ secrets.PDFREST_API_KEY }} + - name: Fetch base branch for diff-cover + if: github.event_name == 'pull_request' + run: git fetch origin ${{ github.base_ref }} --depth=1 + - name: Run diff-cover (new code must be >= 90%) + if: github.event_name == 'pull_request' + run: > + uv run diff-cover coverage/py${{ matrix.python-version }}/coverage.xml + --compare-branch origin/${{ github.base_ref }} + --fail-under 90 + --markdown-report coverage/py${{ matrix.python-version }}/diff-cover.md - name: Upload coverage reports if: always() uses: actions/upload-artifact@v4 diff --git a/pyproject.toml b/pyproject.toml index a28be645..c7c92469 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -34,6 +34,7 @@ dev = [ "nox>=2025.5.1", "basedpyright>=1.34.0", "python-dotenv>=1.0.1", + "diff-cover>=10.2.0", ] [tool.pytest.ini_options] diff --git a/uv.lock b/uv.lock index aa0a1f3c..d78991ea 100644 --- a/uv.lock +++ b/uv.lock @@ -110,6 +110,15 @@ wheels = [ { url = 
"https://files.pythonhosted.org/packages/c5/55/51844dd50c4fc7a33b653bfaba4c2456f06955289ca770a5dbd5fd267374/cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9", size = 7249, upload-time = "2023-08-12T20:38:16.269Z" }, ] +[[package]] +name = "chardet" +version = "5.2.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f3/0d/f7b6ab21ec75897ed80c17d79b15951a719226b9fababf1e40ea74d69079/chardet-5.2.0.tar.gz", hash = "sha256:1b3b6ff479a8c414bc3fa2c0852995695c4a026dcd6d0633b2dd092ca39c1cf7", size = 2069618, upload-time = "2023-08-01T19:23:02.662Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/38/6f/f5fbc992a329ee4e0f288c1fe0e2ad9485ed064cac731ed2fe47dcc38cbf/chardet-5.2.0-py3-none-any.whl", hash = "sha256:e1cf59446890a00105fe7b7912492ea04b6e6f06d4b742b2c788469e34c82970", size = 199385, upload-time = "2023-08-01T19:23:00.661Z" }, +] + [[package]] name = "charset-normalizer" version = "3.4.3" @@ -336,6 +345,21 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/99/c7/d1ec24fb280caa5a79b6b950db565dab30210a66259d17d5bb2b3a9f878d/dependency_groups-1.3.1-py3-none-any.whl", hash = "sha256:51aeaa0dfad72430fcfb7bcdbefbd75f3792e5919563077f30bc0d73f4493030", size = 8664, upload-time = "2025-05-02T00:34:27.085Z" }, ] +[[package]] +name = "diff-cover" +version = "10.2.0" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "chardet" }, + { name = "jinja2" }, + { name = "pluggy" }, + { name = "pygments" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/99/b4/eee71d1e338bc1f9bd3539b46b70e303dac061324b759c9a80fa3c96d90d/diff_cover-10.2.0.tar.gz", hash = "sha256:61bf83025f10510c76ef6a5820680cf61b9b974e8f81de70c57ac926fa63872a", size = 102473, upload-time = "2026-01-09T01:59:07.605Z" } +wheels = [ + { url = 
"https://files.pythonhosted.org/packages/b3/2c/61eeb887055a37150db824b6bf830e821a736580769ac2fea4eadb0d613f/diff_cover-10.2.0-py3-none-any.whl", hash = "sha256:59c328595e0b8948617cc5269af9e484c86462e2844bfcafa3fb37f8fca0af87", size = 56748, upload-time = "2026-01-09T01:59:06.028Z" }, +] + [[package]] name = "distlib" version = "0.4.0" @@ -439,6 +463,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" }, ] +[[package]] +name = "jinja2" +version = "3.1.6" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "markupsafe" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/df/bf/f7da0350254c0ed7c72f3e33cef02e048281fec7ecec5f032d4aac52226b/jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", size = 245115, upload-time = "2025-03-05T20:05:02.478Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" }, +] + [[package]] name = "langcodes" version = "3.5.1" @@ -472,6 +508,91 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" }, ] +[[package]] +name = "markupsafe" +version = "3.0.3" +source = { registry = "https://pypi.org/simple" } +sdist = { url = 
"https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/e8/4b/3541d44f3937ba468b75da9eebcae497dcf67adb65caa16760b0a6807ebb/markupsafe-3.0.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:2f981d352f04553a7171b8e44369f2af4055f888dfb147d55e42d29e29e74559", size = 11631, upload-time = "2025-09-27T18:36:05.558Z" }, + { url = "https://files.pythonhosted.org/packages/98/1b/fbd8eed11021cabd9226c37342fa6ca4e8a98d8188a8d9b66740494960e4/markupsafe-3.0.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e1c1493fb6e50ab01d20a22826e57520f1284df32f2d8601fdd90b6304601419", size = 12057, upload-time = "2025-09-27T18:36:07.165Z" }, + { url = "https://files.pythonhosted.org/packages/40/01/e560d658dc0bb8ab762670ece35281dec7b6c1b33f5fbc09ebb57a185519/markupsafe-3.0.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1ba88449deb3de88bd40044603fafffb7bc2b055d626a330323a9ed736661695", size = 22050, upload-time = "2025-09-27T18:36:08.005Z" }, + { url = "https://files.pythonhosted.org/packages/af/cd/ce6e848bbf2c32314c9b237839119c5a564a59725b53157c856e90937b7a/markupsafe-3.0.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f42d0984e947b8adf7dd6dde396e720934d12c506ce84eea8476409563607591", size = 20681, upload-time = "2025-09-27T18:36:08.881Z" }, + { url = "https://files.pythonhosted.org/packages/c9/2a/b5c12c809f1c3045c4d580b035a743d12fcde53cf685dbc44660826308da/markupsafe-3.0.3-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c0c0b3ade1c0b13b936d7970b1d37a57acde9199dc2aecc4c336773e1d86049c", size = 20705, upload-time = "2025-09-27T18:36:10.131Z" }, + { url = 
"https://files.pythonhosted.org/packages/cf/e3/9427a68c82728d0a88c50f890d0fc072a1484de2f3ac1ad0bfc1a7214fd5/markupsafe-3.0.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:0303439a41979d9e74d18ff5e2dd8c43ed6c6001fd40e5bf2e43f7bd9bbc523f", size = 21524, upload-time = "2025-09-27T18:36:11.324Z" }, + { url = "https://files.pythonhosted.org/packages/bc/36/23578f29e9e582a4d0278e009b38081dbe363c5e7165113fad546918a232/markupsafe-3.0.3-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:d2ee202e79d8ed691ceebae8e0486bd9a2cd4794cec4824e1c99b6f5009502f6", size = 20282, upload-time = "2025-09-27T18:36:12.573Z" }, + { url = "https://files.pythonhosted.org/packages/56/21/dca11354e756ebd03e036bd8ad58d6d7168c80ce1fe5e75218e4945cbab7/markupsafe-3.0.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:177b5253b2834fe3678cb4a5f0059808258584c559193998be2601324fdeafb1", size = 20745, upload-time = "2025-09-27T18:36:13.504Z" }, + { url = "https://files.pythonhosted.org/packages/87/99/faba9369a7ad6e4d10b6a5fbf71fa2a188fe4a593b15f0963b73859a1bbd/markupsafe-3.0.3-cp310-cp310-win32.whl", hash = "sha256:2a15a08b17dd94c53a1da0438822d70ebcd13f8c3a95abe3a9ef9f11a94830aa", size = 14571, upload-time = "2025-09-27T18:36:14.779Z" }, + { url = "https://files.pythonhosted.org/packages/d6/25/55dc3ab959917602c96985cb1253efaa4ff42f71194bddeb61eb7278b8be/markupsafe-3.0.3-cp310-cp310-win_amd64.whl", hash = "sha256:c4ffb7ebf07cfe8931028e3e4c85f0357459a3f9f9490886198848f4fa002ec8", size = 15056, upload-time = "2025-09-27T18:36:16.125Z" }, + { url = "https://files.pythonhosted.org/packages/d0/9e/0a02226640c255d1da0b8d12e24ac2aa6734da68bff14c05dd53b94a0fc3/markupsafe-3.0.3-cp310-cp310-win_arm64.whl", hash = "sha256:e2103a929dfa2fcaf9bb4e7c091983a49c9ac3b19c9061b6d5427dd7d14d81a1", size = 13932, upload-time = "2025-09-27T18:36:17.311Z" }, + { url = 
"https://files.pythonhosted.org/packages/08/db/fefacb2136439fc8dd20e797950e749aa1f4997ed584c62cfb8ef7c2be0e/markupsafe-3.0.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:1cc7ea17a6824959616c525620e387f6dd30fec8cb44f649e31712db02123dad", size = 11631, upload-time = "2025-09-27T18:36:18.185Z" }, + { url = "https://files.pythonhosted.org/packages/e1/2e/5898933336b61975ce9dc04decbc0a7f2fee78c30353c5efba7f2d6ff27a/markupsafe-3.0.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4bd4cd07944443f5a265608cc6aab442e4f74dff8088b0dfc8238647b8f6ae9a", size = 12058, upload-time = "2025-09-27T18:36:19.444Z" }, + { url = "https://files.pythonhosted.org/packages/1d/09/adf2df3699d87d1d8184038df46a9c80d78c0148492323f4693df54e17bb/markupsafe-3.0.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b5420a1d9450023228968e7e6a9ce57f65d148ab56d2313fcd589eee96a7a50", size = 24287, upload-time = "2025-09-27T18:36:20.768Z" }, + { url = "https://files.pythonhosted.org/packages/30/ac/0273f6fcb5f42e314c6d8cd99effae6a5354604d461b8d392b5ec9530a54/markupsafe-3.0.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0bf2a864d67e76e5c9a34dc26ec616a66b9888e25e7b9460e1c76d3293bd9dbf", size = 22940, upload-time = "2025-09-27T18:36:22.249Z" }, + { url = "https://files.pythonhosted.org/packages/19/ae/31c1be199ef767124c042c6c3e904da327a2f7f0cd63a0337e1eca2967a8/markupsafe-3.0.3-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc51efed119bc9cfdf792cdeaa4d67e8f6fcccab66ed4bfdd6bde3e59bfcbb2f", size = 21887, upload-time = "2025-09-27T18:36:23.535Z" }, + { url = "https://files.pythonhosted.org/packages/b2/76/7edcab99d5349a4532a459e1fe64f0b0467a3365056ae550d3bcf3f79e1e/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a", size = 23692, upload-time = "2025-09-27T18:36:24.823Z" }, + { url = 
"https://files.pythonhosted.org/packages/a4/28/6e74cdd26d7514849143d69f0bf2399f929c37dc2b31e6829fd2045b2765/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:7be7b61bb172e1ed687f1754f8e7484f1c8019780f6f6b0786e76bb01c2ae115", size = 21471, upload-time = "2025-09-27T18:36:25.95Z" }, + { url = "https://files.pythonhosted.org/packages/62/7e/a145f36a5c2945673e590850a6f8014318d5577ed7e5920a4b3448e0865d/markupsafe-3.0.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:f9e130248f4462aaa8e2552d547f36ddadbeaa573879158d721bbd33dfe4743a", size = 22923, upload-time = "2025-09-27T18:36:27.109Z" }, + { url = "https://files.pythonhosted.org/packages/0f/62/d9c46a7f5c9adbeeeda52f5b8d802e1094e9717705a645efc71b0913a0a8/markupsafe-3.0.3-cp311-cp311-win32.whl", hash = "sha256:0db14f5dafddbb6d9208827849fad01f1a2609380add406671a26386cdf15a19", size = 14572, upload-time = "2025-09-27T18:36:28.045Z" }, + { url = "https://files.pythonhosted.org/packages/83/8a/4414c03d3f891739326e1783338e48fb49781cc915b2e0ee052aa490d586/markupsafe-3.0.3-cp311-cp311-win_amd64.whl", hash = "sha256:de8a88e63464af587c950061a5e6a67d3632e36df62b986892331d4620a35c01", size = 15077, upload-time = "2025-09-27T18:36:29.025Z" }, + { url = "https://files.pythonhosted.org/packages/35/73/893072b42e6862f319b5207adc9ae06070f095b358655f077f69a35601f0/markupsafe-3.0.3-cp311-cp311-win_arm64.whl", hash = "sha256:3b562dd9e9ea93f13d53989d23a7e775fdfd1066c33494ff43f5418bc8c58a5c", size = 13876, upload-time = "2025-09-27T18:36:29.954Z" }, + { url = "https://files.pythonhosted.org/packages/5a/72/147da192e38635ada20e0a2e1a51cf8823d2119ce8883f7053879c2199b5/markupsafe-3.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e", size = 11615, upload-time = "2025-09-27T18:36:30.854Z" }, + { url = 
"https://files.pythonhosted.org/packages/9a/81/7e4e08678a1f98521201c3079f77db69fb552acd56067661f8c2f534a718/markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce", size = 12020, upload-time = "2025-09-27T18:36:31.971Z" }, + { url = "https://files.pythonhosted.org/packages/1e/2c/799f4742efc39633a1b54a92eec4082e4f815314869865d876824c257c1e/markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d", size = 24332, upload-time = "2025-09-27T18:36:32.813Z" }, + { url = "https://files.pythonhosted.org/packages/3c/2e/8d0c2ab90a8c1d9a24f0399058ab8519a3279d1bd4289511d74e909f060e/markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d", size = 22947, upload-time = "2025-09-27T18:36:33.86Z" }, + { url = "https://files.pythonhosted.org/packages/2c/54/887f3092a85238093a0b2154bd629c89444f395618842e8b0c41783898ea/markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a", size = 21962, upload-time = "2025-09-27T18:36:35.099Z" }, + { url = "https://files.pythonhosted.org/packages/c9/2f/336b8c7b6f4a4d95e91119dc8521402461b74a485558d8f238a68312f11c/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b", size = 23760, upload-time = "2025-09-27T18:36:36.001Z" }, + { url = "https://files.pythonhosted.org/packages/32/43/67935f2b7e4982ffb50a4d169b724d74b62a3964bc1a9a527f5ac4f1ee2b/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f", size = 21529, upload-time = "2025-09-27T18:36:36.906Z" }, + { url = 
"https://files.pythonhosted.org/packages/89/e0/4486f11e51bbba8b0c041098859e869e304d1c261e59244baa3d295d47b7/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b", size = 23015, upload-time = "2025-09-27T18:36:37.868Z" }, + { url = "https://files.pythonhosted.org/packages/2f/e1/78ee7a023dac597a5825441ebd17170785a9dab23de95d2c7508ade94e0e/markupsafe-3.0.3-cp312-cp312-win32.whl", hash = "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d", size = 14540, upload-time = "2025-09-27T18:36:38.761Z" }, + { url = "https://files.pythonhosted.org/packages/aa/5b/bec5aa9bbbb2c946ca2733ef9c4ca91c91b6a24580193e891b5f7dbe8e1e/markupsafe-3.0.3-cp312-cp312-win_amd64.whl", hash = "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c", size = 15105, upload-time = "2025-09-27T18:36:39.701Z" }, + { url = "https://files.pythonhosted.org/packages/e5/f1/216fc1bbfd74011693a4fd837e7026152e89c4bcf3e77b6692fba9923123/markupsafe-3.0.3-cp312-cp312-win_arm64.whl", hash = "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f", size = 13906, upload-time = "2025-09-27T18:36:40.689Z" }, + { url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" }, + { url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" }, + { url = 
"https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" }, + { url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" }, + { url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" }, + { url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" }, + { url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" }, + { url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" }, + { url 
= "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" }, + { url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" }, + { url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" }, + { url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" }, + { url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" }, + { url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" }, + { url = 
"https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" }, + { url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" }, + { url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" }, + { url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" }, + { url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" }, + { url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" }, + { url = 
"https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" }, + { url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" }, + { url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" }, + { url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" }, + { url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" }, + { url = "https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" }, + { url = 
"https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" }, + { url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = "2025-09-27T18:37:12.48Z" }, + { url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" }, + { url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" }, + { url = "https://files.pythonhosted.org/packages/46/11/f333a06fc16236d5238bfe74daccbca41459dcd8d1fa952e8fbd5dccfb70/markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", size = 14747, upload-time = "2025-09-27T18:37:15.36Z" }, + { url = "https://files.pythonhosted.org/packages/28/52/182836104b33b444e400b14f797212f720cbc9ed6ba34c800639d154e821/markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", size = 15341, upload-time = "2025-09-27T18:37:16.496Z" }, + { url = 
"https://files.pythonhosted.org/packages/6f/18/acf23e91bd94fd7b3031558b1f013adfa21a8e407a3fdb32745538730382/markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", size = 14073, upload-time = "2025-09-27T18:37:17.476Z" }, + { url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" }, + { url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" }, + { url = "https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" }, + { url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" }, + { url = "https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" }, + { url = 
"https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = "2025-09-27T18:37:23.296Z" }, + { url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" }, + { url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" }, + { url = "https://files.pythonhosted.org/packages/fb/df/5bd7a48c256faecd1d36edc13133e51397e41b73bb77e1a69deab746ebac/markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", size = 14819, upload-time = "2025-09-27T18:37:26.285Z" }, + { url = "https://files.pythonhosted.org/packages/1a/8a/0402ba61a2f16038b48b39bccca271134be00c5c9f0f623208399333c448/markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", size = 15426, upload-time = "2025-09-27T18:37:27.316Z" }, + { url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" }, +] + [[package]] name = "mdurl" version = "0.1.2" @@ -617,6 +738,7 @@ dependencies = [ [package.dev-dependencies] dev = [ { name = "basedpyright" }, + { 
name = "diff-cover" }, { name = "nox" }, { name = "pip-audit" }, { name = "pre-commit" }, @@ -643,6 +765,7 @@ requires-dist = [ [package.metadata.requires-dev] dev = [ { name = "basedpyright", specifier = ">=1.34.0" }, + { name = "diff-cover", specifier = ">=10.2.0" }, { name = "nox", specifier = ">=2025.5.1" }, { name = "pip-audit", specifier = ">=2.7.3" }, { name = "pre-commit", specifier = ">=3.7.0" }, From de6c2b2455183bd8eedcf33ac1cbb35fec93125a Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 10:37:22 -0600 Subject: [PATCH 04/15] pyrightconfig: Add `scripts` to `include` and new root directory configuration - Added `scripts` as a root directory for Pyright with default settings. - Retained configurations for `examples` with Python 3.11 specified. --- pyrightconfig.json | 3 +++ 1 file changed, 3 insertions(+) diff --git a/pyrightconfig.json b/pyrightconfig.json index 36d3b75a..5626ce3b 100644 --- a/pyrightconfig.json +++ b/pyrightconfig.json @@ -22,6 +22,9 @@ { "root": "src" }, + { + "root": "scripts" + }, { "root": "examples", "pythonVersion": "3.11" From 579ccca3441bbbac81838bfc9446898da1cc6938 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 10:38:06 -0600 Subject: [PATCH 05/15] scripts: Add `check_class_function_coverage.py` for class-specific coverage checks - Introduced a Python script to analyze function coverage for specified classes using `coverage.py` JSON output. - Validates minimum coverage thresholds for each function and identifies uncovered classes or insufficiently covered functions. - Supports generating a Markdown report for detailed coverage insights. - Handles invalid or missing JSON inputs gracefully with meaningful error messages. 
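For reference, the per-function threshold check the script applies can be sketched against a minimal `coverage.json` shape (field names follow coverage.py's JSON report; the file path, class name, and values below are illustrative):

```python
# Minimal sketch of the data shape consumed and the threshold check applied;
# the real script also reports classes that matched no functions at all.
data = {
    "files": {
        "src/pdfrest/client.py": {  # hypothetical path
            "functions": {
                "PdfRestClient.ocr_pdf": {
                    "summary": {
                        "percent_covered": 87.5,
                        "covered_lines": 7,
                        "num_statements": 8,
                    }
                }
            }
        }
    }
}

threshold = 90.0
failures = [
    name
    for file_info in data["files"].values()
    for name, info in (file_info.get("functions") or {}).items()
    # coverage.py reports methods as "ClassName.method", so a prefix
    # check selects the functions belonging to the requested classes.
    if name.startswith("PdfRestClient.")
    and info["summary"]["percent_covered"] + 1e-9 < threshold
]
print(failures)  # ['PdfRestClient.ocr_pdf']
```

The script exits 0 when every matched function meets the threshold, 1 on insufficient coverage or classes with no discovered functions, and 2 when no classes are specified.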
Assisted-by: Codex --- scripts/check_class_function_coverage.py | 223 +++++++++++++++++++++++ 1 file changed, 223 insertions(+) create mode 100644 scripts/check_class_function_coverage.py diff --git a/scripts/check_class_function_coverage.py b/scripts/check_class_function_coverage.py new file mode 100644 index 00000000..ca235e3f --- /dev/null +++ b/scripts/check_class_function_coverage.py @@ -0,0 +1,223 @@ +#!/usr/bin/env python3 +from __future__ import annotations + +import argparse +import json +import sys +from collections.abc import Iterable +from dataclasses import dataclass +from pathlib import Path + + +@dataclass(frozen=True) +class FunctionCoverage: + file: str + function: str + percent_covered: float + covered_lines: int + num_statements: int + + +def _parse_classes(values: Iterable[str]) -> list[str]: + classes: list[str] = [] + for value in values: + if not value: + continue + for item in value.split(","): + item = item.strip() + if item: + classes.append(item) + return classes + + +def _load_coverage(path: Path) -> dict: + try: + data = json.loads(path.read_text(encoding="utf-8")) + except FileNotFoundError: + message = f"Coverage JSON not found: {path}" + raise SystemExit(message) from None + except json.JSONDecodeError as exc: + message = f"Invalid coverage JSON: {exc}" + raise SystemExit(message) from exc + + if not isinstance(data, dict) or "files" not in data: + message = "Coverage JSON missing 'files' data." + raise SystemExit(message) + return data + + +def _match_class(function_name: str, classes: Iterable[str]) -> str | None: + for class_name in classes: + prefix = f"{class_name}." 
+ if function_name.startswith(prefix): + return class_name + return None + + +def _collect_function_coverage( + data: dict, classes: list[str] +) -> tuple[list[FunctionCoverage], set[str]]: + results: list[FunctionCoverage] = [] + matched_classes: set[str] = set() + + for file_name, file_info in data.get("files", {}).items(): + functions = file_info.get("functions") or {} + for function_name, function_info in functions.items(): + if not function_name or "." not in function_name: + continue + class_name = _match_class(function_name, classes) + if not class_name: + continue + + summary = function_info.get("summary") or {} + percent = summary.get("percent_covered") + covered_lines = summary.get("covered_lines") + num_statements = summary.get("num_statements") + if percent is None or covered_lines is None or num_statements is None: + continue + + matched_classes.add(class_name) + results.append( + FunctionCoverage( + file=file_name, + function=function_name, + percent_covered=float(percent), + covered_lines=int(covered_lines), + num_statements=int(num_statements), + ) + ) + + return results, matched_classes + + +def _parse_args() -> argparse.Namespace: + parser = argparse.ArgumentParser( + description=( + "Check function coverage for methods in specified classes using coverage.py JSON data." 
+ ) + ) + parser.add_argument("coverage_json", type=Path, help="Path to coverage.json") + parser.add_argument( + "--class", + dest="classes", + action="append", + default=[], + help="Class name to check (repeatable).", + ) + parser.add_argument( + "--classes", + dest="classes_csv", + default="", + help="Comma-separated class names to check.", + ) + parser.add_argument( + "--fail-under", + "--min-coverage", + dest="fail_under", + type=float, + required=True, + help="Minimum coverage percentage required for each function.", + ) + parser.add_argument( + "--markdown-report", + dest="markdown_report", + type=Path, + default=None, + help="Write a markdown report to the given path.", + ) + return parser.parse_args() + + +def _print_summary( + classes: list[str], + threshold: float, + functions: list[FunctionCoverage], + missing_classes: list[str], + failures: list[FunctionCoverage], +) -> None: + print(f"Minimum required coverage: {threshold:.2f}%") + print(f"Classes checked: {', '.join(classes)}") + print(f"Functions checked: {len(functions)}") + + if missing_classes: + print("Classes with no discovered functions:") + for cls in missing_classes: + print(f"- {cls}") + + if failures: + print("Functions with insufficient coverage:") + for fn in sorted(failures, key=lambda item: (item.file, item.function)): + print( + f"- {fn.file} :: {fn.function} -> {fn.percent_covered:.2f}% " + f"({fn.covered_lines}/{fn.num_statements} lines)" + ) + + +def _build_report_lines( + classes: list[str], + threshold: float, + functions: list[FunctionCoverage], + missing_classes: list[str], + failures: list[FunctionCoverage], +) -> list[str]: + report_lines = [ + "# Function Coverage Report", + "", + f"- Minimum required coverage: {threshold:.2f}%", + f"- Classes checked: {', '.join(classes)}", + f"- Functions checked: {len(functions)}", + ] + if missing_classes: + report_lines.append("- Classes with no discovered functions:") + report_lines.extend(f" - {cls}" for cls in missing_classes) + if 
failures: + report_lines.append("- Functions with insufficient coverage:") + report_lines.extend( + f" - {fn.file} :: {fn.function} -> {fn.percent_covered:.2f}% " + f"({fn.covered_lines}/{fn.num_statements} lines)" + for fn in sorted(failures, key=lambda item: (item.file, item.function)) + ) + else: + report_lines.append("- All checked functions meet the coverage threshold.") + return report_lines + + +def _write_markdown_report(path: Path, report_lines: list[str]) -> None: + path.write_text("\n".join(report_lines) + "\n", encoding="utf-8") + + +def main() -> int: + args = _parse_args() + + classes = _parse_classes([*args.classes, args.classes_csv]) + if not classes: + print("No classes specified. Use --class or --classes.", file=sys.stderr) + return 2 + + data = _load_coverage(args.coverage_json) + functions, matched_classes = _collect_function_coverage(data, classes) + + missing_classes = [cls for cls in classes if cls not in matched_classes] + if not functions: + print("No functions found for the requested classes.") + + threshold = args.fail_under + failures: list[FunctionCoverage] = [ + fn for fn in functions if fn.percent_covered + 1e-9 < threshold + ] + + _print_summary(classes, threshold, functions, missing_classes, failures) + + if args.markdown_report: + report_lines = _build_report_lines( + classes, threshold, functions, missing_classes, failures + ) + _write_markdown_report(args.markdown_report, report_lines) + + if failures or missing_classes: + return 1 + return 0 + + +if __name__ == "__main__": + raise SystemExit(main()) From 03c4ce0434d5c1be999f95dc72f3f06d022648e2 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 10:38:54 -0600 Subject: [PATCH 06/15] noxfile, README: Add per-class coverage check functionality - Updated `noxfile.py` with a new `class-coverage` session to analyze coverage for specified client classes using `coverage.json`. 
- Updated README with instructions for using `class-coverage` to monitor per-function coverage and reuse existing coverage data without rerunning tests. - Added utility methods in `noxfile.py` to streamline dependency installation, pytest argument management, and coverage file handling. Assisted-by: Codex --- README.md | 9 ++++ noxfile.py | 145 +++++++++++++++++++++++++++++++++++++++++------------ 2 files changed, 122 insertions(+), 32 deletions(-) diff --git a/README.md b/README.md index 3a454d4f..360ed8b7 100644 --- a/README.md +++ b/README.md @@ -37,6 +37,15 @@ Run the test suite with: uv run pytest ``` +Check per-function coverage for the client classes: + +```bash +uvx nox -s class-coverage +``` + +To reuse an existing `coverage/py/coverage.json` without rerunning +tests, add `-- --no-tests` (and optional `--coverage-json path`). + Check sync/async parity for changed tests (defaults to `upstream/main..HEAD`): ```bash diff --git a/noxfile.py b/noxfile.py index 48cd14ad..32f76117 100644 --- a/noxfile.py +++ b/noxfile.py @@ -17,6 +17,74 @@ PROJECT_ROOT = Path(__file__).resolve().parent DEFAULT_EXAMPLE_PYTHON = "3.11" EXAMPLES_DIR = PROJECT_ROOT / "examples" +DEFAULT_COVERAGE_CLASSES = ("PdfRestClient", "AsyncPdfRestClient") + + +def _install_test_dependencies(session: nox.Session) -> None: + _ = session.run_install( + "uv", + "sync", + "--no-default-groups", + "--group=dev", + "--reinstall-package=pdfrest", + f"--python={session.virtualenv.location}", + env={"UV_PROJECT_ENVIRONMENT": session.virtualenv.location}, + ) + + +def _coverage_dir_for_session(session: nox.Session) -> Path: + coverage_dir = PROJECT_ROOT / "coverage" / f"py{session.python}" + coverage_dir.mkdir(parents=True, exist_ok=True) + return coverage_dir + + +def _pytest_args_from_session(session: nox.Session) -> list[str]: + parser = argparse.ArgumentParser(add_help=False) + _ = parser.add_argument("--no-parallel", action="store_true") + _ = parser.add_argument("-n", "--workers", "--numprocesses") 
+ custom, remaining = parser.parse_known_args(session.posargs) + + pytest_args = list(remaining) + + if custom.no_parallel: + return pytest_args + if custom.workers: + pytest_args[:0] = ["-n", custom.workers, "--maxschedchunk", "2"] + else: + pytest_args[:0] = ["-n", "8", "--maxschedchunk", "2"] + + return pytest_args + + +def _run_pytest_with_coverage(session: nox.Session, pytest_args: Iterable[str]) -> Path: + coverage_dir = _coverage_dir_for_session(session) + htmlcov_dir = coverage_dir / "html" + xml_report = coverage_dir / "coverage.xml" + md_report = coverage_dir / "coverage.md" + json_report = coverage_dir / "coverage.json" + _ = session.run( + "pytest", + "--cov=pdfrest", + "--cov-report=term-missing", + f"--cov-report=html:{htmlcov_dir}", + f"--cov-report=xml:{xml_report}", + f"--cov-report=markdown:{md_report}", + f"--cov-report=json:{json_report}", + *pytest_args, + ) + return coverage_dir + + +def _parse_class_values(values: Iterable[str]) -> list[str]: + classes: list[str] = [] + for value in values: + if not value: + continue + for item in value.split(","): + item = item.strip() + if item: + classes.append(item) + return classes @dataclass(frozen=True) @@ -162,48 +230,61 @@ def _infer_python_version_from_path(script: Path) -> str | None: @nox.session(name="tests", python=python_versions, reuse_venv=True) def tests(session: nox.Session) -> None: - # Define only custom flags + pytest_args = _pytest_args_from_session(session) + + _install_test_dependencies(session) + _ = _run_pytest_with_coverage(session, pytest_args) + + +@nox.session(name="class-coverage", python=python_versions, reuse_venv=True) +def class_coverage(session: nox.Session) -> None: parser = argparse.ArgumentParser(add_help=False) _ = parser.add_argument("--no-parallel", action="store_true") - _ = parser.add_argument( - "-n", "--workers", "--numprocesses" - ) # e.g., -n 4 to set workers + _ = parser.add_argument("-n", "--workers", "--numprocesses") + _ = parser.add_argument("--no-tests", 
action="store_true") + _ = parser.add_argument("--coverage-json", type=Path, default=None) + _ = parser.add_argument("--markdown-report", type=Path, default=None) + _ = parser.add_argument("--fail-under", type=float, default=90.0) + _ = parser.add_argument("--class", dest="classes", action="append", default=[]) + _ = parser.add_argument("--classes", dest="classes_csv", default="") custom, remaining = parser.parse_known_args(session.posargs) pytest_args = list(remaining) + if not custom.no_parallel: + if custom.workers: + pytest_args[:0] = ["-n", custom.workers, "--maxschedchunk", "2"] + else: + pytest_args[:0] = ["-n", "8", "--maxschedchunk", "2"] - # Default to parallel unless disabled or overridden - if custom.no_parallel: - pass - elif custom.workers: - pytest_args[:0] = ["-n", custom.workers, "--maxschedchunk", "2"] + if custom.no_tests: + coverage_dir = _coverage_dir_for_session(session) else: - pytest_args[:0] = ["-n", "8", "--maxschedchunk", "2"] + _install_test_dependencies(session) + coverage_dir = _run_pytest_with_coverage(session, pytest_args) - _ = session.run_install( - "uv", - "sync", - "--no-default-groups", - "--group=dev", - "--reinstall-package=pdfrest", - f"--python={session.virtualenv.location}", - env={"UV_PROJECT_ENVIRONMENT": session.virtualenv.location}, - ) - coverage_dir = PROJECT_ROOT / "coverage" / f"py{session.python}" - coverage_dir.mkdir(parents=True, exist_ok=True) - htmlcov_dir = coverage_dir / "html" - xml_report = coverage_dir / "coverage.xml" - md_report = coverage_dir / "coverage.md" - _ = session.run( - "pytest", - "--cov=pdfrest", - "--cov-report=term-missing", - f"--cov-report=html:{htmlcov_dir}", - f"--cov-report=xml:{xml_report}", - f"--cov-report=markdown:{md_report}", - *pytest_args, + coverage_json = custom.coverage_json or (coverage_dir / "coverage.json") + markdown_report = custom.markdown_report or ( + coverage_dir / "class-function-coverage.md" ) + classes = _parse_class_values([*custom.classes, custom.classes_csv]) 
+ if not classes: + classes = list(DEFAULT_COVERAGE_CLASSES) + + script_args = [ + "python", + str(PROJECT_ROOT / "scripts" / "check_class_function_coverage.py"), + str(coverage_json), + "--fail-under", + f"{custom.fail_under}", + "--markdown-report", + str(markdown_report), + ] + for class_name in classes: + script_args.extend(["--class", class_name]) + + _ = session.run(*script_args) + @nox.session(name="examples", python=python_versions, reuse_venv=True) def run_examples(session: nox.Session) -> None: From 4bb65b92e5cdb91609fedaa911347e587df68c03 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 10:39:16 -0600 Subject: [PATCH 07/15] github/workflows: Add per-class function coverage checks to CI - Updated `test-and-publish.yml` with a new step to check class function coverage for `PdfRestClient` and `AsyncPdfRestClient`. - Ensured minimum 90% coverage thresholds for specified classes. - Configured Markdown reporting for detailed class function coverage data. Assisted-by: Codex --- .github/workflows/test-and-publish.yml | 8 ++++++++ 1 file changed, 8 insertions(+) diff --git a/.github/workflows/test-and-publish.yml b/.github/workflows/test-and-publish.yml index 9a3f0f24..07b58764 100644 --- a/.github/workflows/test-and-publish.yml +++ b/.github/workflows/test-and-publish.yml @@ -48,6 +48,14 @@ jobs: --compare-branch origin/${{ github.base_ref }} --fail-under 90 --markdown-report coverage/py${{ matrix.python-version }}/diff-cover.md + - name: Check client class function coverage + run: > + uv run python scripts/check_class_function_coverage.py + coverage/py${{ matrix.python-version }}/coverage.json + --class PdfRestClient + --class AsyncPdfRestClient + --fail-under 90 + --markdown-report coverage/py${{ matrix.python-version }}/class-function-coverage.md - name: Upload coverage reports if: always() uses: actions/upload-artifact@v4 From 6788d0bcee802e82a4875db9bfa96c495a4ac948 Mon Sep 17 00:00:00 2001 From: "Kevin A. 
Mitchell" Date: Thu, 5 Feb 2026 10:55:13 -0600 Subject: [PATCH 08/15] tests: Add optional-branch coverage for client payloads - Add translate-to-file tests with pages/output defaults verified - Add async summarize/markdown/ocr/extract payload branch coverage - Add async redaction apply coverage for rgb_color serialization Assisted-by: Codex --- tests/test_convert_to_markdown.py | 57 ++++++++++++ tests/test_extract_pdf_text_to_file.py | 58 ++++++++++++ tests/test_ocr_pdf.py | 51 ++++++++++ tests/test_pdf_redaction_apply.py | 69 +++++++++++++- tests/test_summarize_pdf_text.py | 59 ++++++++++++ tests/test_translate_pdf_text.py | 123 +++++++++++++++++++++++++ 6 files changed, 415 insertions(+), 2 deletions(-) diff --git a/tests/test_convert_to_markdown.py b/tests/test_convert_to_markdown.py index 140a1c22..e9acd80c 100644 --- a/tests/test_convert_to_markdown.py +++ b/tests/test_convert_to_markdown.py @@ -304,6 +304,63 @@ def handler(request: httpx.Request) -> httpx.Response: assert timeout_value == pytest.approx(0.4) +@pytest.mark.asyncio +async def test_async_convert_to_markdown_includes_pages_and_output( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.delenv("PDFREST_API_KEY", raising=False) + input_file = make_pdf_file(PdfRestFileID.generate(2)) + output_id = str(PdfRestFileID.generate()) + payload_dump = ConvertToMarkdownPayload.model_validate( + { + "files": [input_file], + "output_type": "file", + "page_break_comments": "on", + "pages": ["2-4"], + "output": "async-md", + } + ).model_dump(mode="json", by_alias=True, exclude_none=True, exclude_unset=True) + + seen: dict[str, int] = {"post": 0, "get": 0} + + def handler(request: httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path == "/markdown": + seen["post"] += 1 + payload = json.loads(request.content.decode("utf-8")) + assert payload == payload_dump + return httpx.Response( + 200, + json={ + "inputId": [str(input_file.id)], + "outputId": [output_id], + }, + ) + if 
request.method == "GET" and request.url.path == f"/resource/{output_id}": + seen["get"] += 1 + assert request.url.params["format"] == "info" + return httpx.Response( + 200, + json=_make_markdown_file(output_id, "async-pages.md").model_dump( + mode="json", by_alias=True + ), + ) + msg = f"Unexpected request {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY, transport=transport) as client: + response = await client.convert_to_markdown( + input_file, + pages=["2-4"], + output="async-md", + page_break_comments="on", + ) + + assert seen == {"post": 1, "get": 1} + assert isinstance(response, PdfRestFileBasedResponse) + assert len(response.output_files) == 1 + + @pytest.mark.asyncio async def test_async_convert_to_markdown_success( monkeypatch: pytest.MonkeyPatch, diff --git a/tests/test_extract_pdf_text_to_file.py b/tests/test_extract_pdf_text_to_file.py index fb875fc9..ef608ad5 100644 --- a/tests/test_extract_pdf_text_to_file.py +++ b/tests/test_extract_pdf_text_to_file.py @@ -344,3 +344,61 @@ def handler(request: httpx.Request) -> httpx.Response: assert isinstance(response, PdfRestFileBasedResponse) assert len(response.output_files) == 1 assert response.input_id == input_file.id + + +@pytest.mark.asyncio +async def test_async_extract_pdf_text_to_file_includes_pages( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.delenv("PDFREST_API_KEY", raising=False) + input_file = make_pdf_file(PdfRestFileID.generate(2)) + output_id = str(PdfRestFileID.generate()) + payload_dump = ExtractTextPayload.model_validate( + { + "files": [input_file], + "full_text": "document", + "preserve_line_breaks": "off", + "word_style": "off", + "word_coordinates": "off", + "output_type": "file", + "pages": ["2-3"], + } + ).model_dump(mode="json", by_alias=True, exclude_none=True, exclude_unset=True) + + seen: dict[str, int] = {"post": 0, "get": 0} + + def handler(request: 
httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path == "/extracted-text": + seen["post"] += 1 + payload = json.loads(request.content.decode("utf-8")) + assert payload == payload_dump + return httpx.Response( + 200, + json={ + "inputId": [str(input_file.id)], + "outputId": [output_id], + }, + ) + if request.method == "GET" and request.url.path == f"/resource/{output_id}": + seen["get"] += 1 + assert request.url.params["format"] == "info" + return httpx.Response( + 200, + json=_make_text_file(output_id, "async-pages.txt").model_dump( + mode="json", by_alias=True + ), + ) + msg = f"Unexpected request {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY, transport=transport) as client: + response = await client.extract_pdf_text_to_file( + input_file, + pages=["2-3"], + ) + + assert seen == {"post": 1, "get": 1} + assert isinstance(response, PdfRestFileBasedResponse) + assert len(response.output_files) == 1 + assert response.input_id == input_file.id diff --git a/tests/test_ocr_pdf.py b/tests/test_ocr_pdf.py index fe55df9e..a3b8b385 100644 --- a/tests/test_ocr_pdf.py +++ b/tests/test_ocr_pdf.py @@ -281,6 +281,57 @@ def handler(request: httpx.Request) -> httpx.Response: assert timeout_value == pytest.approx(0.4) +@pytest.mark.asyncio +async def test_async_ocr_pdf_includes_pages(monkeypatch: pytest.MonkeyPatch) -> None: + monkeypatch.delenv("PDFREST_API_KEY", raising=False) + input_file = make_pdf_file(PdfRestFileID.generate(2)) + payload_dump = OcrPdfPayload.model_validate( + { + "files": [input_file], + "pages": ["1-2"], + "languages": ["English"], + } + ).model_dump(mode="json", by_alias=True, exclude_none=True, exclude_unset=True) + output_id = str(PdfRestFileID.generate()) + + seen: dict[str, int] = {"post": 0, "get": 0} + + def handler(request: httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path 
== "/pdf-with-ocr-text": + seen["post"] += 1 + payload = json.loads(request.content.decode("utf-8")) + assert payload == payload_dump + return httpx.Response( + 200, + json={ + "inputId": str(input_file.id), + "outputId": output_id, + }, + ) + if request.method == "GET" and request.url.path == f"/resource/{output_id}": + seen["get"] += 1 + return httpx.Response( + 200, + json=make_pdf_file(output_id, "async-ocr.pdf").model_dump( + mode="json", by_alias=True + ), + ) + msg = f"Unexpected request {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY, transport=transport) as client: + response = await client.ocr_pdf( + input_file, + pages=["1-2"], + languages=["English"], + ) + + assert seen == {"post": 1, "get": 1} + assert isinstance(response, PdfRestFileBasedResponse) + assert response.output_file.id == output_id + + @pytest.mark.asyncio async def test_async_ocr_pdf_success( monkeypatch: pytest.MonkeyPatch, diff --git a/tests/test_pdf_redaction_apply.py b/tests/test_pdf_redaction_apply.py index 84fd991c..69cedc24 100644 --- a/tests/test_pdf_redaction_apply.py +++ b/tests/test_pdf_redaction_apply.py @@ -6,12 +6,17 @@ import pytest from pydantic import ValidationError -from pdfrest import PdfRestClient +from pdfrest import AsyncPdfRestClient, PdfRestClient from pdfrest.models import PdfRestFileBasedResponse, PdfRestFileID from pdfrest.models._internal import PdfRedactionApplyPayload from pdfrest.types import PdfRGBColor -from .graphics_test_helpers import VALID_API_KEY, build_file_info_payload, make_pdf_file +from .graphics_test_helpers import ( + ASYNC_API_KEY, + VALID_API_KEY, + build_file_info_payload, + make_pdf_file, +) @pytest.mark.parametrize( @@ -87,3 +92,63 @@ def test_apply_redactions_invalid_color(monkeypatch: pytest.MonkeyPatch) -> None with pytest.raises(ValidationError, match="greater than or equal to 0"): client.apply_redactions(input_file, 
rgb_color=[-1, 0, 0]) + + +@pytest.mark.asyncio +async def test_async_apply_redactions_includes_rgb_color( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.delenv("PDFREST_API_KEY", raising=False) + input_file = make_pdf_file(PdfRestFileID.generate(2)) + output_id = str(PdfRestFileID.generate()) + + payload_data: dict[str, object] = { + "files": [input_file], + "rgb_color": (10, 20, 30), + "output": "async-output", + } + + payload_model_dump = PdfRedactionApplyPayload.model_validate( + payload_data + ).model_dump(mode="json", by_alias=True, exclude_none=True, exclude_unset=True) + + seen: dict[str, int] = {"post": 0, "get": 0} + + def handler(request: httpx.Request) -> httpx.Response: + if ( + request.method == "POST" + and request.url.path == "/pdf-with-redacted-text-applied" + ): + seen["post"] += 1 + body = json.loads(request.content.decode("utf-8")) + assert body == payload_model_dump + return httpx.Response( + 200, + json={ + "inputId": [input_file.id], + "outputId": [output_id], + }, + ) + if request.method == "GET" and request.url.path == f"/resource/{output_id}": + seen["get"] += 1 + return httpx.Response( + 200, + json=build_file_info_payload( + output_id, "async-output.pdf", "application/pdf" + ), + ) + msg = f"Unexpected request {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY, transport=transport) as client: + response = await client.apply_redactions( + input_file, + rgb_color=(10, 20, 30), + output="async-output", + ) + + assert seen == {"post": 1, "get": 1} + assert isinstance(response, PdfRestFileBasedResponse) + assert response.output_files[0].name == "async-output.pdf" + assert response.output_files[0].type == "application/pdf" diff --git a/tests/test_summarize_pdf_text.py b/tests/test_summarize_pdf_text.py index 2f8c3ef9..98855ca8 100644 --- a/tests/test_summarize_pdf_text.py +++ b/tests/test_summarize_pdf_text.py @@ -402,6 
+402,65 @@ def handler(request: httpx.Request) -> httpx.Response: assert timeout_value == pytest.approx(0.25) +@pytest.mark.asyncio +async def test_async_summarize_text_to_file_includes_pages_and_output( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.delenv("PDFREST_API_KEY", raising=False) + input_file = make_pdf_file(PdfRestFileID.generate(2)) + payload_dump = SummarizePdfTextPayload.model_validate( + { + "files": [input_file], + "output_type": "file", + "output_format": "markdown", + "summary_format": "overview", + "target_word_count": 400, + "pages": ["1-3"], + "output": "async-summary", + } + ).model_dump(mode="json", by_alias=True, exclude_none=True, exclude_unset=True) + output_id = str(PdfRestFileID.generate()) + + seen: dict[str, int] = {"post": 0, "get": 0} + + def handler(request: httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path == "/summarized-pdf-text": + seen["post"] += 1 + payload = json.loads(request.content.decode("utf-8")) + assert payload == payload_dump + return httpx.Response( + 200, + json={ + "outputId": output_id, + "inputId": str(input_file.id), + }, + ) + if request.method == "GET" and request.url.path == f"/resource/{output_id}": + seen["get"] += 1 + assert request.url.params["format"] == "info" + return httpx.Response( + 200, + json=build_file_info_payload( + output_id, "async-summary.md", "text/markdown" + ), + ) + msg = f"Unexpected request {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY, transport=transport) as client: + response = await client.summarize_text_to_file( + input_file, + pages=["1-3"], + output="async-summary", + ) + + assert seen == {"post": 1, "get": 1} + assert isinstance(response, PdfRestFileBasedResponse) + assert response.output_file.id == output_id + assert response.output_file.name == "async-summary.md" + + def test_summarize_text_success(monkeypatch: 
pytest.MonkeyPatch) -> None: monkeypatch.delenv("PDFREST_API_KEY", raising=False) input_file = make_pdf_file(PdfRestFileID.generate(2)) diff --git a/tests/test_translate_pdf_text.py b/tests/test_translate_pdf_text.py index 1d244031..6662c1e3 100644 --- a/tests/test_translate_pdf_text.py +++ b/tests/test_translate_pdf_text.py @@ -368,6 +368,67 @@ def handler(request: httpx.Request) -> httpx.Response: assert timeout_value == pytest.approx(0.3) +def test_translate_pdf_text_to_file_includes_pages_and_output( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.delenv("PDFREST_API_KEY", raising=False) + input_file = make_pdf_file(PdfRestFileID.generate(1)) + payload_dump = TranslatePdfTextPayload.model_validate( + { + "files": [input_file], + "output_language": "fr", + "output_type": "file", + "output_format": "markdown", + "pages": ["1-2"], + "output": "translated", + } + ).model_dump(mode="json", by_alias=True, exclude_none=True, exclude_unset=True) + output_id = str(PdfRestFileID.generate()) + + seen: dict[str, int] = {"post": 0, "get": 0} + + def handler(request: httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path == "/translated-pdf-text": + seen["post"] += 1 + payload = json.loads(request.content.decode("utf-8")) + assert payload == payload_dump + return httpx.Response( + 200, + json={ + "outputUrl": f"https://api.pdfrest.com/resource/{output_id}?format=file", + "outputId": output_id, + "inputId": str(input_file.id), + "source_languages": ["en"], + "output_language": "fr", + }, + ) + if request.method == "GET" and request.url.path == f"/resource/{output_id}": + seen["get"] += 1 + assert request.url.params["format"] == "info" + return httpx.Response( + 200, + json=_make_markdown_file(output_id).model_dump( + mode="json", by_alias=True + ), + ) + msg = f"Unexpected request {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + with PdfRestClient(api_key=VALID_API_KEY, 
transport=transport) as client: + response = client.translate_pdf_text_to_file( + input_file, + output_language="fr", + pages=["1-2"], + output="translated", + ) + + assert seen == {"post": 1, "get": 1} + assert isinstance(response, TranslatePdfTextFileResponse) + assert response.output_file.id == output_id + assert response.output_file.name == "notes.md" + + @pytest.mark.asyncio async def test_async_translate_pdf_text_request_customization( monkeypatch: pytest.MonkeyPatch, @@ -442,6 +503,68 @@ def handler(request: httpx.Request) -> httpx.Response: assert timeout_value == pytest.approx(0.3) +@pytest.mark.asyncio +async def test_async_translate_pdf_text_to_file_includes_pages_and_output( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.delenv("PDFREST_API_KEY", raising=False) + input_file = make_pdf_file(PdfRestFileID.generate(2)) + payload_dump = TranslatePdfTextPayload.model_validate( + { + "files": [input_file], + "output_language": "it", + "output_type": "file", + "output_format": "markdown", + "pages": ["3-4"], + "output": "async-translate", + } + ).model_dump(mode="json", by_alias=True, exclude_none=True, exclude_unset=True) + output_id = str(PdfRestFileID.generate()) + + seen: dict[str, int] = {"post": 0, "get": 0} + + def handler(request: httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path == "/translated-pdf-text": + seen["post"] += 1 + payload = json.loads(request.content.decode("utf-8")) + assert payload == payload_dump + return httpx.Response( + 200, + json={ + "outputUrl": f"https://api.pdfrest.com/resource/{output_id}?format=file", + "outputId": output_id, + "inputId": str(input_file.id), + "source_languages": ["en"], + "output_language": "it", + }, + ) + if request.method == "GET" and request.url.path == f"/resource/{output_id}": + seen["get"] += 1 + assert request.url.params["format"] == "info" + return httpx.Response( + 200, + json=_make_markdown_file(output_id).model_dump( + mode="json", by_alias=True + 
), + ) + msg = f"Unexpected request {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY, transport=transport) as client: + response = await client.translate_pdf_text_to_file( + input_file, + output_language="it", + pages=["3-4"], + output="async-translate", + ) + + assert seen == {"post": 1, "get": 1} + assert isinstance(response, TranslatePdfTextFileResponse) + assert response.output_file.id == output_id + assert response.output_file.name == "notes.md" + + def test_translate_pdf_text_success(monkeypatch: pytest.MonkeyPatch) -> None: monkeypatch.delenv("PDFREST_API_KEY", raising=False) input_file = make_pdf_file(PdfRestFileID.generate(2)) From dd0e95a2250caafcfdf9fd8a92edc5ebe16fd699 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 11:24:25 -0600 Subject: [PATCH 09/15] tests: Add sync/async coverage for request validation and error handling - Add tests to ensure `prepare_request` rejects endpoints without a leading `/`. - Validate iterator-based file uploads for stream detection in both sync and async methods. - Cover cases where the client raises errors for non-JSON success or error responses, verifying payload content. 
Assisted-by: Codex --- tests/test_client.py | 120 +++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 120 insertions(+) diff --git a/tests/test_client.py b/tests/test_client.py index d02aed91..f7cc6c08 100644 --- a/tests/test_client.py +++ b/tests/test_client.py @@ -537,6 +537,56 @@ def test_prepare_request_rejects_files_with_json( ) +def test_prepare_request_rejects_missing_leading_slash( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", VALID_API_KEY) + with ( + PdfRestClient(api_key=VALID_API_KEY) as client, + pytest.raises(PdfRestConfigurationError, match="endpoint must start with '/'"), + ): + client.prepare_request("GET", "up") + + +@pytest.mark.asyncio +async def test_async_prepare_request_rejects_missing_leading_slash( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", ASYNC_API_KEY) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY) as client: + with pytest.raises( + PdfRestConfigurationError, match="endpoint must start with '/'" + ): + client.prepare_request("GET", "up") + + +def test_prepare_request_accepts_iterator_and_marks_stream( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", VALID_API_KEY) + file_iter = iter([("file", BytesIO(b"data"))]) + with PdfRestClient(api_key=VALID_API_KEY) as client: + request = client.prepare_request("POST", "/upload", files=file_iter) + + assert isinstance(request.files, list) + assert len(request.files) == 1 + assert request.has_stream_uploads() + + +@pytest.mark.asyncio +async def test_async_prepare_request_accepts_iterator_and_marks_stream( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", ASYNC_API_KEY) + file_iter = iter([("file", BytesIO(b"data"))]) + async with AsyncPdfRestClient(api_key=ASYNC_API_KEY) as client: + request = client.prepare_request("POST", "/upload", files=file_iter) + + assert isinstance(request.files, list) + assert len(request.files) == 1 
+ assert request.has_stream_uploads() + + def test_download_file_retries_on_error(monkeypatch: pytest.MonkeyPatch) -> None: monkeypatch.setenv("PDFREST_API_KEY", VALID_API_KEY) monkeypatch.setattr(client_module.random, "uniform", lambda *_: 0.0) @@ -641,6 +691,76 @@ def handler(_: httpx.Request) -> httpx.Response: assert exc_info.value.response_content == "Unauthorized" +def test_client_raises_for_non_json_success_response( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", VALID_API_KEY) + + def handler(_: httpx.Request) -> httpx.Response: + return httpx.Response(200, text="not-json") + + transport = httpx.MockTransport(handler) + with ( + pytest.raises(PdfRestApiError, match="Response body is not valid JSON") as exc, + PdfRestClient(transport=transport) as client, + ): + client.up() + assert exc.value.status_code == 200 + assert exc.value.response_content == "not-json" + + +@pytest.mark.asyncio +async def test_async_client_raises_for_non_json_success_response( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", ASYNC_API_KEY) + + def handler(_: httpx.Request) -> httpx.Response: + return httpx.Response(200, text="not-json") + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(transport=transport) as client: + with pytest.raises( + PdfRestApiError, match="Response body is not valid JSON" + ) as exc: + await client.up() + assert exc.value.status_code == 200 + assert exc.value.response_content == "not-json" + + +def test_client_uses_text_for_non_json_error_payload( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", VALID_API_KEY) + + def handler(_: httpx.Request) -> httpx.Response: + return httpx.Response(500, text="server blew up") + + transport = httpx.MockTransport(handler) + with ( + pytest.raises(PdfRestApiError, match="status code 500") as exc, + PdfRestClient(transport=transport) as client, + ): + client.up() + assert 
exc.value.response_content == "server blew up" + + +@pytest.mark.asyncio +async def test_async_client_uses_text_for_non_json_error_payload( + monkeypatch: pytest.MonkeyPatch, +) -> None: + monkeypatch.setenv("PDFREST_API_KEY", ASYNC_API_KEY) + + def handler(_: httpx.Request) -> httpx.Response: + return httpx.Response(500, text="server blew up") + + transport = httpx.MockTransport(handler) + async with AsyncPdfRestClient(transport=transport) as client: + with pytest.raises(PdfRestApiError, match="status code 500") as exc: + await client.up() + assert exc.value.response_content == "server blew up" + + def test_client_raises_for_non_success_response( monkeypatch: pytest.MonkeyPatch, ) -> None: From 8d93e5c825260627fe6ec43036274120ab9fb8d8 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 11:34:11 -0600 Subject: [PATCH 10/15] tests: Add sync/async test coverage for `create_from_paths` behavior - Add tests ensuring `create_from_paths` supports content-type-only uploads. - Add async test verifying `create_from_paths` supports metadata in requests. - These directly exercise the two missing branches in _parse_path_spec and validate that the multipart payload includes the right Content-Type and custom headers. 
Assisted-by: Codex --- tests/test_files.py | 81 +++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 81 insertions(+) diff --git a/tests/test_files.py b/tests/test_files.py index 73a10bc0..9640cc42 100644 --- a/tests/test_files.py +++ b/tests/test_files.py @@ -605,6 +605,45 @@ def handler(request: httpx.Request) -> httpx.Response: _assert_file_matches_payload(response[0], info_payload) +def test_files_create_from_paths_supports_content_type_only() -> None: + uploaded_file_id = str(uuid.uuid4()) + info_payload = _build_file_info_payload(uploaded_file_id, "report.pdf") + + def handler(request: httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path == "/upload": + body = request.content + assert b'filename="report.pdf"' in body + assert b"Content-Type: application/test-pdf" in body + return httpx.Response( + 200, + json={ + "files": [ + {"name": "report.pdf", "id": uploaded_file_id}, + ] + }, + ) + if request.method == "GET": + assert request.url.params["format"] == "info" + return httpx.Response(200, json=info_payload) + msg = f"Unexpected request: {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + report_pdf = get_test_resource_path("report.pdf") + with PdfRestClient(api_key=VALID_API_KEY, transport=transport) as client: + response = client.files.create_from_paths( + [ + ( + report_pdf, + "application/test-pdf", + ) + ] + ) + + assert len(response) == 1 + _assert_file_matches_payload(response[0], info_payload) + + class TestDownloadHelpers: @pytest.fixture def client(self) -> Iterator[tuple[PdfRestClient, bytes, dict[str, Any]]]: @@ -1237,6 +1276,48 @@ def handler(request: httpx.Request) -> httpx.Response: _assert_file_matches_payload(response[0], info_payload) +@pytest.mark.asyncio +async def test_async_files_create_from_paths_supports_metadata() -> None: + uploaded_file_id = str(uuid.uuid4()) + info_payload = _build_file_info_payload(uploaded_file_id, "report.pdf") + + 
def handler(request: httpx.Request) -> httpx.Response: + if request.method == "POST" and request.url.path == "/upload": + body = request.content + assert b'filename="report.pdf"' in body + assert b"Content-Type: application/test-pdf" in body + assert b"X-Custom: header" in body + return httpx.Response( + 200, + json={ + "files": [ + {"name": "report.pdf", "id": uploaded_file_id}, + ] + }, + ) + if request.method == "GET": + assert request.url.params["format"] == "info" + return httpx.Response(200, json=info_payload) + msg = f"Unexpected request: {request.method} {request.url}" + raise AssertionError(msg) + + transport = httpx.MockTransport(handler) + report_pdf = get_test_resource_path("report.pdf") + async with AsyncPdfRestClient(api_key=VALID_API_KEY, transport=transport) as client: + response = await client.files.create_from_paths( + [ + ( + report_pdf, + "application/test-pdf", + {"X-Custom": "header"}, + ) + ] + ) + + assert len(response) == 1 + _assert_file_matches_payload(response[0], info_payload) + + def test_live_file_create(pdfrest_api_key: str, pdfrest_live_base_url: str) -> None: with PdfRestClient( api_key=pdfrest_api_key, base_url=pdfrest_live_base_url From 820417c3d12a51fe16e9417cba914543b6f78b65 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 11:40:14 -0600 Subject: [PATCH 11/15] tests: Refactor and relocate live PNG conversion tests - Moved `test_live_convert_to_png` and `test_live_async_convert_to_png` from `tests/test_convert_to_png.py` to a new file `tests/live/test_live_convert_to_png.py`. - Simplified `test_convert_to_png.py` by removing live test cases. 
Assisted-by: Codex --- tests/live/test_live_convert_to_png.py | 49 ++++++++++++++++++++++++++ tests/test_convert_to_png.py | 35 ------------------ 2 files changed, 49 insertions(+), 35 deletions(-) create mode 100644 tests/live/test_live_convert_to_png.py diff --git a/tests/live/test_live_convert_to_png.py b/tests/live/test_live_convert_to_png.py new file mode 100644 index 00000000..49f50452 --- /dev/null +++ b/tests/live/test_live_convert_to_png.py @@ -0,0 +1,49 @@ +from __future__ import annotations + +import pytest + +from pdfrest import AsyncPdfRestClient, PdfRestClient +from pdfrest.models import PdfRestFileBasedResponse + +from ..resources import get_test_resource_path + + +def test_live_convert_to_png( + pdfrest_api_key: str, + pdfrest_live_base_url: str, +) -> None: + resource = get_test_resource_path("report.pdf") + with PdfRestClient( + api_key=pdfrest_api_key, + base_url=pdfrest_live_base_url, + ) as client: + uploaded = client.files.create_from_paths([resource])[0] + response = client.convert_to_png( + uploaded, + output_prefix="live-convert", + page_range="1", + ) + + assert isinstance(response, PdfRestFileBasedResponse) + assert response.output_files + + +@pytest.mark.asyncio +async def test_live_async_convert_to_png( + pdfrest_api_key: str, + pdfrest_live_base_url: str, +) -> None: + resource = get_test_resource_path("report.pdf") + async with AsyncPdfRestClient( + api_key=pdfrest_api_key, + base_url=pdfrest_live_base_url, + ) as client: + uploaded = (await client.files.create_from_paths([resource]))[0] + response = await client.convert_to_png( + uploaded, + output_prefix="live-async-convert", + page_range="1", + ) + + assert isinstance(response, PdfRestFileBasedResponse) + assert response.output_files diff --git a/tests/test_convert_to_png.py b/tests/test_convert_to_png.py index 4ac8b1f7..01569644 100644 --- a/tests/test_convert_to_png.py +++ b/tests/test_convert_to_png.py @@ -18,7 +18,6 @@ build_file_info_payload, make_pdf_file, ) -from .resources 
import get_test_resource_path @pytest.mark.parametrize("color_model", ["rgb", "rgba", "gray"]) @@ -577,37 +576,3 @@ def handler(_: httpx.Request) -> httpx.Response: make_pdf_file(PdfRestFileID.generate(1)), page_range=[], ) - - -def test_live_convert_to_png(pdfrest_api_key: str, pdfrest_live_base_url: str) -> None: - resource = get_test_resource_path("report.pdf") - with PdfRestClient( - api_key=pdfrest_api_key, base_url=pdfrest_live_base_url - ) as client: - uploaded = client.files.create_from_paths([resource]) - response = client.convert_to_png( - uploaded[0], - output_prefix="live-convert", - page_range="1", - ) - assert isinstance(response, PdfRestFileBasedResponse) - assert response.output_files - - -@pytest.mark.asyncio -async def test_live_async_convert_to_png( - pdfrest_api_key: str, - pdfrest_live_base_url: str, -) -> None: - resource = get_test_resource_path("report.pdf") - async with AsyncPdfRestClient( - api_key=pdfrest_api_key, base_url=pdfrest_live_base_url - ) as client: - uploaded = await client.files.create_from_paths([resource]) - response = await client.convert_to_png( - uploaded[0], - output_prefix="live-async-convert", - page_range="1", - ) - assert isinstance(response, PdfRestFileBasedResponse) - assert response.output_files From 6ce49accad680fc1461bc854be9fa62aecf06e91 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 11:47:01 -0600 Subject: [PATCH 12/15] github/workflows: Update diff-cover fetch and report format configuration - Changed `git fetch` depth from `1` to `0` for full history in diff-cover. - Full history needed for diff-cover to find the merge base. - Updated diff-cover report generation to use `--format markdown:`, avoiding warning of deprecated option. 
Assisted-by: Codex --- .github/workflows/test-and-publish.yml | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/.github/workflows/test-and-publish.yml b/.github/workflows/test-and-publish.yml index 07b58764..0ad07535 100644 --- a/.github/workflows/test-and-publish.yml +++ b/.github/workflows/test-and-publish.yml @@ -40,14 +40,14 @@ jobs: PDFREST_API_KEY: ${{ secrets.PDFREST_API_KEY }} - name: Fetch base branch for diff-cover if: github.event_name == 'pull_request' - run: git fetch origin ${{ github.base_ref }} --depth=1 + run: git fetch origin ${{ github.base_ref }} --depth=0 - name: Run diff-cover (new code must be >= 90%) if: github.event_name == 'pull_request' run: > uv run diff-cover coverage/py${{ matrix.python-version }}/coverage.xml --compare-branch origin/${{ github.base_ref }} --fail-under 90 - --markdown-report coverage/py${{ matrix.python-version }}/diff-cover.md + --format markdown:coverage/py${{ matrix.python-version }}/diff-cover.md - name: Check client class function coverage run: > uv run python scripts/check_class_function_coverage.py From cc8de21ef4ec6f4d3c951a9e883195e600207cac Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 13:02:29 -0600 Subject: [PATCH 13/15] github/workflows: Improve handling of shallow repositories in diff-cover fetch - Updated `git fetch` logic to handle shallow repositories properly, fetching full history if needed to find the merge base. - Ensures compatibility with diff-cover requirements for pull requests. 
Assisted-by: Codex --- .github/workflows/test-and-publish.yml | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/.github/workflows/test-and-publish.yml b/.github/workflows/test-and-publish.yml index 0ad07535..4249f9a6 100644 --- a/.github/workflows/test-and-publish.yml +++ b/.github/workflows/test-and-publish.yml @@ -40,7 +40,12 @@ jobs: PDFREST_API_KEY: ${{ secrets.PDFREST_API_KEY }} - name: Fetch base branch for diff-cover if: github.event_name == 'pull_request' - run: git fetch origin ${{ github.base_ref }} --depth=0 + run: | + if git rev-parse --is-shallow-repository | grep -q true; then + git fetch --no-tags --prune origin ${{ github.base_ref }} --unshallow + else + git fetch --no-tags --prune origin ${{ github.base_ref }} + fi - name: Run diff-cover (new code must be >= 90%) if: github.event_name == 'pull_request' run: > From dad187edcd0d41a0cd1b596f8d3141a39b871117 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Thu, 5 Feb 2026 14:49:57 -0600 Subject: [PATCH 14/15] docs: Remove test parity script and update coverage instructions - The parity script is superseded by coverage reporting and the class-coverage nox session. - Removed `scripts/check_test_parity.sh` and all references to it across documentation (`README.md`, `AGENTS.md`, and `TESTING_GUIDELINES.md`). - Updated `TESTING_GUIDELINES.md` to include instructions for running `uvx nox -s class-coverage` to enforce minimum function-level coverage on key client classes (`PdfRestClient` and `AsyncPdfRestClient`). Assisted-by: Codex --- AGENTS.md | 3 - README.md | 6 -- TESTING_GUIDELINES.md | 6 +- scripts/check_test_parity.sh | 164 ----------------------------------- 4 files changed, 3 insertions(+), 176 deletions(-) delete mode 100755 scripts/check_test_parity.sh diff --git a/AGENTS.md b/AGENTS.md index 21b60517..b58ba3af 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -19,9 +19,6 @@ - `uv run pre-commit run --all-files` — enforce formatting and lint rules before pushing.
- `uv run pytest` — execute the suite with the active interpreter. -- `scripts/check_test_parity.sh` — run changed tests and report sync/async - parity gaps (accepts optional base/head refs, defaults to - `upstream/main..HEAD`). - `uv build` — produce wheels and sdists identical to the release workflow. - `uvx nox -s tests` — create matrix virtualenvs via nox and execute the pytest session. diff --git a/README.md b/README.md index 360ed8b7..bd910104 100644 --- a/README.md +++ b/README.md @@ -45,9 +45,3 @@ uvx nox -s class-coverage To reuse an existing `coverage/py/coverage.json` without rerunning tests, add `-- --no-tests` (and optional `--coverage-json path`). - -Check sync/async parity for changed tests (defaults to `upstream/main..HEAD`): - -```bash -scripts/check_test_parity.sh -``` diff --git a/TESTING_GUIDELINES.md b/TESTING_GUIDELINES.md index 473dc63d..853cd4ae 100644 --- a/TESTING_GUIDELINES.md +++ b/TESTING_GUIDELINES.md @@ -19,9 +19,9 @@ iteration required. asserting method/path/headers/body). Optional payload branches (for example, `pages`, `output`, `rgb_color`, and output-prefix fields) require explicit tests so serialization differences are caught early. -- **Check parity regularly.** Run `scripts/check_test_parity.sh` (defaults to - `upstream/main..HEAD`) to spot missing sync/async counterparts, keeping - parameterized test IDs aligned between transports. +- **Check client coverage regularly.** Run `uvx nox -s class-coverage` to + enforce minimum function-level coverage for `PdfRestClient` and + `AsyncPdfRestClient`. - **Exercise both sides of the contract.** Hermetic tests (via `httpx.MockTransport`) validate serialization and local validation. 
Live suites prove the server behaves the same way, including invalid literal diff --git a/scripts/check_test_parity.sh b/scripts/check_test_parity.sh deleted file mode 100755 index 0b8f13d3..00000000 --- a/scripts/check_test_parity.sh +++ /dev/null @@ -1,164 +0,0 @@ -#!/usr/bin/env bash -set -euo pipefail -IFS=$'\n\t' - -base_ref="${1:-upstream/main}" -head_ref="${2:-HEAD}" - -if ! git rev-parse --verify "$base_ref" > /dev/null 2>&1; then - echo "Base ref '$base_ref' not found." >&2 - exit 1 -fi - -if ! git rev-parse --verify "$head_ref" > /dev/null 2>&1; then - echo "Head ref '$head_ref' not found." >&2 - exit 1 -fi - -test_files=() -while IFS= read -r file; do - if [[ -n "$file" ]]; then - test_files+=("$file") - fi -done < <( - git diff --name-only --diff-filter=d "$base_ref..$head_ref" -- tests | grep -E '\.py$' || true -) - -if [[ ${#test_files[@]} -eq 0 ]]; then - echo "No changed test files under tests/ for $base_ref..$head_ref." - exit 0 -fi - -tmp_output="$(mktemp)" -tmp_tests="$(mktemp)" -tmp_counts="$(mktemp)" -tmp_missing_sync="$(mktemp)" -tmp_missing_async="$(mktemp)" -tmp_payload="$(mktemp)" -trap 'rm -f "$tmp_output" "$tmp_tests" "$tmp_counts" "$tmp_missing_sync" "$tmp_missing_async" "$tmp_payload"' EXIT - -echo "Running pytest on changed tests:" -printf ' - %s\n' "${test_files[@]}" - -uv run pytest -vv -rA -n auto "${test_files[@]}" | tee "$tmp_output" - -awk ' -{ - line = $0; - sub(/^\[[^]]+\][[:space:]]+/, "", line); - sub(/[[:space:]]+\[[^]]+\]$/, "", line); - if (line ~ /^(PASSED|FAILED|SKIPPED|XFAIL|XPASS|ERROR)[[:space:]]+tests\/.*::/) { - sub(/^(PASSED|FAILED|SKIPPED|XFAIL|XPASS|ERROR)[[:space:]]+/, "", line); - print line; - } else if (line ~ /^tests\/.*::.*[[:space:]]+(PASSED|FAILED|SKIPPED|XFAIL|XPASS|ERROR)$/) { - sub(/[[:space:]]+(PASSED|FAILED|SKIPPED|XFAIL|XPASS|ERROR)$/, "", line); - print line; - } -} -' "$tmp_output" > "$tmp_tests" - -if [[ ! 
-s "$tmp_tests" ]]; then - echo "No test node IDs detected in pytest output; try rerunning with -vv." >&2 - exit 1 -fi - -awk -v sync_file="$tmp_missing_sync" \ - -v async_file="$tmp_missing_async" \ - -v payload_file="$tmp_payload" \ - -v counts_file="$tmp_counts" ' -function is_async(nodeid) { - return (nodeid ~ /::test_.*async_/); -} -function normalize(nodeid) { - sub(/::test_live_async_/, "::test_live_", nodeid); - sub(/::test_async_/, "::test_", nodeid); - return nodeid; -} -{ - total++; - if ($0 ~ /::test_.*(payload|validation)/) { - payload_like[$0] = 1; - } - if (is_async($0)) { - async_count++; - norm = normalize($0); - async_norm[norm] = 1; - async_orig[norm] = $0; - } else { - sync_count++; - norm = normalize($0); - sync_norm[norm] = 1; - sync_orig[norm] = $0; - } -} -END { - missing_sync = 0; - missing_async = 0; - - for (n in async_norm) { - if (!(n in sync_norm)) { - missing_sync++; - print async_orig[n] >> sync_file; - } - } - for (n in sync_norm) { - if (!(n in async_norm)) { - missing_async++; - print sync_orig[n] >> async_file; - } - } - payload_count = 0; - for (t in payload_like) { - payload_count++; - print t >> payload_file; - } - - print "total=" total > counts_file; - print "sync_count=" sync_count >> counts_file; - print "async_count=" async_count >> counts_file; - print "missing_sync=" missing_sync >> counts_file; - print "missing_async=" missing_async >> counts_file; - print "payload_count=" payload_count >> counts_file; -} -' "$tmp_tests" - -total=0 -sync_count=0 -async_count=0 -missing_sync=0 -missing_async=0 -payload_count=0 -while IFS='=' read -r key value; do - case "$key" in - total) total="$value" ;; - sync_count) sync_count="$value" ;; - async_count) async_count="$value" ;; - missing_sync) missing_sync="$value" ;; - missing_async) missing_async="$value" ;; - payload_count) payload_count="$value" ;; - esac -done < "$tmp_counts" - -echo "" -echo "Test parity report" -echo "Total tests: $total" -echo "Sync tests: $sync_count" -echo 
"Async tests: $async_count" -echo "Missing sync counterparts: $missing_sync" -if [[ "$missing_sync" -gt 0 ]]; then - sort "$tmp_missing_sync" | while read -r line; do - echo " - $line" - done -fi -echo "Missing async counterparts: $missing_async" -if [[ "$missing_async" -gt 0 ]]; then - sort "$tmp_missing_async" | while read -r line; do - echo " - $line" - done -fi -echo "Payload/validation-style tests (name contains payload/validation): $payload_count" -if [[ "$payload_count" -gt 0 ]]; then - sort "$tmp_payload" | while read -r line; do - echo " - $line" - done -fi From 3bc15135ee55c017b3d4e6047af477d1420c6b79 Mon Sep 17 00:00:00 2001 From: "Kevin A. Mitchell" Date: Fri, 6 Feb 2026 09:59:04 -0600 Subject: [PATCH 15/15] docs, coverage: Add internal classes to function coverage checks - Updated `AGENTS.md` to specify that underscore-prefixed methods in key client-facing classes (`_FilesClient`, `_AsyncFilesClient`) are in scope for function coverage checks. - Expanded `DEFAULT_COVERAGE_CLASSES` in `noxfile.py` to include the files client classes. - Updated `test-and-publish.yml` CI workflow to cover the files client classes. 
Assisted-by: Codex --- .github/workflows/test-and-publish.yml | 2 ++ AGENTS.md | 6 ++++++ noxfile.py | 7 ++++++- 3 files changed, 14 insertions(+), 1 deletion(-) diff --git a/.github/workflows/test-and-publish.yml b/.github/workflows/test-and-publish.yml index 4249f9a6..5278f259 100644 --- a/.github/workflows/test-and-publish.yml +++ b/.github/workflows/test-and-publish.yml @@ -59,6 +59,8 @@ jobs: coverage/py${{ matrix.python-version }}/coverage.json --class PdfRestClient --class AsyncPdfRestClient + --class _FilesClient + --class _AsyncFilesClient --fail-under 90 --markdown-report coverage/py${{ matrix.python-version }}/class-function-coverage.md - name: Upload coverage reports diff --git a/AGENTS.md b/AGENTS.md index b58ba3af..4dc7de69 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -146,6 +146,12 @@ Optional payload branches (`pages`, `output`, `rgb_color`, etc.) need explicit coverage so serialization regressions are caught. +- **Class function coverage scope:** The class coverage gate targets the main + client-facing classes (`PdfRestClient`, `AsyncPdfRestClient`, `_FilesClient`, + `_AsyncFilesClient`). For these classes, underscore-prefixed methods are + intentionally in scope and should be covered as part of the interface + contract. + - Write pytest tests: files named `test_*.py`, test functions `test_*`, fixtures in `conftest.py` where shared. diff --git a/noxfile.py b/noxfile.py index 32f76117..a7d3a66c 100644 --- a/noxfile.py +++ b/noxfile.py @@ -17,7 +17,12 @@ PROJECT_ROOT = Path(__file__).resolve().parent DEFAULT_EXAMPLE_PYTHON = "3.11" EXAMPLES_DIR = PROJECT_ROOT / "examples" -DEFAULT_COVERAGE_CLASSES = ("PdfRestClient", "AsyncPdfRestClient") +DEFAULT_COVERAGE_CLASSES = ( + "PdfRestClient", + "AsyncPdfRestClient", + "_FilesClient", + "_AsyncFilesClient", +) def _install_test_dependencies(session: nox.Session) -> None: