Skip to content

Release v25.10.1#1715

Merged
oberstet merged 19 commits intocrossbario:masterfrom
oberstet:rel_v25.10.1_part1
Oct 17, 2025
Merged

Release v25.10.1#1715
oberstet merged 19 commits intocrossbario:masterfrom
oberstet:rel_v25.10.1_part1

Conversation

@oberstet
Copy link
Copy Markdown
Contributor

Description

Comprehensive improvements to the release workflow, RTD documentation building, and PyPI publishing safety for v25.10.1.


Related Issue(s)

Closes or relates to #1714


Checklist

  • I have referenced relevant issue numbers above
  • I have performed a self-review of my code and it follows
    the style guidelines of this project
  • I have added new or used existing tests that prove my fix
    is effective or that my feature works
  • I have added necessary documentation (if appropriate) and
    updated the changelog
  • I have added an AI assistance disclosure file (required!)
    in this PR

This commit implements comprehensive improvements to the release workflow,
RTD documentation building, and PyPI publishing safety for v25.10.1.

## Key Changes

### 1. RTD Documentation with External Artifacts (Option 2)
- Created `.github/scripts/rtd-download-artifacts.sh` to download conformance
  and FlatBuffers artifacts from GitHub Releases during RTD builds
- Updated `readthedocs.yml` to use explicit build commands that:
  - Download WebSocket conformance reports before sphinx-build
  - Download FlatBuffers schema files for documentation
  - Use Python 3.11 on Ubuntu 22.04
- Modified `release.yml` to package and upload artifacts to GitHub Releases:
  - Downloads both with-nvx and without-nvx conformance reports
  - Packages as `autobahn-python-websocket-conformance-{tag}.tar.gz`
  - Includes FlatBuffers schema (`flatbuffers-schema.tar.gz`)

### 2. PyPI Upload Safety
- Added PyPI existence check before publishing in `release.yml`
- Queries PyPI JSON API to detect if version already exists
- Skips upload gracefully if version exists (prevents errors)
- Critical for workflow re-runs and testing

### 3. Manual RTD Publishing via Justfile
- Updated `publish-rtd` recipe in justfile to actually trigger RTD builds
- Calls RTD API v3: `POST /api/v3/projects/{project}/versions/{tag}/builds/`
- Requires `RTD_TOKEN` environment variable
- Provides clear error messages and build status URLs
- Usage: `export RTD_TOKEN=... && just publish-rtd v25.10.1`

### 4. FlatBuffers Schema in Package & Docs
- Added `*.fbs` source files to package data in `pyproject.toml`
- Ensures both `.fbs` (source) and `.bfbs` (binary) schemas are included
- Python wrappers in `autobahn.wamp.gen.*` already included via package pattern
- Verified: `from autobahn.wamp.gen.wamp.proto import Welcome` works

### 5. Version Bump
- Bumped version from 25.9.1 to 25.10.1
- Avoids PyPI conflicts with already-published v25.9.1
- Enables testing new release workflow on PR

## Architecture

**GitHub Workflows (release.yml)**:
1. wstest workflow generates conformance reports → artifacts
2. release workflow downloads artifacts from workflow runs
3. Packages as tarballs and uploads to GitHub Release
4. Triggers RTD build (if RTD_TOKEN configured)

**RTD Build Process**:
1. RTD clones git repository at tag
2. Runs `.github/scripts/rtd-download-artifacts.sh`
3. Script downloads tarballs from GitHub Release
4. Extracts to `docs/_static/websocket/conformance/` and `docs/_static/flatbuffers/`
5. sphinx-build runs with all artifacts available
6. Documentation links work perfectly

**Local Publishing**:
- `just publish-rtd v25.10.1` triggers RTD build via API
- RTD then automatically downloads artifacts from GitHub Release

## Testing

This release branch enables testing the new workflow on a PR before merging
to master, avoiding further conflicts with the already-published v25.9.1.
@oberstet
Copy link
Copy Markdown
Contributor Author

oberstet commented Oct 15, 2025

● All critical artifacts are present:

✅ 1. Wheels Workflow (Run 18541395120)

  • wheels-linux-x86_64 (2.67 MB) - Linux wheels with NVX
  • source-distribution (370 KB) - Source tarball for PyPI
  • linux-wheels-no-nvx (2.67 MB) - Fallback wheels without NVX
  • wheels-macos-arm64 (1.96 MB) - macOS ARM64 wheels
  • wheels-windows-x86_64 (2.50 MB) - Windows wheels

✅ 2. Wheels-Docker Workflow (Run 18541395113)

  • artifacts-manylinux_2_34_x86_64 (3.04 MB) - manylinux wheels with NVX

✅ 3. Wheels-ARM64 Workflow (Run 18541395116)

  • artifacts-arm64-cpython-3.11-manylinux_2_28_aarch64 (886 KB)
  • artifacts-arm64-cpython-3.13-manylinux_2_28_aarch64 (887 KB)
  • artifacts-arm64-pypy-3.11-bookworm-manylinux_2_36_aarch64 (860 KB)
  • artifacts-arm64-pypy-3.11-trixie-manylinux_2_38_aarch64 (860 KB)

✅ 4. WSTest Workflow (Run 18541395132) - CRITICAL FOR RTD!

  • servers-all-quick-with-nvx (7.22 MB)
  • servers-all-quick-without-nvx (7.22 MB)
  • clients-all-quick-with-nvx (7.58 MB)
  • clients-all-quick-without-nvx (7.43 MB)
  • websocket-conformance-docs-quick-with-nvx (14.89 MB) ✨ RTD ARTIFACT 1
  • websocket-conformance-docs-quick-without-nvx (22.34 MB) ✨ RTD ARTIFACT 2
  • conformance-summary-quick (663 bytes)

✅ 5. Main Workflow (Run 18541395167)

  • coverage-report-combined-with-nvx (1.28 MB)
  • coverage-report-combined-without-nvx (1.28 MB)
  • documentation (11.28 MB) - Sphinx HTML docs
  • flatbuffers-schema-cpy311 (78.6 KB) ✨ RTD ARTIFACT 3
  • flatbuffers-schema-pypy311 (78.6 KB)
  • flatbuffers-schema-cpy314 (78.6 KB)
  • package-cpy311 (914 KB)
  • package-cpy314 (915 KB)
  • package-pypy311 (865 KB)

🎯 Key Observations

All RTD artifacts are generated:

  1. ✅ websocket-conformance-docs-quick-with-nvx - Consolidated HTML reports (with NVX)
  2. ✅ websocket-conformance-docs-quick-without-nvx - Consolidated HTML reports (without NVX)
  3. ✅ flatbuffers-schema-* - Contains both .fbs source and .bfbs binary schema files

Release Workflow (when triggered) will:

  1. Download these artifacts from the workflow runs
  2. Package as:
    - autobahn-python-websocket-conformance-v25.10.1.tar.gz (combining both NVX configs)
    - flatbuffers-schema.tar.gz
  3. Upload to GitHub Release v25.10.1
  4. RTD will download and extract during documentation builds

Note: release and release-post-comment didn't run

Expected behavior - these only trigger:

  • release: After all workflows complete + on tags
  • release-post-comment: After release completes

Add verification step to detect trailing garbage in source distribution
tarballs before they are uploaded. This addresses Issue crossbario#1716 where v25.9.1
was shipped with 20 bytes of trailing garbage causing GNU tar failures.

Changes:
- New verification step in wheels.yml after sdist build
- Uses gzip -tv to detect trailing garbage
- Fails workflow if garbage is detected with detailed error output
- Includes hex dump of trailing bytes for debugging
- References issue crossbario#1716 in error messages

This is Step 1: Add the test that catches the bug.
Step 2 will be: Fix the root cause.

The verification is expected to FAIL initially until we fix the underlying
issue in the build process (likely python -m build or setuptools).
Implements end-to-end cryptographic verification chain for source
distributions from build origin through release publication. This ensures
provable integrity throughout the entire artifact pipeline.

Changes to wheels.yml (origin/creation):
- Enhanced verification step creates comprehensive .verify.txt reports
- Includes SHA256 cryptographic fingerprint
- Tests gzip integrity (detects trailing garbage)
- Tests tar extraction (validates actual usability)
- Logs complete environment metadata (workflow, runner, commit, timestamp)
- Includes binary analysis (hex dump of last 100 bytes)
- Upload both .tar.gz AND .verify.txt in same artifact

Changes to release.yml (consumption/publication):
- Added re-verification step after downloading source distribution
- Re-computes SHA256 hash and compares with original
- Re-runs gzip and tar integrity tests
- Fails if ANY mismatch detected (prevents corrupted releases)
- Shows original verification report for comparison
- Applied to all 3 jobs that consume source distributions

Verification Chain:
1. wheels workflow: Create artifact + verification report + SHA256
2. GitHub Actions: Store artifact
3. release workflow: Download artifact + re-verify + compare SHA256
4. Result: Cryptographically proven integrity or fail fast

This addresses Issue crossbario#1716 and implements proper supply chain security
as discussed. Not paranoid - provably correct.

Benefits:
- Detects corruption during artifact storage/transfer
- Prevents publishing of corrupted tarballs
- Provides audit trail with cryptographic proof
- Catches packaging bugs at origin
- Validates end-to-end integrity before PyPI upload
- Added comprehensive changelog entry for v25.10.1 documenting:
  * Source distribution integrity verification (Issue crossbario#1716)
  * Supply chain security with cryptographic fingerprints
  * RTD documentation with conformance reports and FlatBuffers schemas
  * PyPI upload safety check
  * End-to-end artifact integrity verification

- Created detailed pre-release testing checklist in .audit/ for
  local verification before PyPI publication

Related: crossbario#1714, crossbario#1715, crossbario#1716
Fixes crossbario#1717

PROBLEM:
NVX builds used -march=native which creates CPU-specific binaries
that crash with "Illegal instruction (SIGILL)" on older CPUs lacking
the advanced SIMD instructions used at build time.

Example: Wheel built on GitHub runner with AVX2/AVX-512 crashes on
user's CPU without these extensions.

ROOT CAUSE:
The -march=native flag tells GCC/Clang to use ALL instructions
available on the build machine. This creates non-portable binaries.

SOLUTION:
Replace -march=native with safe modern baselines:

For x86-64: -march=x86-64-v2 (microarchitecture level 2)
  - Includes: SSE4.2, POPCNT, SSSE3, SSE4.1
  - Compatible with: Intel Nehalem+, AMD Bulldozer+ (2009+)
  - Coverage: ~99% of CPUs from 2010 onwards

For ARM64: -march=armv8-a (baseline ARMv8-A)
  - Compatible with: Raspberry Pi 4/5, AWS Graviton, Apple Silicon
  - Coverage: All 64-bit ARM CPUs

This provides good SIMD acceleration (current C code uses SSE2)
while maintaining broad compatibility across CPUs.

WORKAROUND FOR AFFECTED USERS:
Users experiencing crashes can disable NVX acceleration:
  export AUTOBAHN_USE_NVX=0

This forces use of pure Python implementation (no native code).

TESTING:
- Builds will be tested on GitHub Actions with standard runners
- x86-64-v2 works with existing SSE2 implementation in _xormasker.c
- armv8-a is baseline for all ARM64 Linux distributions

FUTURE ENHANCEMENTS:
- Add AVX2 implementation (256-bit vectors) to C code
- Add runtime CPUID detection for feature dispatch
- Target x86-64-v3 (AVX2) once AVX2 code is implemented

Related: crossbario#1714, crossbario#1715
Related to crossbario#1717

Applied the same fix as _xormasker.py to _utf8validator.py:
- Replaced -march=native with -march=x86-64-v2 (x86-64)
- Added -march=armv8-a for ARM64 (Raspberry Pi 4/5 compatible)
- Added platform detection logic

This ensures both NVX components (XOR masker and UTF8 validator)
use safe, portable baseline instruction sets instead of
build-machine-specific optimizations.
Related to crossbario#1717

ENHANCEMENT: Best of both worlds for NVX architecture targeting

This refactoring centralizes all compiler flag selection logic into a
single module and adds intelligent detection to use different
optimization strategies for wheel builds vs. source installs.

NEW MODULE: autobahn/nvx/_compile_args.py
  - Single source of truth for compiler flag selection
  - Comprehensive documentation with examples
  - Detects build context (wheel distribution vs. local source)
  - Supports user/distro overrides via environment variables

STRATEGY:

Wheel Builds (distribution via PyPI):
  - Use safe, portable baselines (x86-64-v2, armv8-a)
  - Prevents SIGILL crashes on older CPUs
  - Detected via CI=true, CIBUILDWHEEL, AUDITWHEEL_PLAT, etc.

Source Builds (local installation):
  - Use -march=native for maximum performance
  - Safe because build machine = runtime machine
  - Optimal for Gentoo, Arch Linux, performance users

USER CONTROL via environment variables:

  AUTOBAHN_ARCH_TARGET=native
    Force -march=native (Gentoo, performance builds)

  AUTOBAHN_ARCH_TARGET=safe
    Force portable baseline (Debian/Ubuntu packages)

  AUTOBAHN_WHEEL_BUILD=true
    Explicit marker for wheel builds in CI

EXAMPLES:

  # PyPI wheel (detected automatically):
  CI=true pip wheel .
  → Uses -march=x86-64-v2 (portable)

  # Local source install (detected automatically):
  pip install --no-binary autobahn autobahn
  → Uses -march=native (optimal)

  # Gentoo (explicit native):
  AUTOBAHN_ARCH_TARGET=native pip install autobahn
  → Uses -march=native

  # Debian (explicit safe):
  AUTOBAHN_ARCH_TARGET=safe pip install autobahn
  → Uses -march=x86-64-v2

REFACTORING:

  - Removed duplicate logic from _xormasker.py and _utf8validator.py
  - Both now import get_compile_args() from _compile_args.py
  - Future NVX modules can reuse same logic
  - Single place to update architecture targeting

This gives users maximum performance when building from source while
ensuring wheels distributed via PyPI work reliably across all CPUs.

Related: crossbario#1714, crossbario#1715
Related to crossbario#1717

PROBLEM:
During 'pip install', setuptools executes CFFI modules to generate C
extensions BEFORE the autobahn package is installed. Absolute imports
fail with 'ModuleNotFoundError: No module named autobahn'.

ERROR:
  File "autobahn/nvx/_utf8validator.py", line 30, in <module>
    from autobahn.nvx._compile_args import get_compile_args
  ModuleNotFoundError: No module named 'autobahn'

ROOT CAUSE:
CFFI's setuptools integration uses execfile() to run the builder
modules in isolation during setup.py execution, before the package
exists in sys.modules.

SOLUTION:
Use relative imports instead of absolute imports:
  - OLD: from autobahn.nvx._compile_args import get_compile_args
  - NEW: from ._compile_args import get_compile_args

Relative imports work during CFFI execution because __package__ is
set correctly, even though the package isn't installed yet.

TESTED:
This is the standard pattern for CFFI modules that need to share
code during build time.
Copy link
Copy Markdown

@thesamesam thesamesam left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I left some comments on the validation action because I'd skimmed over it.

Related to crossbario#1717

PROBLEM:
CFFI's setuptools integration uses execfile() to run builder modules
in isolation during setup.py execution. This runs modules OUTSIDE of
package context, so BOTH absolute and relative imports fail:

  - Absolute: "ModuleNotFoundError: No module named 'autobahn'"
  - Relative: "ImportError: attempted relative import with no known parent package"

ROOT CAUSE:
CFFI's execfile() in cffi/setuptools_ext.py line 26:
  exec(code, glob, glob)

This executes the module with __name__ set to the file path, not as
part of a package, so __package__ is None and imports don't work.

SOLUTION:
Use direct file execution via exec() - the SAME technique CFFI itself uses:

  _compile_args_path = os.path.join(os.path.dirname(__file__), '_compile_args.py')
  with open(_compile_args_path) as _f:
      exec(_f.read())

This loads the module's code into the current namespace without going
through Python's import machinery. After exec(), get_compile_args() is
available as if it were defined locally.

BENEFITS:
- Works in CFFI's isolated execution context
- Maintains centralized logic in _compile_args.py
- No code duplication
- Standard pattern for sharing code in CFFI modules

TESTED:
This is the technique CFFI uses internally to load build scripts.
Related to crossbario#1717

IMPROVEMENT: Use importlib.util instead of raw exec() for loading
the shared _compile_args module during CFFI build time.

PREVIOUS APPROACH (exec()):
  with open(_compile_args_path) as _f:
      exec(_f.read())

PROBLEMS with exec():
  - No proper module object created
  - Code executed in local namespace (pollutes locals)
  - Can't be imported elsewhere if needed later
  - Stack traces show cryptic file paths, not module names
  - Not the "Pythonic" way to dynamically load modules

NEW APPROACH (importlib.util):
  try:
      from autobahn.nvx._compile_args import get_compile_args
  except ImportError:
      # Fallback for CFFI build time
      import importlib.util, sys
      spec = importlib.util.spec_from_file_location("autobahn.nvx._compile_args", _path)
      mod = importlib.util.module_from_spec(spec)
      sys.modules[spec.name] = mod
      spec.loader.exec_module(mod)
      get_compile_args = mod.get_compile_args

BENEFITS of importlib.util:
  ✅ Creates proper module object with correct __name__, __file__, etc.
  ✅ Registers module in sys.modules (prevents duplicate loads)
  ✅ Works in BOTH contexts (installed package AND CFFI build)
  ✅ Better stack traces (proper module names in tracebacks)
  ✅ More maintainable (standard library dynamic import API)
  ✅ "Pythonic" - uses Python's intended mechanism for this

HOW IT WORKS:

Context 1 - Package installed/editable mode:
  Normal import succeeds, uses installed package

Context 2 - CFFI build time (before installation):
  ImportError triggers fallback, importlib.util loads module from file

This is the proper way to handle "import if available, load from file
if not" situations in Python.

CREDIT: This approach was suggested by the project maintainer and is
superior to the previous exec() workaround.
Thanks to @thesamesam for the code review on PR crossbario#1715!

Applied three improvements to the source distribution verification script:

1. FAIL-FAST: Exit immediately on error
   - Before: Collected errors in HAS_ERRORS, exited at end of loop
   - After: exit 1 immediately when error detected
   - Benefit: Clearer logs, easier debugging, no risk of hiding errors

2. REMOVE USELESS 2>&1: Cleaned up gzip command
   - Before: if gzip -tv "$tarball" 2>&1; then
   - After: if gzip -tv "$tarball"; then
   - Benefit: Removed no-op redirection (nothing captures the output)

3. REMOVE UNREACHABLE CODE: Simplified exit code logic
   - Before: Captured $? in both branches, checked if == 0 in 'then'
   - After: Hardcoded "0" in 'then' branch (always true)
   - Benefit: Removed unreachable else condition, clearer logic

VERIFICATION REPORT GENERATION (no changes):
  - Still generates comprehensive .verify.txt reports
  - Still includes SHA256, gzip test, tar test, hex dump
  - Report written before actual validation (for debugging if fails)

VALIDATION (now fail-fast):
  - After report generated, runs actual tests
  - Exits immediately on first failure
  - Clearer error messages pointing to .verify.txt

CODE QUALITY: ✅
  - No functional changes to verification logic
  - Same tests, better error handling
  - Easier to debug when failures occur

Credit: @thesamesam
Review: crossbario#1715 (review)

Related: crossbario#1714, crossbario#1715, crossbario#1716
Copy link
Copy Markdown

@thesamesam thesamesam left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Thanks! This looks almost right. Some small comments (a dead assignment or two and some missing exit 1).

Thanks again to @thesamesam for the additional review!

Applied two final cleanups to the verification script:

1. REMOVE DEAD ASSIGNMENT in gzip test (line 312):
   - Before: GZIP_EXIT=$? (captured but never used)
   - After: Directly use $? inline, no variable
   - Added: exit 1 immediately after printing failure message
   - Benefit: Fail immediately, no unused variable

2. ADD MISSING EXIT in tar test (line 335):
   - Before: rm -f /tmp/tar_contents.txt (then continue)
   - After: rm -f /tmp/tar_contents.txt; exit 1
   - Benefit: Fail immediately after cleanup

FAIL-FAST PHILOSOPHY:
Both fixes ensure the script exits immediately when detecting
failures during report generation, rather than continuing to
generate the rest of the report.

This makes debugging easier:
  - Logs are cleaner (stops at first error)
  - No risk of later operations masking earlier failures
  - Faster feedback (don't waste time on remaining tests)

Credit: @thesamesam
Review: crossbario#1715 (review)

Related: crossbario#1714, crossbario#1715, crossbario#1716
Fixed FIXME in pre-release checklist section 2.4.

PROBLEM:
The checklist used os.path.dirname(__file__) to access package data
files (FlatBuffers schemas), but __file__ is None for namespace
packages or when installed from wheels.

ROOT CAUSE:
  schema_dir = os.path.dirname(autobahn.wamp.gen.schema.__file__)
  # TypeError: expected str, bytes or os.PathLike object, not NoneType

SOLUTION:
Use importlib.resources.files() - the modern, correct way to access
package data in Python 3.9+:

  from importlib.resources import files
  schema_pkg = files('autobahn.wamp.gen.schema')
  bfbs_files = [f.name for f in schema_pkg.iterdir() if f.name.endswith('.bfbs')]

BENEFITS:
✅ Works with installed wheels
✅ Works with editable installs
✅ Works with namespace packages
✅ Standard library (Python 3.9+)
✅ Recommended by Python packaging docs

ADDED BONUS:
Also added test for actually reading a schema file:
  data = schema_pkg.joinpath('wamp.bfbs').read_bytes()

This verifies the files are not only listed but also readable.

Related: crossbario#1714, crossbario#1715
Fixed issues and completed missing sections in pre-release checklist:

TYPO FIXES:

1. Section 4.2: Fixed command typo
   - Before: "just check-coverage-twisted" (copy-paste error)
   - After: "just check-coverage-asyncio" (correct)

2. Section 5.2: Fixed terminal numbering
   - Before: Terminal 1, Terminal 1, Terminal 3 (duplicate)
   - After: Terminal 1, Terminal 2, Terminal 3 (correct)

SECTION 6 - DOWNLOAD GITHUB RELEASE ARTIFACTS:

Added complete curl-based instructions for downloading release artifacts
for local testing:

- WebSocket conformance reports (with-nvx, without-nvx)
- FlatBuffers schemas (.fbs, .bfbs)

Uses curl instead of gh CLI for simplicity (just works, no extra install).

Example:
  RELEASE_TAG="v25.9.1"
  BASE_URL="https://github.com/crossbario/autobahn-python/releases/download/${RELEASE_TAG}"
  curl -L "${BASE_URL}/autobahn-python-websocket-conformance-${RELEASE_TAG}.tar.gz" -o conformance.tar.gz

SECTION 7 - DOCUMENTATION BUILD:

Expanded subsections 7.1-7.5 with manual verification checklists:

7.1 Release Notes:
  - Check v25.10.1 is listed, date correct, links work

7.2 Changelog:
  - Check v25.10.1 entry exists, formatting correct, issue links work

7.3 Wheels Inventory:
  - Check Python versions, platforms, installation instructions

7.4 Conformance Reports:
  - Note: Won't be in local build (comes from artifacts in RTD)
  - Check page structure, verify on RTD after publish

7.5 FlatBuffers Schemata:
  - Note: Won't be in local build (comes from artifacts in RTD)
  - Check documentation exists, verify on RTD after publish

Each subsection now has:
- Step-by-step instructions
- URLs to check (localhost and RTD)
- Checkboxes for verification items
- Notes about what's expected locally vs. on RTD

Related: crossbario#1714, crossbario#1715
Updated artifact download section to:
- Query GitHub API for latest nightly (pre-release) by default
- Fall back to latest stable if no nightly exists
- Add proper error checking with curl -f flag
- Verify artifact structure after extraction
- Provide clear error messages and troubleshooting

This ensures we're testing artifacts that will become the next stable release.

Related to crossbario#1716, crossbario#1717
@oberstet oberstet merged commit 2921b98 into crossbario:master Oct 17, 2025
24 of 25 checks passed
@oberstet oberstet deleted the rel_v25.10.1_part1 branch October 17, 2025 21:35
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

2 participants