
Release 0.13.1 #2768

Merged
seratch merged 1 commit into main from release/v0.13.1
Mar 25, 2026

Conversation


@github-actions github-actions bot commented Mar 24, 2026

Release readiness review (v0.13.0 -> TARGET b6d9b03)

This is a release readiness report done by $final-release-review skill.

Diff

v0.13.0...b6d9b03

Release call:

🟢 GREEN LIGHT TO SHIP No concrete release-blocking regressions or breaking-versioning mismatches were identified in v0.13.0...b6d9b03.

Scope summary:

  • 36 files changed (+3986/-128); key areas touched: new any-llm model/provider integration, Realtime response sequencing, MCP streamable HTTP robustness, agent-as-tool output fallback behavior, targeted regression tests, and release/docs/example updates.

Risk assessment (ordered by impact):

  1. Realtime response sequencing refactor has residual concurrency sensitivity
  • Risk: 🟡 MODERATE. Runtime turn sequencing could still regress under edge-case websocket timing patterns, even though coverage is substantially expanded.
  • Evidence: src/agents/realtime/openai_realtime.py introduces a new _ResponseCreateSequencer and deferred task orchestration; tests/realtime/test_openai_realtime.py adds extensive scenario coverage for queued response.create, cancellation, and error correlation.
  • Files: src/agents/realtime/openai_realtime.py, tests/realtime/test_openai_realtime.py
  • Action: Run uv run pytest -s tests/realtime/test_openai_realtime.py -k "response_create or response_control or close"; pass criteria: all selected sequencing tests pass with no intermittent failures.
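The queue-and-drain pattern behind this sequencing concern can be illustrated with a minimal sketch. This is not the actual _ResponseCreateSequencer from src/agents/realtime/openai_realtime.py; the class and helper names here are illustrative, and real code would correlate send errors and cancellations back to callers rather than swallow them.

```python
import asyncio


class ResponseCreateSequencer:
    """Sketch: serialize response.create sends so only one request is
    dispatched at a time, in enqueue order."""

    def __init__(self) -> None:
        self._queue: asyncio.Queue = asyncio.Queue()
        self._worker: asyncio.Task | None = None

    def enqueue(self, send) -> None:
        # Queue a zero-arg coroutine function that performs the actual send,
        # and lazily start a drain task if one is not already running.
        self._queue.put_nowait(send)
        if self._worker is None or self._worker.done():
            self._worker = asyncio.create_task(self._drain())

    async def _drain(self) -> None:
        # Dispatch queued sends strictly in order; one failing send does not
        # prevent later sends from running.
        while not self._queue.empty():
            send = self._queue.get_nowait()
            try:
                await send()
            except Exception:
                pass  # a real sequencer would surface this to the caller


async def demo() -> list:
    sent = []
    seq = ResponseCreateSequencer()

    def make_send(name):
        async def send() -> None:
            sent.append(name)
        return send

    seq.enqueue(make_send("response.create #1"))
    seq.enqueue(make_send("response.create #2"))
    await seq._worker  # in this sketch, wait for the queue to drain
    return sent


print(asyncio.run(demo()))
```

The concurrency sensitivity flagged above lives exactly in the gap this sketch glosses over: deciding when a drain task is "done" relative to new enqueues arriving from websocket callbacks.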
  2. Any-LLM Responses path depends on a provider-private call path
  • Risk: 🟡 MODERATE. Transport-header support currently falls back to a private provider entrypoint, so future drift in provider SDK internals could break this path.
  • Evidence: src/agents/extensions/models/any_llm_model.py uses provider._aresponses(...) with a compatibility shim and comments explaining public aresponses() validation limitations in any-llm 1.11.0.
  • Files: src/agents/extensions/models/any_llm_model.py, tests/models/test_any_llm_model.py, pyproject.toml
  • Action: Run uv run pytest -s tests/models/test_any_llm_model.py on Python 3.11+ with openai-agents[any-llm]; pass criteria: all AnyLLM unit tests pass, including responses/chat selection and streaming normalization paths.
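The private-entrypoint fallback described above follows a common shim pattern, sketched here with hypothetical names (create_response, FakeProvider); the real shim lives in src/agents/extensions/models/any_llm_model.py and targets any-llm's actual provider interface, not this toy one.

```python
import asyncio


async def create_response(provider, request, extra_headers=None):
    # Prefer the private entrypoint only when transport headers are needed,
    # since (per the review) the public aresponses() cannot carry them.
    private_call = getattr(provider, "_aresponses", None)
    if extra_headers and callable(private_call):
        # Private path: carries headers, but may drift across SDK versions.
        return await private_call(request, extra_headers=extra_headers)
    # Public path: validated and stable, but without header passthrough.
    return await provider.aresponses(request)


class FakeProvider:
    """Stand-in for a provider object; not the any-llm API."""

    async def _aresponses(self, request, extra_headers=None):
        return ("private", extra_headers)

    async def aresponses(self, request):
        return ("public", None)


print(asyncio.run(create_response(FakeProvider(), "req", {"X-Trace": "1"})))
print(asyncio.run(create_response(FakeProvider(), "req")))
```

The getattr guard is what makes this a shim rather than a hard dependency: if a future SDK removes the private entrypoint, calls degrade to the public path instead of raising AttributeError.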
  3. MCP initialized-notification tolerance introduces an alternate transport path
  • Risk: 🟢 LOW. Behavior is opt-in and constrained, but transport customization paths should be validated to avoid accidental masking of non-initialized failures.
  • Evidence: src/agents/mcp/server.py adds _InitializedNotificationTolerantStreamableHTTPTransport and ignore_initialized_notification_failure wiring; tests verify initialized failures are tolerated while other failures still raise.
  • Files: src/agents/mcp/server.py, tests/mcp/test_streamable_http_client_factory.py
  • Action: Run uv run pytest -s tests/mcp/test_streamable_http_client_factory.py; pass criteria: initialized-notification tolerance tests pass and non-initialized failure propagation test remains strict.
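The opt-in tolerance described above can be sketched as follows; TolerantTransport and RejectingServer are illustrative stand-ins, not the actual _InitializedNotificationTolerantStreamableHTTPTransport, which wraps the MCP streamable HTTP transport. The point is the narrow swallow condition: only the initialized notification, only when opted in.

```python
class TolerantTransport:
    """Sketch: swallow send failures only for notifications/initialized,
    and only when the caller explicitly opts in."""

    def __init__(self, inner, ignore_initialized_notification_failure=False):
        self._inner = inner
        self._tolerate = ignore_initialized_notification_failure

    def send(self, message: dict) -> None:
        try:
            self._inner.send(message)
        except RuntimeError:
            if self._tolerate and message.get("method") == "notifications/initialized":
                return  # server rejected the notification; safe to ignore
            raise  # every other failure still propagates


class RejectingServer:
    """Toy transport that rejects everything, to exercise both branches."""

    def send(self, message: dict) -> None:
        raise RuntimeError("server rejected " + message.get("method", "?"))


t = TolerantTransport(RejectingServer(), ignore_initialized_notification_failure=True)
t.send({"method": "notifications/initialized"})  # tolerated, no exception
try:
    t.send({"method": "tools/list"})
except RuntimeError as e:
    print("still strict:", e)
```

This mirrors the pass criteria above: tolerance tests pass while the non-initialized failure-propagation path remains strict.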

Notes:

  • Working tree was clean during review.
  • Base tag was selected from local tags only (git tag -l 'v*' --sort=-v:refname | head -n1 -> v0.13.0) per request.
  • Target commit was git rev-parse HEAD -> b6d9b03051e383ec98842c9d014bce75ee3865bc.
  • This assessment is diff-based; no full local verification stack was executed as part of this report.

@github-actions github-actions bot added this to the 0.13.x milestone Mar 24, 2026
@github-actions github-actions bot force-pushed the release/v0.13.1 branch 5 times, most recently from 1ebe11e to 64b6684 on March 25, 2026 at 05:37
@seratch seratch merged commit 0a5b8c9 into main Mar 25, 2026
@seratch seratch deleted the release/v0.13.1 branch March 25, 2026 07:37