
[None][chore] DO NOT REVIEW, test PR#12829

Open
longlee0622 wants to merge 4 commits into NVIDIA:main from longlee0622:transformers-5.3.0

Conversation

@longlee0622
Collaborator

@longlee0622 longlee0622 commented Apr 8, 2026

Summary by CodeRabbit

  • Chores
    • Updated Hugging Face transformers dependency to version 5.3.0.

Description

Test Coverage

PR Checklist

Please review the following before submitting your PR:

  • PR description clearly explains what and why. If using CodeRabbit's summary, please make sure it makes sense.

  • PR Follows TRT-LLM CODING GUIDELINES to the best of your knowledge.

  • Test cases are provided for new code paths (see test instructions)

  • Any new dependencies have been scanned for license and vulnerabilities

  • CODEOWNERS updated if ownership changes

  • Documentation updated as needed

  • Update the tava architecture diagram if there is a significant design change in the PR.

  • The reviewers assigned automatically/manually are appropriate for the PR.

  • Please check this after reviewing the above items as appropriate for this PR.

GitHub Bot Help

To see a list of available CI bot commands, please comment /bot help.

Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com>
@longlee0622 longlee0622 requested a review from a team as a code owner April 8, 2026 07:35
@longlee0622
Collaborator Author

/bot run --disable-fail-fast

@coderabbitai
Contributor

coderabbitai bot commented Apr 8, 2026

📝 Walkthrough

Walkthrough

Updated the Hugging Face transformers dependency in requirements.txt from version 4.57.3 to version 5.3.0. No other dependencies or configuration changes are included in this update.

Changes

  • Dependency Update — requirements.txt: Bumped the transformers package version from 4.57.3 to 5.3.0.

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~2 minutes

🚥 Pre-merge checks | ✅ 1 | ❌ 2

❌ Failed checks (2 warnings)

  • Title check ⚠️ Warning — The title indicates this is a test PR and explicitly states 'DO NOT REVIEW', which conflicts with the actual change of updating the transformers dependency from 4.57.3 to 5.3.0. Resolution: update the title to accurately reflect the main change, such as '[None][chore] Upgrade transformers dependency to 5.3.0', instead of marking it as a test PR.
  • Description check ⚠️ Warning — The PR description is entirely template boilerplate, with nothing filled in for the Description or Test Coverage sections, so it fails to explain the rationale for the dependency update or the testing approach. Resolution: complete the Description section explaining why transformers was upgraded from 4.57.3 to 5.3.0, and document relevant test coverage for the dependency change.
✅ Passed checks (1 passed)
  • Docstring Coverage ✅ Passed — No functions found in the changed files to evaluate docstring coverage. Skipping the docstring coverage check.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

Comment @coderabbitai help to get the list of available commands and usage tips.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `requirements.txt`:
- Line 33: The project upgrades transformers to 5.3.0, but the SDPA patch import in transformers_sdpa_mask.py (`from transformers.integrations.executorch import sdpa_mask_without_vmap`) no longer exists, and the current try/except silently skips applying the patch. Update the code or the requirements in one of three ways: pin transformers to a compatible 4.x release in requirements.txt (e.g., revert to 4.57.3); modify transformers_sdpa_mask.py to detect the transformers major version (inspect `transformers.__version__`) and, for 5.x, implement or import an alternative SDPA mask API; or explicitly log and raise a clear error when sdpa_mask_without_vmap cannot be applied, so the failure is not silent. Also ensure that version checks such as `transformers.__version__ >= '4.53.0'` are replaced with proper semantic version parsing that distinguishes 4.x from 5.x behavior.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 60af28b4-1cd3-4da5-9dc1-e6821d63021d

📥 Commits

Reviewing files that changed from the base of the PR and between 0b5289f and 34d8e33.

📒 Files selected for processing (1)
  • requirements.txt

@tensorrt-cicd
Collaborator

PR_Github #42297 [ run ] triggered by Bot. Commit: 34d8e33 Link to invocation

@longlee0622 longlee0622 requested a review from a team as a code owner April 8, 2026 10:27
@longlee0622 longlee0622 requested a review from QiJune April 8, 2026 10:27
@longlee0622
Collaborator Author

/bot run --disable-fail-fast

@tensorrt-cicd
Collaborator

PR_Github #42326 [ run ] triggered by Bot. Commit: b151eaa Link to invocation

@tensorrt-cicd
Collaborator

PR_Github #42326 [ run ] completed with state SUCCESS. Commit: b151eaa
/LLM/main/L0_MergeRequest_PR pipeline #33115 completed with status: 'FAILURE'

CI Report

⚠️ Action Required:

  • Please check the failed tests and fix your PR
  • If you cannot view the failures, ask the CI triggerer to share details
  • Once fixed, request an NVIDIA team member to trigger CI again

Link to invocation

…oved

Transformers 5.x removed sdpa_mask_without_vmap from integrations.executorch.
Use functools.partial(sdpa_mask, use_vmap=False) when the legacy import fails.

Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com>
Made-with: Cursor
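The fallback this commit describes can be sketched as follows. A deterministic stand-in replaces the real transformers import so the sketch runs anywhere; the assumption (taken from the commit message, not verified against transformers) is that `sdpa_mask` accepts a `use_vmap` keyword:

```python
import functools

def sdpa_mask(*args, use_vmap=True, **kwargs):
    # Stand-in for the real transformers mask function; the use_vmap
    # keyword is an assumption based on the commit message.
    return {"use_vmap": use_vmap, "args": args}

def resolve_sdpa_mask_without_vmap(legacy_import):
    """Return the legacy 4.x helper if it still imports; otherwise fall
    back to functools.partial(sdpa_mask, use_vmap=False) as the commit
    describes."""
    try:
        return legacy_import()
    except ImportError:
        return functools.partial(sdpa_mask, use_vmap=False)

def legacy_import():
    # The real patch would attempt:
    #   from transformers.integrations.executorch import sdpa_mask_without_vmap
    # which raises ImportError on transformers 5.x.
    raise ImportError("sdpa_mask_without_vmap removed in transformers 5.x")

mask_fn = resolve_sdpa_mask_without_vmap(legacy_import)
```

The point of the pattern is that the except branch produces a working replacement rather than silently skipping the patch, which was the failure mode CodeRabbit flagged.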
AutoModelForVision2Seq was removed from the public API in Transformers v5.
Fall back to AutoModelForImageTextToText when the legacy name is unavailable.

Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com>
Made-with: Cursor
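The class fallback this commit describes follows the same shape. The helper below is a sketch (the name `resolve_auto_vlm_class` is mine; the real patch presumably uses a plain try/except ImportError around the direct import), demonstrated against a simulated v5 module where the legacy name is gone:

```python
from types import SimpleNamespace

def resolve_auto_vlm_class(transformers_module):
    """Prefer the legacy AutoModelForVision2Seq name; fall back to the
    Transformers v5 replacement AutoModelForImageTextToText."""
    for name in ("AutoModelForVision2Seq", "AutoModelForImageTextToText"):
        cls = getattr(transformers_module, name, None)
        if cls is not None:
            return cls
    raise ImportError("no vision-language auto-model class available")

# Simulated v5 namespace: only the new name exists.
fake_v5 = SimpleNamespace(AutoModelForImageTextToText=object)
assert resolve_auto_vlm_class(fake_v5) is object
```

Keeping the legacy name first preserves behavior on 4.x installs while making the 5.x path explicit.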
Transformers v5 removed these helpers from modeling_utils. Add
hf_parameter_utils with ImportError fallback matching ModuleUtilsMixin
behavior; update CLIP, SigLIP, and Wan call sites.

Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com>
Made-with: Cursor
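A minimal sketch of the fallback helpers this commit describes, mirroring ModuleUtilsMixin's approach of inferring a module's device and dtype from its parameters. The helper names and exact semantics are assumptions based on the commit message (the real hf_parameter_utils may differ), and a fake module stands in for torch so the sketch is self-contained:

```python
from collections import namedtuple

def get_parameter_device(module):
    """Fallback mirroring ModuleUtilsMixin: device of the first parameter.

    Name and first-parameter semantics are assumptions, not the actual
    hf_parameter_utils implementation."""
    for param in module.parameters():
        return param.device
    raise ValueError("module has no parameters")

def get_parameter_dtype(module):
    """Fallback mirroring ModuleUtilsMixin: dtype of the first parameter."""
    for param in module.parameters():
        return param.dtype
    raise ValueError("module has no parameters")

# Stand-in for a torch.nn.Module so the sketch runs without torch installed.
FakeParam = namedtuple("FakeParam", ["device", "dtype"])

class FakeModule:
    def parameters(self):
        yield FakeParam(device="cuda:0", dtype="float16")
```

In the patched code these would live behind a try/except ImportError: import the helpers from transformers.modeling_utils when available, and fall back to local definitions like the above on v5.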
@longlee0622 longlee0622 requested review from a team as code owners April 8, 2026 15:29
@longlee0622
Collaborator Author

/bot run

@tensorrt-cicd
Collaborator

PR_Github #42355 [ run ] triggered by Bot. Commit: 0510311 Link to invocation

@tensorrt-cicd
Collaborator

PR_Github #42355 [ run ] completed with state FAILURE. Commit: 0510311
/LLM/main/L0_MergeRequest_PR pipeline #33140 completed with status: 'FAILURE'

CI Report

⚠️ Action Required:

  • Please check the failed tests and fix your PR
  • If you cannot view the failures, ask the CI triggerer to share details
  • Once fixed, request an NVIDIA team member to trigger CI again

Link to invocation
