[None][chore] DO NOT REVIEW, test PR#12829
Conversation
Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com>
/bot run --disable-fail-fast
📝 Walkthrough: Updated the Hugging Face transformers dependency to version 5.3.0.
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `requirements.txt`:
- Line 33: The PR upgrades `transformers` to 5.3.0, but the SDPA patch import in `transformers_sdpa_mask.py` (`from transformers.integrations.executorch import sdpa_mask_without_vmap`) no longer exists, and the current try/except silently skips applying the patch. Update the code or the requirements: either pin `transformers` to a compatible 4.x release in `requirements.txt` (e.g., revert to 4.57.3), or modify `transformers_sdpa_mask.py` to detect the transformers major version (inspect `transformers.__version__`) and, for 5.x, either import or implement an alternative SDPA mask API, or explicitly log and raise a clear error when `sdpa_mask_without_vmap` cannot be applied so the failure is not silent. Also replace string version checks (e.g., `transformers.__version__ >= '4.53.0'`) with proper semantic version parsing to distinguish 4.x from 5.x behavior.
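The last point matters because Python compares version strings lexically, character by character. A minimal sketch of a safer check (`parse_version` is a hypothetical local helper for illustration, not a transformers or TRT-LLM API):

```python
def parse_version(version: str) -> tuple:
    """Convert a version string like '4.53.0' into a comparable tuple of ints.
    Pre-release suffixes (e.g. '5.0.0rc1') are truncated at the first non-digit."""
    parts = []
    for piece in version.split("."):
        digits = ""
        for ch in piece:
            if not ch.isdigit():
                break
            digits += ch
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

# Lexical string comparison misorders versions: '9' > '5' as characters,
# so '4.9.0' >= '4.53.0' is True even though 4.9.0 predates 4.53.0.
assert ("4.9.0" >= "4.53.0") is True
assert parse_version("4.9.0") < parse_version("4.53.0")
# A reliable major-version gate for the 4.x vs 5.x split:
assert parse_version("5.3.0")[0] >= 5
```

In production code, `packaging.version.parse` from the `packaging` library handles this (including pre-release ordering) more robustly than a hand-rolled parser.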
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 60af28b4-1cd3-4da5-9dc1-e6821d63021d
📒 Files selected for processing (1)
requirements.txt
PR_Github #42297 [ run ] triggered by Bot. Commit:
/bot run --disable-fail-fast
PR_Github #42326 [ run ] triggered by Bot. Commit:
PR_Github #42326 [ run ] completed with state
…oved Transformers 5.x removed sdpa_mask_without_vmap from integrations.executorch. Use functools.partial(sdpa_mask, use_vmap=False) when the legacy import fails. Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com> Made-with: Cursor
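The fallback described in this commit can be sketched as below. Because `sdpa_mask` lives inside transformers, a local stub stands in for it here so the pattern runs without the library installed; the stub's signature (a `use_vmap` keyword) is an assumption taken from the commit message:

```python
import functools

def sdpa_mask(*mask_args, use_vmap=True, **mask_kwargs):
    """Local stand-in for transformers' sdpa_mask (assumed to accept a
    use_vmap keyword in 5.x); returns its config so the pattern is observable."""
    return {"use_vmap": use_vmap}

try:
    # Transformers 4.x exposed a dedicated no-vmap helper.
    from transformers.integrations.executorch import sdpa_mask_without_vmap
except ImportError:
    # Transformers 5.x removed it; the equivalent is sdpa_mask(use_vmap=False).
    sdpa_mask_without_vmap = functools.partial(sdpa_mask, use_vmap=False)
```

Either branch leaves `sdpa_mask_without_vmap` bound to a callable, so downstream call sites need no changes.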
AutoModelForVision2Seq was removed from the public API in Transformers v5. Fall back to AutoModelForImageTextToText when the legacy name is unavailable. Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com> Made-with: Cursor
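The renaming this commit works around can be handled with a small resolver. `resolve_vlm_auto_class` is a hypothetical helper name, and the final `None` branch exists only so the sketch degrades gracefully when transformers is absent:

```python
def resolve_vlm_auto_class():
    """Return whichever vision-language Auto class this transformers version
    exposes: the 4.x name first, then its 5.x replacement, else None."""
    try:
        from transformers import AutoModelForVision2Seq  # public through 4.x
        return AutoModelForVision2Seq
    except ImportError:
        pass
    try:
        from transformers import AutoModelForImageTextToText  # 5.x replacement
        return AutoModelForImageTextToText
    except ImportError:
        return None  # transformers not installed at all
```

Call sites would then use `resolve_vlm_auto_class().from_pretrained(...)` instead of naming either class directly.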
Transformers v5 removed these helpers from modeling_utils. Add hf_parameter_utils with ImportError fallback matching ModuleUtilsMixin behavior; update CLIP, SigLIP, and Wan call sites. Signed-off-by: Jonas Li <6110159+longlee0622@users.noreply.github.com> Made-with: Cursor
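The commit does not name the exact helpers. Assuming one of them is `get_parameter_dtype` (which does exist in 4.x `transformers.modeling_utils`), the ImportError fallback could look roughly like this, approximating rather than exactly reproducing `ModuleUtilsMixin` behavior:

```python
try:
    # Present in transformers 4.x; removed from modeling_utils in v5.
    from transformers.modeling_utils import get_parameter_dtype
except ImportError:
    def get_parameter_dtype(module):
        """Fallback sketch: dtype of the module's first parameter, then its
        first buffer, else None. (The real 4.x helper also special-cases
        autocast and non-floating-point parameters.)"""
        for param in module.parameters():
            return param.dtype
        for buf in module.buffers():
            return buf.dtype
        return None
```

Grouping such shims in one `hf_parameter_utils` module, as the commit describes, keeps the CLIP, SigLIP, and Wan call sites free of version checks.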
b151eaa to 0510311 (force-push)
/bot run
PR_Github #42355 [ run ] triggered by Bot. Commit:
PR_Github #42355 [ run ] completed with state
Summary by CodeRabbit
Updated the Hugging Face transformers dependency to version 5.3.0.
Description
Test Coverage
PR Checklist
Please review the following before submitting your PR:
PR description clearly explains what and why. If using CodeRabbit's summary, please make sure it makes sense.
PR Follows TRT-LLM CODING GUIDELINES to the best of your knowledge.
Test cases are provided for new code paths (see test instructions)
Any new dependencies have been scanned for license and vulnerabilities
CODEOWNERS updated if ownership changes
Documentation updated as needed
Update tava architecture diagram if there is a significant design change in PR.
The reviewers assigned automatically/manually are appropriate for the PR.
Please check this after reviewing the above items as appropriate for this PR.
GitHub Bot Help
To see a list of available CI bot commands, please comment /bot help.