[None][chore] Update flashinfer-python from 0.6.8 to 0.6.9 #13631
Merged by yihwang-nv (2 commits into NVIDIA:main)
Conversation
Bump flashinfer-python dependency to the latest stable release. Updated version pins in requirements.txt, security_scanning/pyproject.toml, security_scanning/poetry.lock, and ATTRIBUTIONS-Python.md. Signed-off-by: Yihan Wang <yihwang@nvidia.com>
CodeRabbit review: No actionable comments were generated in the recent review. Review profile: CHILL. Files ignored due to path filters (1); files selected for processing (3). Estimated code review effort: 1 (Trivial), ~3 minutes. Pre-merge checks: 5 passed.
/bot run --disable-fail-fast

PR_Github #46267 [ run ] triggered by Bot. Commit:

Signed-off-by: Yihan Wang <yihwang@nvidia.com>

/bot run --disable-fail-fast

PR_Github #46307 [ run ] triggered by Bot. Commit:

PR_Github #46307 [ run ] completed with state

/bot run --disable-fail-fast

PR_Github #46312 [ run ] triggered by Bot. Commit:

PR_Github #46312 [ run ] completed with state
juney-nvidia left a comment
Approved from OSS compliance perspective.
Summary
- Updated version pins in requirements.txt, security_scanning/pyproject.toml, security_scanning/poetry.lock, and ATTRIBUTIONS-Python.md
- nvidia-cutlass-dsl updated to 4.4.2 (added nvidia-cutlass-dsl-libs-base and cuda-tile to flashinfer's deps)
- The version check in tensorrt_llm/_torch/speculative/interface.py (flashinfer.__version__ >= "0.6.4") remains satisfied

Test plan
- pip install -r requirements.txt installs successfully
- pytest tests/unittest/_torch/flashinfer/ -v
- pytest tests/unittest/_torch/attention/test_flashinfer_attention.py -v
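The version gate mentioned in the summary compares version strings. A minimal sketch of a robust form of such a check is below; note this is an illustration only, and the actual check in tensorrt_llm/_torch/speculative/interface.py may be implemented differently. The helper names here (`version_tuple`, `meets_minimum`) are hypothetical, not from the codebase.

```python
# Hedged sketch of a minimum-version gate like the
# flashinfer.__version__ >= "0.6.4" check referenced above.
# Numeric tuple comparison avoids the lexicographic pitfall of
# comparing raw strings, where "0.10.0" < "0.6.4".

def version_tuple(v: str) -> tuple:
    """Parse a simple dotted version string like "0.6.9" into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def meets_minimum(installed: str, required: str = "0.6.4") -> bool:
    """Return True if installed >= required, compared numerically per component."""
    return version_tuple(installed) >= version_tuple(required)

print(meets_minimum("0.6.9"))   # version pinned by this PR -> True
print(meets_minimum("0.10.0"))  # raw string comparison would wrongly say False
```

For release strings with pre-release or local suffixes (e.g. "0.6.9rc1"), this naive parser would fail; a production check would typically use `packaging.version.Version` instead.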