Manually update vllm CI pin. #166494
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/166494
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV: there is 1 currently active SEV. If your PR is affected, please view it below.
✅ You can merge normally! (1 unrelated failure.) As of commit 55c770c with merge base 4710fd9, the following job failed but was likely due to flakiness present on trunk (FLAKY).
This comment was automatically generated by Dr. CI and updates every 15 minutes.
Thank you for the fix!
Looks like we've got problems. The vllm_basic_correctness_test failure appears to be a compile failure. Given that it succeeds in vLLM CI, I suspect a compile issue.
Should we make the pin update automatic? |
It's automated, e.g. #165274, but that PR has been blocked for two weeks now, so we ended up with this manual update to coordinate the fixes from vLLM and PyTorch.
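For context, the bump itself is small. Here is a minimal sketch of what the manual update amounts to, assuming the pin is a single vLLM commit SHA stored in `.github/ci_commit_pins/vllm.txt` (the path and helper name are assumptions for illustration, not the exact automation from #165274):

```python
# Minimal sketch of a manual vLLM CI pin bump. Assumes the pin is a single
# commit SHA in .github/ci_commit_pins/vllm.txt; check the repo for the real
# location and the existing update tooling before relying on this.
import pathlib
import subprocess

PIN_FILE = pathlib.Path(".github/ci_commit_pins/vllm.txt")

def bump_vllm_pin(new_sha: str) -> None:
    old_sha = PIN_FILE.read_text().strip()
    PIN_FILE.write_text(new_sha + "\n")
    # Stage and commit the pin change so it can go up as a PR like this one.
    subprocess.run(["git", "add", str(PIN_FILE)], check=True)
    subprocess.run(
        ["git", "commit", "-m",
         f"Update vLLM CI pin {old_sha[:7]} -> {new_sha[:7]}"],
        check=True,
    )

# Example (hypothetical SHA):
# bump_vllm_pin("a5916ab0123456789abcdef0123456789abcdef0")
```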
Update: I tried to repro the test failures locally, and it seems ~80% of the tests (e.g. the LoRA ones) get stuck on my machine and never finish (maybe I set something up wrong). For the remaining ~20% I can repro the failures, and they are still there when I turn off VLLM_USE_AOT_COMPILE, indicating it's more likely a general compiler/vLLM issue on trunk. I can keep an eye on this, but maybe we should get someone else to take a look for now.
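For reference, the local repro I tried looks roughly like the sketch below: run one of the failing test files with VLLM_USE_AOT_COMPILE disabled to see whether the failure is tied to the AOT compile path. The test path and pytest invocation are assumptions, not the exact CI command:

```python
# Rough local-repro sketch: run a failing vLLM test with the AOT compile
# path disabled via VLLM_USE_AOT_COMPILE=0. The test file below is an
# assumed example; substitute whichever test is failing in CI.
import os
import subprocess

env = dict(os.environ, VLLM_USE_AOT_COMPILE="0")  # turn off AOT compile
subprocess.run(
    ["pytest", "-x", "tests/basic_correctness/test_basic_correctness.py"],
    env=env,
    check=False,  # inspect the output rather than raising on failure
)
```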
Force-pushed 42a2703 to 83740c5
Force-pushed a96c412 to a5916ab
Need the following fixes to pytorch/vllm landed to restore trunk CI to green:
I have temporarily marked the vLLM trunk jobs as unstable in #169298 because we now need this vLLM change, vllm-project/vllm#29588, to fix an upstream issue from HuggingFace. Let's aim to update the pinned commit to the latest vLLM main in the next couple of days.
Force-pushed a5916ab to 893aad5
Force-pushed 893aad5 to 55c770c
@pytorchbot merge -f 'All vLLM tests are passing now, no need to run trunk jobs' |
Merge started. Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Pull Request resolved: #166494
Approved by: https://github.com/huydhn