Conversation

hmellor
Member

@hmellor hmellor commented Sep 11, 2025

Supersedes #24043

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@hmellor hmellor enabled auto-merge (squash) September 11, 2025 07:10
@mergify mergify bot added the ci/build label Sep 11, 2025
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request updates the transformers dependency to version 4.56.1 in several requirement files. The changes are straightforward, involving a simple version number update across multiple files.

@github-actions github-actions bot added the ready ONLY add when PR is ready to merge/full CI is needed label Sep 11, 2025
@Isotr0py Isotr0py disabled auto-merge September 11, 2025 09:28
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@njhill njhill added the ci-no-fail-fast Disable fail-fast behavior of CI tests (continue with remaining tests after failures). label Sep 11, 2025
@DarkLight1337
Member

Looks like HF backend for Idefics3 is broken by this upgrade: https://buildkite.com/vllm/ci/builds/30368/steps/canvas?sid=01993997-f436-4816-b6bb-ac9f68f04cc2

@hmellor
Member Author

hmellor commented Sep 12, 2025

This test is not currently run on main:

tests/models/language/generation/test_hybrid.py::test_models[5-64-yujiepan/mamba2-codestral-v0.1-tiny-random] SKIPPED (`transformers==4.55.2` installed, but `transformers>=4.55.3` is required to run this model.)

Even when upgrading to the most similar version, v4.55.3, this test fails. @tdoublep I see you added this in #23936, did this test pass for you in your development environment?

@hmellor
Member Author

hmellor commented Sep 12, 2025

This test is not currently run on main:

tests/models/multimodal/processing/test_common.py::test_processing_correctness[1.0-32-0.3-zai-org/GLM-4.5V] SKIPPED (`transformers==4.55.2` installed, but `transformers>=4.56` is required to run this model.)

When upgrading to v4.56.0 or v4.56.1, this test still fails with:

>               if metadata["total_num_frames"] != len(video_array):
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E               TypeError: 'NoneType' object is not subscriptable

vllm/model_executor/models/glm4_1v.py:1132: TypeError

@DarkLight1337 I see you labelled this as requiring v4.56 in #23181, has anyone run this test locally?
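For context, the `TypeError` above arises when the video `metadata` dict is `None` and the code indexes into it anyway. A minimal sketch of a defensive guard (hypothetical function names, not the actual `glm4_1v.py` code):

```python
def check_frame_count(metadata, video_array):
    # Guard against missing metadata before indexing into it,
    # mirroring the failing check at glm4_1v.py:1132.
    if metadata is None or metadata.get("total_num_frames") is None:
        # Fall back to the decoded frame count when metadata is absent.
        return len(video_array)
    if metadata["total_num_frames"] != len(video_array):
        raise ValueError(
            f"Expected {metadata['total_num_frames']} frames, "
            f"got {len(video_array)}"
        )
    return metadata["total_num_frames"]

# With no metadata, the guard avoids the TypeError seen in CI.
print(check_frame_count(None, [0, 1, 2]))               # 3
print(check_frame_count({"total_num_frames": 2}, [0, 1]))  # 2
```

Whether the real fix belongs in vLLM's processor or in the upstream video loader is exactly what the thread goes on to discuss.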

@hmellor
Member Author

hmellor commented Sep 12, 2025

The Emu3 test requires a fix on the Transformers side; we'll make a patch for this ASAP.

@DarkLight1337
Member

No, I marked it as v4.56 only because the model isn't in the v4.55.0 release

@hmellor
Member Author

hmellor commented Sep 12, 2025

Ok, the zai-org/GLM-4.5V failure appears to be a problem on the vLLM side. @zRzRzRzRzRzRzR could you look into this?

@tdoublep
Member

@hmellor I did test it in my local dev environment some time ago (when I put up that PR) and it passed. But I can confirm that it doesn't pass on main right now. I will try to figure out what changed. We can also skip the test for now if it's blocking something.

@DarkLight1337
Member

Ok, the zai-org/GLM-4.5V failure appears to be a problem on the vLLM side.

@Isotr0py is this patched by #24161 (comment)?

@Isotr0py
Member

is this patched by #24161 (comment)?

Yes, and the corresponding fix should be quite similar to Qwen3-VL's video processing implementation. I will fix it once we have addressed the Qwen3-VL processing test failure in #24727 (comment)

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@hmellor
Member Author

hmellor commented Sep 12, 2025

Thanks @tdoublep, it would be nice not to postpone the fix, but if it's too difficult to solve quickly I can change the reason the test skips so that this PR can be merged.
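For reference, a version-gated skip like the one quoted in the CI logs boils down to comparing the installed transformers version against a per-model minimum. A minimal sketch with hypothetical helper names (not vLLM's actual skip machinery):

```python
def version_tuple(v):
    # Parse "4.55.2" -> (4, 55, 2) for simple comparisons.
    # A sketch, not a full PEP 440 parser.
    return tuple(int(part) for part in v.split("."))

def skip_reason(installed, required):
    # Return a skip message like the one in the CI log when the
    # installed transformers is older than the model's requirement,
    # or None when the test should run.
    if version_tuple(installed) < version_tuple(required):
        return (f"`transformers=={installed}` installed, but "
                f"`transformers>={required}` is required to run this model.")
    return None

print(skip_reason("4.55.2", "4.55.3"))  # skip message
print(skip_reason("4.56.1", "4.55.3"))  # None: test runs
```

Changing "the reason the test skips" then just means adjusting the `required` threshold (or the message) so the skip no longer blocks the upgrade.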

@tdoublep
Member

@hmellor I must have made a mistake previously - I don't see how it could have been working in my local dev environment. The model is broken in transformers because the n_groups=8 parameter is ignored in the RMSNorm. I've put up a PR to fix that here: huggingface/transformers#40861

I can confirm that with that fix the test is passing:

python -m pytest "tests/models/language/generation/test_hybrid.py::test_models[5-64-yujiepan/mamba2-codestral-v0.1-tiny-random]"
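For readers unfamiliar with the parameter: honoring `n_groups` means the RMS statistic is computed independently per group of channels rather than over the whole vector. A pure-Python illustration of that idea (a sketch, not the transformers implementation):

```python
import math

def grouped_rms_norm(x, n_groups, eps=1e-6):
    # Normalize each contiguous group of channels by its own RMS.
    # With n_groups=1 this reduces to ordinary RMSNorm; ignoring
    # n_groups (the reported bug) silently uses a single group.
    group_size = len(x) // n_groups
    out = []
    for g in range(n_groups):
        group = x[g * group_size:(g + 1) * group_size]
        rms = math.sqrt(sum(v * v for v in group) / group_size + eps)
        out.extend(v / rms for v in group)
    return out

x = [1.0, 2.0, 3.0, 4.0]
# Per-group normalization (n_groups=2) gives different values than
# whole-vector normalization (n_groups=1), which is why the bug
# changes model outputs.
print(grouped_rms_norm(x, n_groups=2))
print(grouped_rms_norm(x, n_groups=1))
```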

@hmellor
Member Author

hmellor commented Sep 13, 2025

Thanks for the PR @tdoublep! I'll see what I can do about getting it expedited so that we can upgrade to v4.56.2 ASAP

@Isotr0py
Member

Ok, the zai-org/GLM-4.5V failure appears to be a problem on the vLLM side.

The processor test should be fixed by #24822.

@DarkLight1337
Member

Can you update this PR? The tests should pass after #25644

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@hmellor hmellor changed the title Update to Transformers v4.56.1 Update to Transformers v4.56.2 Sep 25, 2025
Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@DarkLight1337 DarkLight1337 enabled auto-merge (squash) September 25, 2025 17:47
@DarkLight1337
Member

Hmm, maybe we still need the changes in the transformers backend?

@hmellor
Member Author

hmellor commented Sep 27, 2025

Seems like it; I'll re-add the changes in a more polished way next week.

Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
@hmellor hmellor requested a review from ywang96 as a code owner September 30, 2025 17:49
@mergify mergify bot added the multi-modality Related to multi-modality (#4194) label Sep 30, 2025
@hmellor
Member Author

hmellor commented Sep 30, 2025

I've seen these errors in other PRs that have recently been updated from main

@vllm-bot vllm-bot merged commit 2a69ab4 into vllm-project:main Oct 1, 2025
79 of 84 checks passed
@hmellor hmellor deleted the transformers-4-56 branch October 1, 2025 07:01
pdasigi pushed a commit to pdasigi/vllm that referenced this pull request Oct 2, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
Co-authored-by: Isotr0py <mozf@mail2.sysu.edu.cn>
yewentao256 pushed a commit that referenced this pull request Oct 3, 2025
Signed-off-by: Harry Mellor <19981378+hmellor@users.noreply.github.com>
Co-authored-by: Cyrus Leung <tlleungac@connect.ust.hk>
Co-authored-by: Cyrus Leung <cyrus.tl.leung@gmail.com>
Co-authored-by: Isotr0py <mozf@mail2.sysu.edu.cn>
Signed-off-by: yewentao256 <zhyanwentao@126.com>