
Enable glm46v UTs on XPU#42274

Merged
ydshieh merged 1 commit into huggingface:main from YangKai0616:glm46v-xpu
Nov 19, 2025

Conversation

@YangKai0616 (Contributor) commented Nov 19, 2025

What does this PR do?

This PR enables the glm46v unit tests on XPU.

Notes:

  1. [General] Directly running `Glm46VIntegrationTest` currently fails with:

  FAILED tests/models/glm46v/test_modeling_glm46v.py::Glm46VIntegrationTest::test_small_model_integration_test_batch_flashatt2 - RuntimeError: Expected tensor for argument #1 'indices' to have one of the following scalar types: Long, Int; but got XPUBFloat16Type instead (while checking argumen...

  This needs zai-org/GLM-4.1V-9B-Thinking PR 21 to be merged first; as a workaround, change the vision tower construction here to `self.visual = AutoModel.from_config(config.vision_config.vision_config)`.

  2. [XPU] Tests related to flash_attention_2 need to wait for PR 41956 to be merged.
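To make the workaround in note 1 concrete, here is a minimal stand-in sketch (plain Python, no transformers dependency) of the nested config shape it assumes: the checkpoint wraps the real vision configuration one level deeper than the model code expects, so the fix reaches through `config.vision_config.vision_config`. The attribute values below are illustrative, not taken from the actual checkpoint.

```python
from types import SimpleNamespace

# Hypothetical stand-in for the GLM-4.6V config tree: config.vision_config is
# itself a wrapper that holds another vision_config inside it.
config = SimpleNamespace(
    vision_config=SimpleNamespace(
        vision_config=SimpleNamespace(model_type="glm4v_vision")
    )
)

# What the current code effectively passes to AutoModel.from_config: the
# wrapper, which has no model_type of its own.
wrapper = config.vision_config
print(hasattr(wrapper, "model_type"))  # False

# The workaround reaches one level deeper, i.e. the equivalent of
# self.visual = AutoModel.from_config(config.vision_config.vision_config)
real_vision_config = config.vision_config.vision_config
print(real_vision_config.model_type)  # glm4v_vision
```

Once the upstream checkpoint PR lands, the extra nesting should disappear and the one-level access works again.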

@github-actions (Contributor) commented

[For maintainers] Suggested jobs to run (before merge)

run-slow: glm46v

@Rocketknight1 (Member) commented

cc @ydshieh

@ydshieh ydshieh merged commit 5804c1f into huggingface:main Nov 19, 2025
15 checks passed
@ydshieh (Collaborator) commented Nov 19, 2025

thank you 🤗

SangbumChoi pushed a commit to SangbumChoi/transformers that referenced this pull request Jan 23, 2026


3 participants