
Fix non-deterministic T5 outputs in HiDream pipeline tests#13534

Merged
DN6 merged 2 commits into huggingface:main from kaixuanliu:hidream-image-fix
Apr 21, 2026
Conversation

@kaixuanliu
Contributor

@kaixuanliu kaixuanliu commented Apr 21, 2026

This PR fixes two failing test cases in the HiDream pipeline tests:

tests/pipelines/hidream_image/test_pipeline_hidream.py::HiDreamImagePipelineFastTests::test_cpu_offload_forward_pass_twice
tests/pipelines/hidream_image/test_pipeline_hidream.py::HiDreamImagePipelineFastTests::test_sequential_offload_forward_pass_twice

Root cause: In get_dummy_components(), T5EncoderModel(config) creates the model in training mode by default. The tiny-random-t5 config has dropout_rate=0.1, so each forward pass produces different embeddings due to active dropout — even under torch.no_grad(), which disables gradient tracking but not dropout. In production this isn't an issue because from_pretrained() automatically calls .eval(), but for text_encoder_3, which is constructed directly from a config, we need to call .eval() explicitly in the test file.
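The failure mode above can be reproduced in isolation. This is a minimal sketch (not the actual HiDream test code) using a toy module with the same dropout rate: a module built via its constructor starts in training mode, so dropout stays active even under torch.no_grad(), and calling .eval() restores determinism.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for a model built directly from a config; nn.Module
# defaults to training mode, and dropout_rate=0.1 matches tiny-random-t5.
model = nn.Sequential(nn.Linear(64, 64), nn.Dropout(p=0.1))
x = torch.randn(4, 64)

# Training mode: two forward passes differ, because dropout samples a new
# random mask on every call. torch.no_grad() does not change this.
with torch.no_grad():
    out1 = model(x)
    out2 = model(x)
assert not torch.equal(out1, out2)

# The fix mirrors this PR: call .eval() explicitly. from_pretrained()
# does this automatically; a bare constructor does not.
model.eval()
with torch.no_grad():
    out3 = model(x)
    out4 = model(x)
assert torch.equal(out3, out4)
```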

Signed-off-by: Liu, Kaixuan <kaixuan.liu@intel.com>
@github-actions github-actions Bot added tests size/S PR with diff < 50 LOC labels Apr 21, 2026
@kaixuanliu
Contributor Author

@DN6 @sayakpaul pls help review, thx!

Collaborator

@DN6 DN6 left a comment


Thanks @kaixuanliu!

@kaixuanliu
Contributor Author

The failed checks are pre-existing and unrelated to this change.

@github-actions github-actions Bot added size/S PR with diff < 50 LOC and removed size/S PR with diff < 50 LOC labels Apr 21, 2026
@DN6 DN6 merged commit 3d30b7d into huggingface:main Apr 21, 2026
11 of 12 checks passed

Labels

size/S PR with diff < 50 LOC tests

3 participants