
Conversation

kaixuanliu (Contributor) commented on Oct 17, 2025

When we run a unit test like `pytest -rA tests/pipelines/wan/test_wan_22.py::Wan22PipelineFastTests::test_save_load_float16`, we found that the pipeline runs entirely in fp16, but after save and reload some parts of the text encoder in `pipe_loaded` are in fp32, even though `torch_dtype` is set to fp16 explicitly. Deeper investigation traced the root cause to L783. This PR adjusts the test case to manually add the `component = component.to(torch_device).half()` operation so that `pipe_loaded` aligns exactly with the behavior of `pipe`.
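
For context, here is a minimal sketch of where the extra cast goes. It assumes the usual shape of the shared `test_save_load_float16` helper (the surrounding setup/comparison code, the `torch_device` import, and the local `_to_np` helper are paraphrased for illustration); only the loop over `pipe_loaded.components` reflects the adjustment described above, so treat this as an illustration rather than the exact diff:

```python
import tempfile

import numpy as np
import torch

from diffusers.utils.testing_utils import torch_device  # assumed import for the sketch


def _to_np(x):
    # Local helper for this sketch: convert torch tensors to numpy for comparison.
    return x.detach().cpu().numpy() if isinstance(x, torch.Tensor) else np.asarray(x)


def test_save_load_float16(self, expected_max_diff=1e-2):
    # The original pipeline is built with every castable component moved to fp16 up front.
    components = self.get_dummy_components()
    for name, module in components.items():
        if hasattr(module, "half"):
            components[name] = module.to(torch_device).half()
    pipe = self.pipeline_class(**components)
    pipe.to(torch_device)
    pipe.set_progress_bar_config(disable=None)

    inputs = self.get_dummy_inputs(torch_device)
    output = pipe(**inputs)[0]

    with tempfile.TemporaryDirectory() as tmpdir:
        pipe.save_pretrained(tmpdir)
        pipe_loaded = self.pipeline_class.from_pretrained(tmpdir, torch_dtype=torch.float16)

        # Adjustment described above: explicitly cast the reloaded components to fp16
        # so pipe_loaded matches pipe even when from_pretrained leaves some text-encoder
        # submodules in fp32 (nn.Module.half() casts the module in place).
        for name, component in pipe_loaded.components.items():
            if hasattr(component, "half"):
                component = component.to(torch_device).half()

        pipe_loaded.to(torch_device)
        pipe_loaded.set_progress_bar_config(disable=None)

    inputs = self.get_dummy_inputs(torch_device)
    output_loaded = pipe_loaded(**inputs)[0]

    max_diff = np.abs(_to_np(output) - _to_np(output_loaded)).max()
    self.assertLess(max_diff, expected_max_diff)
```

Casting the reloaded components the same way the original components were cast keeps the output comparison meaningful, since both pipelines then run with identical fp16 weights.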

Signed-off-by: Liu, Kaixuan <kaixuan.liu@intel.com>
kaixuanliu (Contributor, Author) commented

@a-r-r-o-w @DN6 please help review, thanks!
