@zucchini-nlp FYI I tested PR #36679 and it resolves the issue on my side. I also tested the SD3 pipeline (which had 2 of its 3 text encoders loaded as FP16 instead of the assigned BF16), and all 3 text encoders now load correctly. Looking forward to this fix being merged :)
### System Info

`transformers` version: 4.50.0.dev0

### Who can help?
@zucchini-nlp @ArthurZucker
### Information

### Tasks

- An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)

### Reproduction
Some components are loaded with an incorrect `dtype`. @zucchini-nlp, a `git bisect` analysis shows that this bug was introduced in 84a6789. Please install diffusers for a simple reproducer.

Reproducer script (we instantiate the SDXL pipeline with the BF16 `dtype` and expect its components to be of this type):

Output (correct) before 84a6789:
Output (incorrect) after 84a6789 was merged:
Now the 2nd text encoder is incorrectly loaded as `torch.float16`, which can cause a series of downstream issues (e.g. see huggingface/optimum-habana#1815). @zucchini-nlp, can you please help address this issue?
### Expected behavior

The 2nd text encoder should be loaded as BF16 (`torch.bfloat16`) and not as FP16 (`torch.float16`). This was correctly the case before the bug was introduced.