
fix: add Float8 dtype fallback in modeling_utils.py#44616

Closed
s-zx wants to merge 1 commit into huggingface:main from s-zx:fix/44589-float8-dtype-fallback

Conversation


@s-zx s-zx commented Mar 11, 2026

Summary

Add a fallback to bfloat16 when setting a Float8 default dtype fails, preventing a TypeError when loading FP8 models on PyTorch builds without Float8_e4m3fnStorage support.

Root Cause

torch.set_default_dtype(dtype) raises "TypeError: couldn't find storage object Float8_e4m3fnStorage" on PyTorch builds where Float8 storage is not available.

Fix

Wrap the torch.set_default_dtype call in try/except and fall back to bfloat16 when the TypeError message indicates a missing Float8 dtype.
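A minimal sketch of the fallback described above. The helper name, the stand-in setter, and the string check are illustrative assumptions, not the actual patch; the real change wraps torch.set_default_dtype inside modeling_utils.py.

```python
# Sketch of the described fallback (illustrative names; the real patch
# calls torch.set_default_dtype directly inside modeling_utils.py).
def set_default_dtype_with_fallback(set_dtype, dtype, fallback="bfloat16"):
    """Try to set `dtype`; on a Float8-related TypeError, use `fallback`."""
    try:
        set_dtype(dtype)
        return dtype
    except TypeError as err:
        # Builds without FP8 support raise:
        #   TypeError: couldn't find storage object Float8_e4m3fnStorage
        if "Float8" in str(err):
            set_dtype(fallback)
            return fallback
        # Unrelated TypeErrors are re-raised unchanged.
        raise

# Stand-in setter simulating a PyTorch build without Float8 storage.
def fake_set_dtype(dtype):
    if "float8" in dtype:
        raise TypeError("couldn't find storage object Float8_e4m3fnStorage")

print(set_default_dtype_with_fallback(fake_set_dtype, "float8_e4m3fn"))
```

Only Float8-related failures are swallowed; any other TypeError still propagates, so unrelated bugs are not masked by the fallback.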

Fixes #44589

Float8_e4m3fn dtypes may not be available in all PyTorch builds.
Add fallback to bfloat16 when Float8 dtype fails to prevent
TypeError during model loading.

Fixes huggingface#44589
Signed-off-by: s-zx <2575376715@qq.com>


Development

Successfully merging this pull request may close these issues.

TypeError: couldn't find storage object Float8_e4m3fnStorage

2 participants