Describe the bug
Unable to load the Flux fp8 model from Kijai/flux-fp8 with `from_single_file` when using a local copy of `transformer_flux.py`. I have modified the local script to remove any import errors, and I added some print statements to `single_file_model.py` to check why the model is not loading.
Reproduction
The code below works fine. These are the print statements I added to `_get_single_file_loadable_mapping_class` in `single_file_model.py`:
```python
def _get_single_file_loadable_mapping_class(cls):
    # debug prints added to see which loadable class the passed-in cls matches
    print(cls)
    diffusers_module = importlib.import_module(__name__.split(".")[0])
    for loadable_class_str in SINGLE_FILE_LOADABLE_CLASSES:
        loadable_class = getattr(diffusers_module, loadable_class_str)
        print(cls, loadable_class)
        print(issubclass(cls, loadable_class))
        if issubclass(cls, loadable_class):
            return loadable_class_str
    return None
```
Loading the model with the class imported directly from diffusers:

```python
import torch
from diffusers import FluxTransformer2DModel

transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-schnell-fp8-e4m3fn.safetensors",
    torch_dtype=torch.bfloat16,
)
```
I get the following output:
```
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_stable_cascade.StableCascadeUNet'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_2d_condition.UNet2DConditionModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.autoencoders.autoencoder_kl.AutoencoderKL'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet.ControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_sd3.SD3Transformer2DModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_motion_model.MotionAdapter'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet_sparsectrl.SparseControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
True
```
But when I use the class from my local copy of `transformer_flux.py`:

```python
import torch
from transformer_flux import FluxTransformer2DModel

FluxTransformer2DModel.__module__ = 'diffusers.models.transformers.transformer_flux'

transformer = FluxTransformer2DModel.from_single_file(
    "https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-schnell-fp8-e4m3fn.safetensors",
    torch_dtype=torch.bfloat16,
)
```
it fails with the following output and error:
```
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_stable_cascade.StableCascadeUNet'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_2d_condition.UNet2DConditionModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.autoencoders.autoencoder_kl.AutoencoderKL'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet.ControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_sd3.SD3Transformer2DModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.unets.unet_motion_model.MotionAdapter'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.controlnet_sparsectrl.SparseControlNetModel'>
False
<class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'> <class 'diffusers.models.transformers.transformer_flux.FluxTransformer2DModel'>
False
Traceback (most recent call last):
  File "/workspace/GarmentTransferV2/test.py", line 441, in <module>
    main(args)
  File "/workspace/GarmentTransferV2/test.py", line 368, in main
    transformer_garment = FluxTransformerGarment2DModel.from_single_file(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/workspace/garment/lib/python3.11/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
  File "/workspace/garment/lib/python3.11/site-packages/diffusers/loaders/single_file_model.py", line 182, in from_single_file
    raise ValueError(
ValueError: FromOriginalModelMixin is currently only compatible with StableCascadeUNet, UNet2DConditionModel, AutoencoderKL, ControlNetModel, SD3Transformer2DModel, MotionAdapter, SparseControlNetModel, FluxTransformer2DModel
```
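From what I can tell, overriding `__module__` only changes how the class is displayed; it does not make my local class an actual subclass of the library's `FluxTransformer2DModel`, so `issubclass` still returns `False`. A minimal demonstration of that behaviour (class names here are made up for illustration, not diffusers code):

```python
class LibraryModel:      # stands in for diffusers' FluxTransformer2DModel
    pass

class LocalCopy:         # an independent reimplementation, not a subclass of LibraryModel
    pass

LocalCopy.__module__ = LibraryModel.__module__

print(LocalCopy)                            # repr now looks like the library class
print(issubclass(LocalCopy, LibraryModel))  # still False: issubclass follows the MRO, not names
```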
Any leads would be appreciated.
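One workaround I am considering (not yet verified against this checkpoint) is to make the local class inherit from the diffusers class instead of patching `__module__`, so that the `issubclass` check in `single_file_model.py` passes. A rough sketch of that idea, assuming the local file can import diffusers:

```python
import torch
import diffusers

# Hypothetical local subclass: because it inherits from the library class,
# issubclass(cls, diffusers.FluxTransformer2DModel) is True and the
# single-file loader can resolve the mapping; custom layers would be
# added by overriding __init__/forward here.
class FluxTransformerGarment2DModel(diffusers.FluxTransformer2DModel):
    pass

transformer = FluxTransformerGarment2DModel.from_single_file(
    "https://huggingface.co/Kijai/flux-fp8/blob/main/flux1-schnell-fp8-e4m3fn.safetensors",
    torch_dtype=torch.bfloat16,
)
```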
Logs
No response
System Info
- 🤗 Diffusers version: 0.30.3
- Platform: Linux-6.8.0-40-generic-x86_64-with-glibc2.35
- Running on Google Colab?: No
- Python version: 3.11.9
- PyTorch version (GPU?): 2.4.1+cu121 (True)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Huggingface_hub version: 0.25.2
- Transformers version: 4.45.2
- Accelerate version: 1.0.1
- PEFT version: not installed
- Bitsandbytes version: not installed
- Safetensors version: 0.4.5
- xFormers version: not installed
- Accelerator: NVIDIA H100 80GB HBM3, 81559 MiB
- Using GPU in script?:
Who can help?