use_safetensors is not passed to load_sub_model #11601

@AnthonyBarbier

Description

Describe the bug

In DiffusionPipeline.from_pretrained() (pipeline_utils.py), the use_safetensors option is only passed to the downloader, not to load_sub_model(). As a result, if safetensors files are already in the cache folder, they are loaded even when the user passed use_safetensors=False.
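For illustration, here is a minimal sketch of the file-selection logic a loader needs once the flag is forwarded. This is a hypothetical helper, not diffusers' actual code; the file names mirror the ones in the cache snapshot below.

```python
import os

def pick_weights_file(folder, use_safetensors=None):
    """Choose which weights file to load from a cached model folder.

    use_safetensors=None  -> prefer safetensors if present, else fall back to .bin
    use_safetensors=False -> never load safetensors, even if cached
    use_safetensors=True  -> require safetensors
    """
    safetensors_path = os.path.join(folder, "model.safetensors")
    bin_path = os.path.join(folder, "pytorch_model.bin")
    if use_safetensors is False:
        return bin_path  # explicit opt-out: ignore any cached safetensors
    if os.path.exists(safetensors_path):
        return safetensors_path
    if use_safetensors is True:
        raise FileNotFoundError("use_safetensors=True but no safetensors file found")
    return bin_path
```

The bug is that the user's choice never reaches a function like this, so the loader always runs with its own default of None.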

Reproduction

from diffusers import StableDiffusionPipeline

model_id = "stabilityai/stable-diffusion-2"
StableDiffusionPipeline.from_pretrained(model_id)  # Adds the safetensors model to the cache

StableDiffusionPipeline.from_pretrained(
    model_id,
    use_safetensors=False,  # Parameter ignored -> model.safetensors is still loaded
)

TRANSFORMERS_VERBOSITY=debug python3 repro.py

loading weights file /home/anthonyb/.cache/huggingface/hub/models--stabilityai--stable-diffusion-2/snapshots/1e128c8891e52218b74cde8f26dbfc701cb99d79/text_encoder/model.safetensors

Changing the default value of use_safetensors from None to False in PreTrainedModel.from_pretrained() (in transformers/modeling_utils.py), to emulate what would happen if the value were propagated to the loader, shows that the safetensors file is then skipped and the model is loaded from the .bin file instead:

loading weights file /home/anthonyb/.cache/huggingface/hub/models--stabilityai--stable-diffusion-2/snapshots/1e128c8891e52218b74cde8f26dbfc701cb99d79/text_encoder/pytorch_model.bin
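The difference between the two log lines can be sketched as a toy simulation (all names are hypothetical, not the real diffusers/transformers code): the pipeline forwards use_safetensors to the download step but drops it before calling the sub-model loader, whose own default of None then prefers the cached safetensors file.

```python
def choose(cached_files, use_safetensors=None):
    # Loader-side preference: safetensors unless explicitly disabled.
    if use_safetensors is not False and "model.safetensors" in cached_files:
        return "model.safetensors"
    return "pytorch_model.bin"

def buggy_from_pretrained(cached_files, use_safetensors=None):
    # use_safetensors influences downloading only; it is NOT forwarded here,
    # so the loader's default of None applies regardless of the user's choice.
    return choose(cached_files)

def fixed_from_pretrained(cached_files, use_safetensors=None):
    # Propagating the flag makes use_safetensors=False take effect.
    return choose(cached_files, use_safetensors=use_safetensors)

cache = ["model.safetensors", "pytorch_model.bin"]
print(buggy_from_pretrained(cache, use_safetensors=False))  # model.safetensors
print(fixed_from_pretrained(cache, use_safetensors=False))  # pytorch_model.bin
```

The two print lines mirror the two log excerpts above: the buggy path loads model.safetensors, the emulated fix loads pytorch_model.bin.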

Logs

System Info

Version: 0.30.3

Who can help?

@sayakpaul @DN6

Labels

bug (Something isn't working)
