Inconsistent behavior of loading .safetensors models when setting use_safetensors=True #3355
Comments
Thanks, this was driving me crazy.
@sayakpaul do you maybe have time to look into this?
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Doesn't look like it's fixed.
I don't like the safetensors format. Most of these safetensors files don't load correctly in code, and you don't get to know the underlying format. It would have been nice if safetensors files were saved as model.pt.safetensors or model.ckpt.safetensors, but it seems you can have whatever format you want.
Gently pinging @patrickvonplaten
Think this is fixed with #3466 no? Can you double check?
I tried the current main branch version on GitHub. While I am encountering the same issue as in #3466 right now (diffusers trying to treat a local path as a URL), I read the code and think the inconsistent behavior regarding use_safetensors=True should have been fixed in the main branch. Again, I am unable to verify for sure, but it looks okay.
Okay. Closing this issue thread for now. Feel free to reopen. |
Describe the bug
The function StableDiffusionPipeline.from_ckpt is not consistent with the documentation w.r.t. loading .safetensors models. Specifically, calling this function to load a .safetensors model while also setting use_safetensors=True results in a ValueError saying safetensors is not installed, despite safetensors already being installed.
Related PR: #3333 discussed the safetensors documentation, but did not mention this bug, nor did it fix it.
Suspected Cause:
Line 1243 in src/diffusers/loaders.py gets the parameter:
If we set use_safetensors=True, then after this, use_safetensors will be True regardless of is_safetensors_available(). Then at line 1249 we call:
Thus, if we have a .safetensors file (which means from_safetensors == True), this throws a ValueError.
Reproduction
Gives:
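The suspected control flow above can be reduced to a small self-contained sketch. Everything below is hypothetical — illustrative names and logic standing in for the actual diffusers source, assuming the line-1243 default is along the lines of `None if is_safetensors_available() else False` and the line-1249 check fires whenever both flags are truthy:

```python
# Hypothetical reduction of the suspected bug; names and logic are
# illustrative, not the actual diffusers source.

def is_safetensors_available():
    # Assume the safetensors package IS installed.
    return True

def from_ckpt(path, **kwargs):
    # Analogue of the line-1243 parameter fetch: a user-supplied value
    # overrides the default, so availability is never consulted again
    # when use_safetensors=True is passed explicitly.
    use_safetensors = kwargs.pop(
        "use_safetensors", None if is_safetensors_available() else False
    )
    from_safetensors = path.endswith(".safetensors")
    # Analogue of the line-1249 check: it raises whenever both flags are
    # truthy, even though safetensors is installed.
    if from_safetensors and use_safetensors:
        raise ValueError("safetensors is not installed.")
    return "loaded"

# Omitting the flag works; passing use_safetensors=True raises -- the
# inconsistency this issue reports.
print(from_ckpt("model.safetensors"))  # loaded
try:
    from_ckpt("model.safetensors", use_safetensors=True)
except ValueError as e:
    print(e)  # safetensors is not installed.
```

Under these assumptions, the explicit True never reaches an availability check, so the error message is misleading: it reports a missing install when the install is present.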
Logs
No response
System Info
Colab, latest diffusers