#442: Failed to load `.safetensors` as state dict with error from `torch.frombuffer` in `safetensors.torch.load`
Comments
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

Still a problem.

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

Still a problem.
Ran into this issue as well. It seems that adding a check before the call to `torch.frombuffer` would fix it. Would the maintainers like a pull request to that effect? EDIT: Here's my patched version of that function (works for me, but not fully tested):
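The commenter's patched function itself is not reproduced above. As a hedged sketch only, the kind of check described might look like the following, assuming the failure comes from `torch.frombuffer` rejecting zero-length buffers (the helper name `frombuffer_or_empty` is hypothetical, not part of the safetensors API):

```python
import torch

def frombuffer_or_empty(buffer, dtype):
    """Hypothetical guard around torch.frombuffer.

    torch.frombuffer raises ValueError on a zero-length buffer, so
    return an empty tensor of the requested dtype in that case.
    """
    if len(buffer) == 0:
        return torch.empty(0, dtype=dtype)
    # Copy into a bytearray first: torch.frombuffer warns when given
    # a read-only buffer such as bytes.
    return torch.frombuffer(bytearray(buffer), dtype=dtype)
```

With this guard, `frombuffer_or_empty(b"", torch.float32)` yields an empty float tensor instead of raising.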
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

Bump

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

Bump
System Info
OS: Windows 10 64-bit
Python: 3.9.13
safetensors: 0.4.2
Information
Reproduction
Loading the attached `failed.safetensors` file with `safetensors.torch.load_file` directly works, but reading the file into a `bytes` object first and then loading it with `safetensors.torch.load` fails. I get the following error:
Same error with pytest formatting:
Steps to reproduce:

1. Download the attached `failed.safetensors` file.
2. Read `failed.safetensors` into a `bytes` object.
3. Load it with `safetensors.torch.load`.

I used the following script to get the above error:
Expected behavior
`safetensors.torch.load_file` and `safetensors.torch.load` should produce the same result and load the state dict correctly.