safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge #8

Open
sasso-effe opened this issue Aug 8, 2023 · 0 comments


@sasso-effe

I followed all the steps:

  • I trained pre-optimized LoRAs with train_preoptimized_liloras.py
  • Used them to train the hypernetwork with train_hyperdreambooth.py
  • Passed an input image to generate the weights with hypernetwork_gen_weigth.py

At this point, I would like to test the result by running inference_test.py, but I get this error:
Traceback (most recent call last):
  File "/home/wizard/bendai/research/dawnai/hyper_dreambooth/inference_test.py", line 19, in <module>
    pipe.unet.load_attn_procs(model_path)
  File "/home/wizard/mambaforge/envs/hyper/lib/python3.9/site-packages/diffusers/loaders.py", line 297, in load_attn_procs
    state_dict = safetensors.torch.load_file(model_file, device="cpu")
  File "/home/wizard/mambaforge/envs/hyper/lib/python3.9/site-packages/safetensors/torch.py", line 259, in load_file
    with safe_open(filename, framework="pt", device=device) as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge

Any idea? Thank you!
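In case it helps to narrow this down, here is a minimal check I can run on the generated file (the path below is only a placeholder for whatever hypernetwork_gen_weigth.py actually writes out). A valid safetensors file starts with an 8-byte little-endian integer giving the length of a JSON header; as far as I understand, HeaderTooLarge usually means those first bytes decode to an absurd length, i.e. the file is probably not in safetensors format at all (for example, it was saved with torch.save but given a .safetensors name).

import json
import struct

# Placeholder path: replace with the file produced by hypernetwork_gen_weigth.py.
model_file = "output/pytorch_lora_weights.safetensors"

with open(model_file, "rb") as f:
    # A safetensors file begins with an 8-byte little-endian unsigned
    # integer holding the length of the JSON header that follows.
    (header_len,) = struct.unpack("<Q", f.read(8))
    print("declared header length:", header_len)

    # If the file is really a pickled checkpoint or a zip archive, this
    # length is huge and the JSON parse below fails, which matches the
    # HeaderTooLarge error. Otherwise the header keys should print fine.
    header = json.loads(f.read(header_len))
    print("tensor keys:", list(header.keys())[:5])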
