Inference problem about loading LoRA weights #57
I have encountered the same problem.
I tried to load the LoRA weights this way. The weights can be loaded, but I trained SD 2.1 and it generates a noisy picture. #65 So I am not sure this is correct. You can try it. Welcome to discuss~
This works, but one caveat: with the current snippet (Lines 79 to 93 in a9ad795) you can save the trained LoRA weights, and then you should be able to load them directly with `pipe.load_lora_weights()`.
Hey~
Good job~ I have trained an SD LoRA on my custom dataset, but I have some problems with inference only.
With the state dict we saved by

```python
lora_state_dict = get_peft_model_state_dict(unet_, adapter_name="default")
StableDiffusionPipeline.save_lora_weights(os.path.join(output_dir, "unet_lora"), lora_state_dict)
```
the keys of the saved model are named like

```
base_model.model.mid_block.resnets.1.time_emb_proj.lora_B.weight
```
But I checked `pytorch_lora_weights.safetensors`, and its keys look like

```
lora_unet_up_blocks_2_attentions_0_proj_in.lora_up.weight
```

which can be loaded correctly by `pipe.load_lora_weights()`.
But the weights we saved cannot be loaded directly.
So the question is: how do we load the LoRA weights we save? Or should we convert the LoRA weights before saving?
Thanks~
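One way to approach the conversion question above is to rename the keys before saving. Below is a minimal, hedged sketch that maps PEFT-style keys to the dotted diffusers convention; the exact patterns (the `base_model.model.` prefix, `lora_A`/`lora_B` mapping to `lora.down`/`lora.up`, and the `unet.` namespace) are assumptions inferred from the key examples in this thread, not a verified diffusers API, so check them against your own state dict:

```python
def peft_to_diffusers_keys(state_dict):
    """Rename PEFT-style LoRA keys so pipe.load_lora_weights() can find them.

    The patterns below are assumptions based on the key examples in this
    thread; verify them against the keys your diffusers version expects.
    """
    converted = {}
    for key, value in state_dict.items():
        new_key = key
        # Drop the PEFT wrapper prefix seen in the saved keys above.
        prefix = "base_model.model."
        if new_key.startswith(prefix):
            new_key = new_key[len(prefix):]
        # Map PEFT's lora_A / lora_B to the diffusers lora.down / lora.up names.
        new_key = new_key.replace("lora_A.weight", "lora.down.weight")
        new_key = new_key.replace("lora_B.weight", "lora.up.weight")
        # The diffusers LoRA loader namespaces UNet keys under "unet."
        converted["unet." + new_key] = value
    return converted
```

For example, the key `base_model.model.mid_block.resnets.1.time_emb_proj.lora_B.weight` from the message above would become `unet.mid_block.resnets.1.time_emb_proj.lora.up.weight`.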