DreamBooths created with current version of Colab cannot be converted to LORAs in Kohya #248

Open
shadowlocked opened this issue Aug 5, 2023 · 0 comments
Labels
bug Something isn't working

Comments

@shadowlocked
Describe the bug

I have been converting archival DreamBooths to LoRA with the kohya_ss framework for a month, and the conversions have all gone well.

However, those were all DreamBooth models trained before this period. Having now trained a couple of new DreamBooth models on the Colab in the past week, I find that neither of them will convert. This is the error that results every time:

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ C:\Users\[USER]\Desktop\KOHYA2\kohya_ss\networks\extract_lora_from_models.py:189 in <module>
│
│    186 parser = setup_parser()
│    187
│    188 args = parser.parse_args()
│ ❱  189 svd(args)
│    190
│
│ C:\Users\[USER]\Desktop\KOHYA2\kohya_ss\networks\extract_lora_from_models.py:45 in svd
│
│     42 print(f"loading SD model : {args.model_org}")
│     43 text_encoder_o, _, unet_o = model_util.load_models_from_stable_diffusion_checkpoint(ar
│     44 print(f"loading SD model : {args.model_tuned}")
│ ❱   45 text_encoder_t, _, unet_t = model_util.load_models_from_stable_diffusion_checkpoint(ar
│     46
│     47 # create LoRA network to extract weights: Use dim (rank) as alpha
│     48 if args.conv_dim is None:
│
│ C:\Users\[USER]\Desktop\KOHYA2\kohya_ss\library\model_util.py:1059 in
│ load_models_from_stable_diffusion_checkpoint
│
│   1056             torch_dtype="float32",
│   1057         )
│   1058         text_model = CLIPTextModel._from_config(cfg)
│ ❱ 1059         info = text_model.load_state_dict(converted_text_encoder_checkpoint)
│   1060     print("loading text encoder:", info)
│   1061
│   1062     return text_model, vae, unet
│
│ C:\Users\[USER]\Desktop\KOHYA2\kohya_ss\venv\lib\site-packages\torch\nn\modules\module.py:1604
│ in load_state_dict
│
│   1601                         ', '.join('"{}"'.format(k) for k in missing_keys)))
│   1602
│   1603         if len(error_msgs) > 0:
│ ❱ 1604             raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
│   1605                         self.__class__.__name__, "\n\t".join(error_msgs)))
│   1606         return _IncompatibleKeys(missing_keys, unexpected_keys)
│   1607
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
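The traceback ends in `load_state_dict` raising a RuntimeError, which PyTorch does when the keys in the checkpoint's text-encoder state_dict do not match the keys the instantiated CLIPTextModel expects. As a diagnostic (not a fix), one could diff the two key sets to see exactly which parameters are missing or unexpected. The helper and the parameter names below are illustrative assumptions, not taken from a real checkpoint:

```python
# Hypothetical diagnostic: compare the keys a model expects against the keys
# actually present in a checkpoint, mirroring what load_state_dict reports.
def diff_state_dict_keys(expected_keys, checkpoint_keys):
    """Return (missing, unexpected) key sets, as load_state_dict would classify them."""
    expected = set(expected_keys)
    found = set(checkpoint_keys)
    missing = expected - found       # model wants these, checkpoint lacks them
    unexpected = found - expected    # checkpoint has these, model does not
    return missing, unexpected

# Made-up keys resembling CLIP text-encoder parameter names:
expected = {
    "text_model.embeddings.token_embedding.weight",
    "text_model.encoder.layers.0.self_attn.q_proj.weight",
}
checkpoint = {
    "text_model.embeddings.token_embedding.weight",
    "text_model.embeddings.position_ids",  # extra buffer some exports include
}
missing, unexpected = diff_state_dict_keys(expected, checkpoint)
print("missing:", sorted(missing))
print("unexpected:", sorted(unexpected))
```

In practice the expected keys would come from `CLIPTextModel._from_config(cfg).state_dict().keys()` and the checkpoint keys from the converted text-encoder dict; a non-empty `unexpected` set often points at a checkpoint saved by a newer library version than the converter expects.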

If this can't be fixed, could I instead get access to an earlier version of the Colab?

Reproduction

N/A

Logs

No response

System Info

Colab standard

@shadowlocked shadowlocked added the bug Something isn't working label Aug 5, 2023