[Bug]: LoRAs won't work #10036
Comments
+1 |
+1 |
With further probing I found out that only certain older LoRAs no longer work. Newer ones still work. |
Wrong. I've been trying to load a LoRA I made myself, with no success. Another LoRA made by me loads with no problem. |
I made a LoRA today, the same way I did in the past, with the latest version of stable-diffusion-webui. It no longer works, throwing these errors. |
Might be due to some LoRAs using … (see Kohya LoRA for …). For now, this change seems to fix the issue, for txt2img at least:
I'm too lazy to open a PR for this, but maybe someone with more motivation will see this. |
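The change itself was lost in the formatting above, but the tracebacks later in this thread all point at the unguarded `module.weight.copy_(weight)` call in `lora.py`'s `load_lora`. A minimal sketch of the kind of shape guard being described, assuming the mismatch is only trailing singleton dimensions (the helper name `copy_lora_weight` is mine, not from the actual patch):

```python
import torch

def copy_lora_weight(module_weight: torch.Tensor, weight: torch.Tensor) -> None:
    # Some extracted LoRAs store a [out, in] linear weight as [out, in, 1, 1].
    # An unguarded copy_() then broadcasts the two shapes together and raises
    # a shape-mismatch RuntimeError, so reshape to the module's shape first.
    if weight.shape != module_weight.shape:
        weight = weight.reshape(module_weight.shape)
    with torch.no_grad():
        module_weight.copy_(weight)
```

Note this only helps when the element counts match (e.g. `[A, B, 1, 1]` onto `[A, B]`); a genuinely convolutional `[A, B, 3, 3]` weight cannot be reshaped onto a linear module this way.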
Confirmed this works, thanks! |
related #9979 |
For me it did not work; I installed the a1111-sd-webui-locon extension and that did the trick. |
Can someone post a working lora.py with all those corrections? I tried to apply them myself and I only get an error and the LoRAs won't load. :/ |
Working for me, thanks |
I made the changes, but it doesn't work for me. |
Doesn't work for me either. Getting this error on LoRAs that were working before: module.weight.copy_(weight) |
Works for me! lora.py:237 (the change replaces the code there; it is not inserted at line 237). |
I think we are running into a different issue; their broadcast error is different. Ours is [A,B] does not match [A,B,A,B]; theirs is [A,B,1,1] does not match [A,B,3,3]. I can confirm you can fix it with the https://github.com/KohakuBlueleaf/a1111-sd-webui-lycoris extension, but this is an ugly fix. |
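The [A,B] vs. [A,B,A,B] message can be reproduced in plain PyTorch: broadcasting a source stored as [A, B, 1, 1] against a [A, B] target yields the [A, B, A, B] shape seen in the error. A small sketch (the shapes here are illustrative, not taken from the actual LoRA files):

```python
import torch

target = torch.zeros(2, 3)       # module weight, shape [A, B]
src = torch.zeros(2, 3, 1, 1)    # LoRA weight stored with trailing 1x1 dims

try:
    # copy_() broadcasts src against target: [2, 3] and [2, 3, 1, 1]
    # broadcast to [2, 3, 2, 3], which no longer matches the target,
    # so PyTorch raises a RuntimeError.
    target.copy_(src)
except RuntimeError as e:
    print(e)
```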
I tried that and still no luck. I'm still getting an error:
Traceback (most recent call last):
  File "C:\ai\stable-diffusion-21\extensions-builtin\Lora\lora.py", line 253, in load_loras
    lora = load_lora(name, lora_on_disk)
  File "C:\ai\stable-diffusion-21\extensions-builtin\Lora\lora.py", line 211, in load_lora
    module.weight.copy_(weight)
RuntimeError: output with shape [128, 320] doesn't match the broadcast shape [128, 320, 128, 320] |
I finally gave up and now use the "Additional Networks" extension, which loads the LoRAs with no problem. It's an AUTOMATIC1111 issue that remains unsolved. |
Can anyone solve this problem? |
How? I installed the extension, but it still doesn't work. |
You need to use the extension to load the LoRAs; loading them from the prompt still doesn't work. |
Facing the same problem. My version is 1.5.1 (68f336b). |
I can't reproduce this. There's mention that this only happens for LoRAs extracted from checkpoints while running Torch 2.0, but those are also working for me, and I'm running Torch 2.0. #10089 was also supposed to fix this in the first place, and that's been implemented since 1.2.0. Could someone here experiencing this provide a link to a LoRA that causes this error? Also, if you have any information at all about when this did work, if at all, a git bisect could be done if needed. The vladmandic/automatic fork has a few similar issues that were filed, but the cause ended up being 1) users not using the locon/lycoris extension (which is no longer needed, as these extra networks are implemented natively now), 2) LoRAs made for SD2.1 (which the webui previously didn't support, I believe, but very much should now), or 3) extension incompatibilities (composable lora in particular; all extra extensions should be disabled to verify whether a LoRA is working or not). vladmandic/automatic#120 |
For me, @thojmr's fix worked when I encountered this problem with these two LoRAs: https://mega.nz/file/ReBhBSLD#rN-ZtKgaRTi5brxxW2MrHHgcDRYu_fEfcUL9pvaAGT8 They worked normally in the Additional Networks tab. However, after a recent LoRA update I don't need those fixes anymore, and the LoRAs I linked work out of the box. |
Yeah, those were accounted for months ago by #10089. I guess I'm more so referring to the error mentioned here: #10036 (comment)
It's a different broadcast shape than the one the PR accounts for. |
I am now using the LyCORIS output channel for all my LoRAs; it seems to work. |
Okay, since the few extra comments here back up what I initially mentioned in my comment (any further issues are likely user error), I'm going to go ahead and close this. If this issue persists, open a new issue with additional details and a link to the LoRA model that does not work. |
Is there an existing issue for this?
What happened?
I get this error when I use a LoRA, and LoRAs are not applied to the prompt
Steps to reproduce the problem
Use any LoRA
What should have happened?
The LoRA should be applied
Commit where the problem happens
5ab7f21
What platforms do you use to access the UI ?
Windows
What browsers do you use to access the UI ?
Mozilla Firefox
Command Line Arguments
List of extensions
Console logs
Additional information
No response