
Apple silicon: black output randomly happens #2446

Closed
A2Sumie opened this issue Oct 13, 2022 · 15 comments
Labels: bug-report (Report of a bug, yet to be confirmed)

Comments

A2Sumie commented Oct 13, 2022

Describe the bug

I'm using an Apple silicon device (MacBook Pro 16") and black outputs appear from time to time.
I tried the Euler a and DPM fast samplers; the failure happens randomly.
There are no error outputs in the console.

To Reproduce
Steps to reproduce the behavior:

  1. Start a conversion batch.
  2. Wait for completion.
  3. Some of the outputs are black images.

Expected behavior
All outputs are properly processed.

Screenshots
Running another batch; will attach when it occurs.
[Attached image: 00036-724949706-]

Desktop (please complete the following information):

  • OS: macOS 12
  • Browser: Edge for Mac (the image in the output folder is also black)
  • Commit revision: 698d303


A2Sumie added the bug-report label on Oct 13, 2022
A2Sumie changed the title from "Apple silicon: black output randomly happens at end of batch" to "Apple silicon: black output randomly happens" on Oct 13, 2022
@ClashSAN
Collaborator

Try --precision full --no-half; worth a shot.
https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Troubleshooting

@Karsten385

I run the command with --precision full --no-half --opt-split-attention-v1 --disable-safe-unpickle and I still get the black ones on my Mac as well, @ClashSAN. Generally it's about a third of the batch that turns out black.

ClashSAN (Collaborator) commented Oct 13, 2022

Was this always an issue? Can you revert to an older commit?

A2Sumie (Author) commented Oct 13, 2022

try --precision full --no-half, worth a shot https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/Troubleshooting

The default line in the script is:
python webui.py --precision full --no-half --opt-split-attention-v1
I haven't tried --disable-safe-unpickle yet.

ClashSAN (Collaborator) commented Oct 13, 2022

Try removing --opt-split-attention-v1; the newer opt-split-attention implementation will be on by default.
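
With that flag removed, the default line quoted above would presumably become just:

python webui.py --precision full --no-half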

A2Sumie (Author) commented Oct 13, 2022

try removing the --opt-split-attention-v1, the newer opt-split-attention implementation will be on by default.

Looks promising in the new batch, still running.

ClashSAN (Collaborator) commented Oct 13, 2022

dylancl@01b071c
Looks like they updated the script and the default parameters have changed, so make sure both the script and the repo are updated.
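
For the repo part, updating is typically just a git pull from inside the webui folder (assuming the default clone directory name); the script linked above is maintained separately and would need to be re-downloaded on its own:

cd stable-diffusion-webui && git pull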

The invoke commit: #2234

If the issue is solved, say so; you can close it, @A2Sumie.

A2Sumie closed this as completed on Oct 13, 2022
A2Sumie (Author) commented Oct 13, 2022

Not solved.
I'm not sure, but I was on AC power when the outputs became stable.
Now I'm on battery and the black outputs have returned. Not sure if this is the cause.

A2Sumie reopened this on Oct 13, 2022
TaciteOFF commented Oct 13, 2022

I've also had this issue since the evening of October 11 (around 7 PM UTC+2).
I used to be able to generate a batch of 4 images in one go; now if I do that I get black output every time.
Everything seems slower too.

A2Sumie (Author) commented Oct 13, 2022

I also have this issue since October 11 evening (around 7 PM UTC+2) I used to be able to generate a batch of 4 images in one go, now if I do that I guess black output every time. Everything seems slower too.

I run the command --precision full --no-half --opt-split-attention-v1 --disable-safe-unpickle and I still get the black ones on my Mac as well, @ClashSAN . Generally it's like a third of the batch that turns out black.

I've found something weird.
Setting the fans to full speed significantly increases the success rate.
Using battery power or putting my Mac on my bed gives me bunches of black outputs.
Can you check this out?

Kalekki (Contributor) commented Oct 13, 2022

If you have modelname.vae.pt in the models directory, adding --no-half-vae could fix the issue.
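
For example, combined with the flags from the default launch line above, that would presumably look something like:

python webui.py --precision full --no-half --no-half-vae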

A2Sumie (Author) commented Oct 13, 2022

If you have modelname.vae.pt in the models directory, adding --no-half-vae could fix the issue

I have this added but it does not help. :(

@Karsten385

I've found something weird. Toggling fans to full speed would significantly increase the rate of success. Using battery power or throwing my Mac on to my bed gives bunches of black outputs. Can you check this out?

You actually might be onto something here. I put my Mac standing up on its short side so the fans had plenty of room to breathe, and it put out a bunch of images correctly in a row. Weird.

remixer-dec (Contributor) commented Oct 15, 2022

Not sure if there is an issue for consistent black image outputs on M1 or if it is the same problem, but for me only a few samplers work correctly. Important note: they all work with 1 sampling step (or is that because no sampling is applied on step 1?), but some of them output black images for 2+ steps. I marked the ones that do work for me with ✅.

[text-to-image]
Euler A ❌
Euler ✅
LMS ❌
Heun ❌
DPM2 ✅
DPM2 a ❌
DPM fast ✅
DPM adaptive ✅
LMS Karras ❌
DPM2 Karras ✅
DPM2 a Karras ❌
DDIM ✅
PLMS ✅

[image-to-image]
All the samplers work correctly ✅

UPDATE: the method below fixes my issue with samplers, but does NOT fix random black image outputs.
In the file repositories/k-diffusion/k_diffusion/sampling.py, after def to_d(x, sigma, denoised): add sigma = sigma.to('cpu').to('mps'). For some reason, when the sigma value is close to 0 and is not located on the CPU, it shows up as 0; moving it to the CPU fixes it for me. Maybe it is only a problem for a specific nightly torch release / macOS version.

Also, the issue discussed in this topic only appears for a single sampling step; if you plot every step, it recovers in the next step.
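
For context, a minimal sketch of what the patched function could look like (the surrounding body is roughly what to_d looks like in k-diffusion; the MPS device check is an extra precaution and not part of the original suggestion, which adds the line unconditionally):

# repositories/k-diffusion/k_diffusion/sampling.py
from . import utils  # already imported at the top of sampling.py

def to_d(x, sigma, denoised):
    """Converts a denoiser output to a Karras ODE derivative."""
    # Workaround: round-trip sigma through the CPU so that near-zero values
    # are not read back as exactly 0 on the MPS backend.
    if sigma.device.type == 'mps':
        sigma = sigma.to('cpu').to('mps')
    return (x - denoised) / utils.append_dims(sigma, x.ndim)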

brycedrennan added a commit to brycedrennan/imaginAIry that referenced this issue Nov 13, 2022
MrPalais commented Dec 7, 2022

(quoting remixer-dec's comment above about the sampling.py sigma fix)

Thanks a lot, the fix works with the latest version on a Mac Studio 2022, so I can use "Euler A".

mattstern31 added a commit to mattstern31/imagin-AIry-Python that referenced this issue Nov 11, 2023