Inpainting colors desaturated #6074
My inpainting doesn't even work: when I try to correct eyes, the result is only black color painted over the eyes.
Is it still washed out if you change the VAE used? Mine always has those kinds of results if I'm using a VAE meant for anime generations (like they put a greyish layer on top of it).
Yeah, changing the VAE didn't seem to make a difference. It is definitely like a "greyish layer on top" though. I'll run some more tests soon, see if I can identify the culprit and post some example pics.
Oh! I figured it out! I made a customization in processing.py that I neglected to include in my PR #5644. I don't understand the inpainting model well enough to explain why my one-line version works better, but look at the difference it makes. At least for faceswaps, I don't notice any side effects with my version, but I'm hesitant to submit another PR without understanding the reason for the washed-out look. Thoughts?
Well, subjectively speaking, given the example above, I like your version better. My observation so far is that SD treats faces differently than any other part of a picture, with way too much emphasis on human faces in general. Perhaps you could give another example that does not include a face if you're trying to raise a new PR without proper reasoning behind it, IMHO.
Sure, here's one more: swap to "tomato" at 0.75 conditioning mask strength. I don't know if the current implementation is bugged, but I have never been able to get it to produce a pleasing result when the conditioning mask strength is > 0 and < 0.95. Not only is it washed out, it also doesn't seem to morph the shape of the image at high values nearly as much as one might expect.
I don't really know how the results are meant or supposed to look, TBH; I'm the kind of guy who just adapts to what I have. Using your source image, I can confirm that it is washed out and does not even try to morph to what's in the prompt if we are using the 'original' masked content. AFAIK, not changing the subject of the masked content is what 'original' is supposed to do, but TBH I don't know for sure. Regarding the washed-out look: yes, I can confirm that. My results: EDIT: It is even more DESATURATED in my examples. Different configs somewhere, perhaps?
@EllangoK |
Excuse me, sorry, I don't understand where the command-line arguments "--disable-safe-unpickle --xformers" should be written (though I tried other combinations of arguments).
Dunno if this is the same issue, but some models (possibly VAE related?) will give scuffed colors when you inpaint, even though in the preview just before generation completes it seems fine. Does anyone know what I could do to fix this? It makes editing really hard. Update: it was actually somehow the default orangemix VAE not being loaded in img2img. I forced it on in settings, set --no-half-vae (it wouldn't work otherwise), and the results are way better now.
Related/dup issues: #2754, #5557. I do think this is VAE related, but it would be appreciated if somebody more knowledgeable could give feedback and/or figure out what exactly is happening here and how to fix it before closing this. The VAE of choice: It should be mentioned as well that there are only a few known VAEs for SD1/2 models that are not simply merges or encoder-layer fixes, the latter of which is actually a bad idea to even use, as it harms hires fix results. sALTaccount/VAE-BlessUp#1 (comment) (tl;dr: loss of detail and color range). This is what should be worked off of for resolving this. Any other VAE is likely tampered with or is a merge that is not really representative of its original training.
For what it's worth, I only ever encountered issues with one model/VAE combination (anything that the orangemix VAE is a rename or merge of), so it's probably not a widespread issue, which I guess is why there have barely been any people talking about it in the past 6 months.
I'm definitely running into this very commonly with multiple models. The weird thing is that it seems to get notably worse over time. Inpainting right after starting Stable Diffusion, it's barely noticeable with color correction enabled unless you really look; after hours of inpainting on multiple images/models it's really noticeable. Restarting the process cleans it up again for a while.
After a couple of quick tests I did notice that running img2img at 0 denoise strength (basically testing whether encoding and then decoding the latent produces the same result, which it should) only produces desaturated colors with the NovelAI VAE. No other VAEs (of the ones I listed above, sans SD1.0) cause this issue. This lines up with what @Enferlain experienced too.
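The 0-denoise roundtrip test described above can be sketched in code. This is a toy illustration, not the webui's actual pipeline: `encode`/`decode` are hypothetical identity stand-ins for a real VAE, and `mean_saturation` is just one simple way to quantify the "washed out" effect.

```python
import colorsys

def mean_saturation(pixels):
    """Average HSV saturation over a list of (r, g, b) tuples in [0, 1]."""
    sats = [colorsys.rgb_to_hsv(r, g, b)[1] for r, g, b in pixels]
    return sum(sats) / len(sats)

# Hypothetical stand-ins for a real VAE's encode/decode. A faithful VAE
# should give decode(encode(x)) ~= x, so img2img at 0 denoise strength
# should not change the image's saturation.
def encode(pixels):
    return pixels

def decode(latent):
    return latent

original = [(0.9, 0.1, 0.1), (0.1, 0.8, 0.2), (0.2, 0.3, 0.9)]
roundtrip = decode(encode(original))

# A desaturating VAE would show drift > 0 here; a healthy one, ~0.
drift = mean_saturation(original) - mean_saturation(roundtrip)
print(f"saturation drift: {drift:.4f}")
```

With a real model, the same comparison would be run by measuring an image before and after an actual 0-denoise img2img pass.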
@catboxanon Does it still desaturate if you use a NovelAI-based model but force a different VAE globally via the settings menu? That might be a half-decent workaround until the root cause can be determined. Looking at the code, I had a hunch that maybe line 194 was causing some kind of rounding error specific to the affected VAE, but even running your edited version without that line doesn't seem to make any difference for me: it still desaturates as badly as the original code.
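To illustrate why a rounding step on a mask could plausibly matter (a toy sketch only; whether the actual line 194 behaves this way is exactly what is in question here):

```python
# A blurred/soft mask has fractional values along its edges.
soft_mask = [0.0, 0.2, 0.6, 0.9, 1.0]

# Rounding snaps every value to 0 or 1, turning the smooth falloff into a
# hard boundary and discarding all partial-mask information.
hard_mask = [round(v) for v in soft_mask]
print(hard_mask)  # [0, 0, 1, 1, 1]
```

Removing such a step would preserve the gradient, which is why it seemed like a candidate for the hunch above; the test reported here suggests it is not the culprit.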
It doesn't. This is basically what I already pointed out though.
If you're referring to the code mentioned in #6074 (comment), that should only apply to inpainting models. That might be a separate issue altogether. |
I believe the NovelAI VAE also never produces NaNs, but only if you use their leaked checkpoint and VAE both at full precision (fp32). The leaked code indicates this is how they run their models; they don't run at half precision anywhere in their pipeline.
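The fp32-vs-fp16 point can be made concrete with the standard library (a generic illustration of half-precision limits, not NovelAI's actual code):

```python
import math
import struct

def to_fp16_and_back(x):
    """Round-trip a Python float through IEEE 754 half precision."""
    return struct.unpack("<e", struct.pack("<e", x))[0]

# fp16 keeps only ~3 significant decimal digits, so small activation
# differences are rounded away entirely.
print(to_fp16_and_back(1.0001))  # 1.0

# fp16's largest finite value is 65504; anything bigger cannot be
# represented (struct refuses to pack it), and in real fp16 arithmetic it
# overflows to inf, after which inf - inf in a later layer yields NaN.
try:
    struct.pack("<e", 70000.0)
except OverflowError:
    print("70000.0 does not fit in fp16")

print(math.inf - math.inf)  # nan
```

Running the VAE at fp32 (as --no-half-vae does) keeps activations well inside the representable range, which is consistent with the fp32-only behavior described above.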
I was facing the same issue of "washed-out" colors after any kind of inpainting or img2img. After some tests I realized I was not using the VAE recommended by the model, and using the proper VAE solved all the issues. So, I'd recommend checking that before going into half precision or using color correction. |
This worked. |
@ThereforeGames does updating adetailer (#308) to the latest version solve your issue?
Is there an existing issue for this?
What happened?
Hi,
It seems that img2img inpainting produces very washed out results, regardless of color correction mode. I'll list what I think are the relevant settings below.
I may update this post with example images later, please let me know if that would be helpful.
Steps to reproduce the problem
All other settings, including the sampler, seem to be irrelevant here.
What should have happened?
The contrast of the original image should be generally preserved. I reverted to a commit that's ~4 days old (don't know the hash, sorry) and it was working fine then.
Commit where the problem happens
4af3ca5
What platforms do you use to access UI ?
Windows
What browsers do you use to access the UI ?
Brave
Command Line Arguments
Additional information, context and logs
N/A