missing options in Dreambooth training #2542
It seems the entire "SDXL Specific Parameters" group is missing in the master branch (and release) versions. I even searched the source code for "fused backward pass" and got nothing. When I switched to dev, those options do show up.
There are many fixes in dev. I am waiting for Kohya to merge dev to main to promote the dev branch to master. In the meantime you can use the dev branch, as it should be more stable than master at this point.
@bmaltais I am on dev, and I also don't see the option to checkbox SDXL. So it seems there can be no SDXL training at all for the moment; basically locked out of training, unless I am missing something.
It will only show up if you select a custom model instead of the default huggingface ones.
Ok. That might be it, though I loaded my base model from /workspace on runpod. Confirming this was my error.
@b-fission Do the TE1 and TE2 learning rates apply only to Dreambooth? I wanted to try them in the Lora module.
Yes, the TE1 and TE2 learning rates are for Dreambooth; Lora training only has a single TE learning rate option in the GUI. But I wonder if TE1 and TE2 rates can be specified manually under "Additional parameters" for Lora, if one were inclined to try.
I will try it |
That is assuming you want to use a different learning rate for TE1 and TE2. The args could be entered into Additional parameters as |
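The actual snippet that followed was cut off in this capture. As a hedged sketch only: recent sd-scripts SDXL trainers accept per-text-encoder learning-rate flags, so the "Additional parameters" entry might plausibly look like the following (flag names assumed from sd-scripts' sdxl_train.py and not confirmed in this thread; the values are placeholders, so verify against your installed version):

```
--learning_rate_te1 5e-6 --learning_rate_te2 0
```

Setting TE2's rate to 0 would, if supported, effectively stop that encoder from training, which matches the idea of turning TE2 off floated later in the thread.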
I was thinking of turning TE2 off. By the way, for subjects, what worked better for you, the Dreambooth or Lora implementation?
I think Dreambooth tends to reach the intended results I want with less tweaking compared to Lora. Afterwards, I can use the Extract Lora utility to generate a Lora from the generated Dreambooth checkpoint, so I can have the flexibility of Lora and save disk space.
I've used both, but I'm testing some other optimisers like Prodigy and Adafactor besides AdamW8bit.
For me, AdamW8bit has worked well enough that I haven't put much time into trying other optimizers. I've played with Prodigy a few times with Loras and got 'interesting' results, but it sometimes learned too fast and overtrained; that probably needs extra setup on my end. I haven't tried the other ones, though. From what little testing I did, Lion8bit and Adafactor are interesting for having lower VRAM requirements, which makes them viable for training SDXL on mid-range hardware.
Thanks for the insights; that's similar to my understanding. I'll ditch Adafactor. I'm interested mainly in AdamW8bit vs Prodigy, and in testing TE1/TE2, or TrainTE for some percentage of steps, though I'm not sure that still works in Dreambooth.
The option for "Stop TE (% of total steps)" only seems to be relevant in Dreambooth for SD 1.x or 2.0, but not SDXL. |
Unfortunate |
Two things on Dreambooth training for SDXL
--train_text_encoder
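The comment above appears truncated, but it names --train_text_encoder, the sd-scripts flag that enables text-encoder training during Dreambooth-style runs. As a hedged sketch of how it might be passed (the script name, model path, and trailing arguments are illustrative placeholders, not taken from this thread):

```
accelerate launch sdxl_train.py --train_text_encoder --pretrained_model_name_or_path model.safetensors ...
```

In the GUI context discussed here, the equivalent would presumably be adding the flag under "Additional parameters".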