Fix Flux2 DreamBooth prior preservation prompt repeats#13415

Merged
sayakpaul merged 2 commits into huggingface:main from azolotenkov:fix-flux2-prior-preservation-repeat on Apr 15, 2026

Conversation

@azolotenkov
Contributor

What does this PR do?

Fixes a prior-preservation batch size mismatch in the Flux2 DreamBooth LoRA scripts.

When custom instance prompts are not used, prompt_embeds already contains the concatenated instance + class embeddings under --with_prior_preservation. However, collate_fn also doubles the prompt list, so repeating the text embeddings by len(prompts) over-expands them by 2x and they no longer match the latent batch size.
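A minimal sketch of the mismatch and the fix (shapes and variable names are illustrative stand-ins, not the scripts' exact code):

```python
import torch

bsz = 2            # --train_batch_size (illustrative)
seq, dim = 8, 16   # tiny stand-ins for the text-encoder output shape

# With static prompts under --with_prior_preservation, the instance and
# class prompts are each encoded once up front:
prompt_embeds = torch.randn(2, seq, dim)  # row 0: instance, row 1: class

# collate_fn doubles the prompt list, so len(prompts) == 2 * bsz:
prompts = ["instance prompt"] * bsz + ["class prompt"] * bsz

# Buggy repeat count: len(prompts) over-expands the embeddings by 2x.
buggy = prompt_embeds.repeat_interleave(len(prompts), dim=0)
assert buggy.shape[0] == 4 * bsz  # does not match the 2*bsz latent batch

# Fixed repeat count: len(prompts) // 2 matches the latent batch, and
# repeat_interleave keeps instance rows before class rows, matching the
# [instance batch, class batch] latent layout.
fixed = prompt_embeds.repeat_interleave(len(prompts) // 2, dim=0)
assert fixed.shape[0] == 2 * bsz
```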

This applies the same fix pattern as #13396 to:

  • train_dreambooth_lora_flux2.py
  • train_dreambooth_lora_flux2_klein.py

Who can review?

@sayakpaul

Copilot AI review requested due to automatic review settings April 4, 2026 18:08

Copilot AI left a comment


Pull request overview

Fixes a prior-preservation batch size mismatch in the Flux2 DreamBooth LoRA training example scripts by adjusting how many times static text embeddings are repeated when --with_prior_preservation is enabled.

Changes:

  • Adjust num_repeat_elements to use len(prompts) // 2 under --with_prior_preservation (since collate_fn doubles prompts).
  • Apply the same fix to both Flux2 DreamBooth LoRA variants (standard + klein).

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.

| File | Description |
| --- | --- |
| examples/dreambooth/train_dreambooth_lora_flux2.py | Fixes prompt-embed repeat count for prior preservation when using static (non-custom) prompts. |
| examples/dreambooth/train_dreambooth_lora_flux2_klein.py | Same repeat-count fix in the klein variant. |


Comment thread examples/dreambooth/train_dreambooth_lora_flux2.py Outdated
Comment thread examples/dreambooth/train_dreambooth_lora_flux2_klein.py Outdated
@azolotenkov azolotenkov force-pushed the fix-flux2-prior-preservation-repeat branch from 27c01ea to b6cb7b1 on April 4, 2026 18:23
@azolotenkov azolotenkov force-pushed the fix-flux2-prior-preservation-repeat branch from b6cb7b1 to 3a00206 on April 5, 2026 11:52
@azolotenkov
Contributor Author

Updated again.

The previous repeat_interleave(..., dim=0) change fixed the prompt/text ordering issue for --with_prior_preservation with train_batch_size > 1, but while testing it locally I hit another mismatch in the same code path: weighting was still left at full batch size after model_pred and target were chunked into instance/prior halves.

This update chunks weighting alongside model_pred and target in both Flux2 scripts.
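A sketch of the chunking described above (tensor shapes and names are illustrative, modeled on the typical DreamBooth prior-preservation loss layout rather than the scripts' exact code):

```python
import torch

bsz = 2            # --train_batch_size
total = 2 * bsz    # instance + class halves under --with_prior_preservation
c, h, w = 4, 8, 8  # illustrative latent shape

model_pred = torch.randn(total, c, h, w)
target = torch.randn(total, c, h, w)
weighting = torch.rand(total, 1, 1, 1)  # per-sample loss weighting

# Chunk all three tensors into instance/prior halves together, so the
# per-sample weighting stays aligned with each half instead of being
# left at the full batch size.
model_pred, model_pred_prior = torch.chunk(model_pred, 2, dim=0)
target, target_prior = torch.chunk(target, 2, dim=0)
weighting, weighting_prior = torch.chunk(weighting, 2, dim=0)

# Each half now has the same batch dimension, so broadcasting works.
assert model_pred.shape[0] == target.shape[0] == weighting.shape[0] == bsz

loss = (weighting * (model_pred - target) ** 2).mean()
prior_loss = (weighting_prior * (model_pred_prior - target_prior) ** 2).mean()
```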

Validation used:

  • hf-internal-testing/tiny-flux2-klein
  • --with_prior_preservation
  • --train_batch_size 2

That setting now completes successfully for klein, and it exercises exactly the batch-layout path discussed in Copilot review.

@azolotenkov
Contributor Author

@sayakpaul, this PR applies the same fix pattern that was just merged for the Qwen Image DreamBooth scripts in #13441.

@github-actions github-actions bot added the `examples` and `size/S` (PR with diff < 50 LOC) labels on Apr 15, 2026
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@sayakpaul sayakpaul merged commit d308316 into huggingface:main Apr 15, 2026
27 of 29 checks passed
@sayakpaul
Member

Failing tests are unrelated.

@azolotenkov azolotenkov deleted the fix-flux2-prior-preservation-repeat branch April 15, 2026 09:31