
Distillation optimizer fix #3446

Merged
copybara-service[bot] merged 1 commit into main from vladk/distill-opt-fix on Mar 18, 2026
Conversation

@vlad-karp
Collaborator

Description

With the introduction of ModelBundle in the distillation pipeline, the base Tunix PEFT trainer allocates optimizer state for both models in the bundle.

Since the optimizer logic is completely revised in the MaxTextDistillationTrainer anyway, the fix is simply to pass a dummy optimizer to the base class.
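
Because the base trainer's optimizer is never actually used, any state-free optax transformation works as the dummy. Below is a minimal sketch of the idea; the `PeftTrainer` wiring in the final comment is illustrative, not the exact Tunix/MaxText API:

```python
import jax.numpy as jnp
import optax

# optax.set_to_zero() keeps no per-parameter state and maps every update
# to zero, so the base trainer's optimizer step becomes a no-op and no
# optimizer state is allocated for either model in the bundle.
dummy_optimizer = optax.set_to_zero()

params = {"w": jnp.ones((2, 2))}
opt_state = dummy_optimizer.init(params)  # empty state, nothing allocated
grads = {"w": jnp.full((2, 2), 0.5)}
updates, opt_state = dummy_optimizer.update(grads, opt_state, params)
# `updates` is all zeros, so applying it leaves the params unchanged.

# Hypothetical wiring (names illustrative only):
# trainer = PeftTrainer(model_bundle, optimizer=dummy_optimizer)
```

The MaxTextDistillationTrainer then constructs and manages the real optimizer for the student model itself.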

Tests

Not applicable

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

@codecov

codecov Bot commented Mar 18, 2026

Codecov Report

❌ Patch coverage is 0% with 2 lines in your changes missing coverage. Please review.

Files with missing lines                                  Patch %   Missing Lines
.../trainers/post_train/distillation/train_distill.py    0.00%     2 Missing ⚠️


@vlad-karp force-pushed the vladk/distill-opt-fix branch from 23874df to 22de090 on March 18, 2026 20:39
copybara-service[bot] merged commit 28d8fce into main on Mar 18, 2026
30 of 31 checks passed
copybara-service[bot] deleted the vladk/distill-opt-fix branch on March 18, 2026 23:37
