
Refactor model loading in distillation to remove Tunix adapter. #2927

Merged

copybara-service[bot] merged 1 commit into main from agagik-distill-2 on Jan 14, 2026

Conversation

@gagika (Collaborator) commented on Jan 10, 2026

Description

Refactors the distillation trainer to remove the TunixMaxTextAdapter dependency, enabling direct control over model execution and configuration.

Key Changes:

  • Direct Model Loading: Replaces the Tunix adapter with standard MaxText nnx.Module loading via get_maxtext_model.
  • Config-Bound Forward Pass: Introduces create_forward_fn to generate distinct forward functions for the Student and the Teacher. This correctly binds enable_dropout from the config, allowing the Student to train with dropout while keeping the Teacher deterministic (see the sketch after this list).
  • Strict Configuration: Removes environment variable fallbacks; teacher_overrides.load_parameters_path is now mandatory to ensure reproducibility.
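
As a rough illustration of the config-bound forward pass, a factory along these lines would do the job. This is a minimal sketch under assumed signatures: the real create_forward_fn and the model call signature in train_distill.py may differ.

```python
# Illustrative sketch only: signatures and field names are assumptions,
# not the actual MaxText API.
def create_forward_fn(config):
  """Return a forward function with enable_dropout bound from `config`."""
  enable_dropout = config.enable_dropout  # True for Student, False for Teacher

  def forward_fn(model, input_tokens, input_positions, rngs=None):
    # enable_dropout is closed over, so the Teacher can never accidentally
    # run with dropout and the Student always trains with it.
    return model(input_tokens, input_positions,
                 enable_dropout=enable_dropout, rngs=rngs)

  return forward_fn


# Usage sketch: one forward function per role.
# student_fwd = create_forward_fn(student_config)   # enable_dropout=True
# teacher_fwd = create_forward_fn(teacher_config)   # enable_dropout=False
```

Binding the flag at creation time, rather than at each call site, means no caller has to remember which role it is invoking.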

Tests

Verified locally with the following command:

python3 -m src.MaxText.distillation.train_distill src/MaxText/configs/distillation.yml \
  run_name=distill_llama3 \
  base_output_directory=${BASE_OUTPUT_DIRECTORY} \
  checkpoint_period=2000 \
  hf_access_token=$HF_TOKEN \
  log_period=10 \
  save_checkpoint_on_completion=True \
  teacher_overrides.load_parameters_path="gs://agagik-us/llama3/llama3.1-8b/scanned_chkpt/0/items"

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

@codecov bot commented on Jan 10, 2026

Codecov Report

❌ Patch coverage is 0% with 12 lines in your changes missing coverage. Please review.

Files with missing lines                     Patch %   Lines
src/MaxText/distillation/train_distill.py   0.00%     12 Missing ⚠️


@richjames0 (Collaborator) left a comment


lgtm

- # Ensure load_parameters_path is set (check overrides, then env var)
+ # Ensure load_parameters_path is set in overrides
  if not teacher_overrides.get("load_parameters_path"):
-   ckpt_path = os.environ.get("TEACHER_CHECKPOINT_PATH")
@richjames0 (Collaborator) asked:

Out of curiosity, why do we remove this fallback?

@gagika (Collaborator, Author) replied:

I was testing with the TEACHER_CHECKPOINT_PATH environment variable for the teacher checkpoint path before, but there is no need for an extra env var; the path can simply be provided from the distillation.yml file or as a command-line argument, e.g. teacher_overrides.load_parameters_path="gs://agagik-us/...
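
For illustration, the strict check that replaces the fallback might look like the following sketch. The function name and error message are hypothetical; the actual code in train_distill.py may differ.

```python
# Hypothetical sketch of the strict check; not the exact MaxText code.
def validate_teacher_overrides(teacher_overrides: dict) -> None:
  """Fail fast when the mandatory teacher checkpoint path is missing."""
  if not teacher_overrides.get("load_parameters_path"):
    raise ValueError(
        "teacher_overrides.load_parameters_path is required; set it in "
        "distillation.yml or on the command line. The TEACHER_CHECKPOINT_PATH "
        "environment variable fallback has been removed."
    )
```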

@copybara-service bot merged commit 05bbde3 into main on Jan 14, 2026
29 of 33 checks passed
@copybara-service bot deleted the agagik-distill-2 branch on January 14, 2026 at 22:34