
Fix Gemma4 MoE parameter mappings #3554

Merged
copybara-service[bot] merged 1 commit into main from agagik-gemma4-moe on Apr 2, 2026

Conversation

@gagika (Collaborator) commented Apr 2, 2026

Description

Update Gemma 4 MoE parameter mappings for refactored HF weights

Tests

End to end Gemma4-26b model with and without scan

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.


codecov Bot commented Apr 2, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.

📢 Thoughts on this report? Let us know!


github-actions Bot commented Apr 2, 2026

🤖 Hi @gagika, I've received your request, and I'm working on it now! You can track my progress in the logs for more details.

github-actions Bot left a comment


## 📋 Review Summary

The pull request successfully updates the Gemma 4 MoE parameter mappings to match the refactored Hugging Face weights. It also includes several improvements to the end-to-end test scripts, such as fixing path typos and correcting variable names in comments.

🔍 General Feedback

  • The parameter mapping changes in src/maxtext/checkpoint_conversion/utils/param_mapping.py are consistent across the model configuration.
  • The correction of //base.yml to /base.yml in the shell scripts is a small but important fix for path reliability.
  • I've noted a change in the USE_MULTIMODAL flag in the test scripts; please verify if this change is intended to be permanent for these specific tests.
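For context on what these mappings look like, here is a minimal hypothetical sketch of an HF-to-MaxText name mapping for an MoE block. The function name, weight names, and parameter paths below are invented for illustration only; the real mapping lives in `src/maxtext/checkpoint_conversion/utils/param_mapping.py` and differs in detail:

```python
# Hypothetical sketch of an HF -> MaxText parameter-name mapping for one
# MoE block per decoder layer. All names here are invented for illustration;
# they are not the actual Gemma 4 mapping.
def moe_param_mapping(num_layers: int) -> dict:
    """Map Hugging Face weight names to MaxText parameter paths."""
    mapping = {}
    for layer in range(num_layers):
        hf = f"model.layers.{layer}.mlp.experts"
        mt = f"params.decoder.layers_{layer}.moe_block"
        mapping[f"{hf}.gate_up_proj"] = f"{mt}.wi"   # expert input projections
        mapping[f"{hf}.down_proj"] = f"{mt}.wo"      # expert output projections
        mapping[f"model.layers.{layer}.mlp.router.weight"] = f"{mt}.gate.kernel"
    return mapping
```

When an upstream refactor renames HF weights (as in this PR), only the left-hand keys need to change; the MaxText side stays fixed.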

```diff
 TOKENIZER_PATH='google/gemma-4-26b-a4b-it'
 # To convert the multimodal model, make sure the use_multimodal is set to be true
-USE_MULTIMODAL=true
+USE_MULTIMODAL=false
```

🟡 The `USE_MULTIMODAL` flag has been changed from `true` to `false`. If these tests are intended to verify multimodal capabilities for Gemma 4, this change will disable that verification. Please confirm if this is intended.
Suggested change
```diff
-USE_MULTIMODAL=false
+USE_MULTIMODAL=false
```


```diff
 # To get converted ckpt:
-python3 -m maxtext.checkpoint_conversion.to_maxtext "${MAXTEXT_CONFIGS_DIR:-${MAXTEXT_REPO_ROOT:-$PWD}/src/maxtext/configs}"//base.yml \
+python3 -m maxtext.checkpoint_conversion.to_maxtext "${MAXTEXT_CONFIGS_DIR:-${MAXTEXT_REPO_ROOT:-$PWD}/src/maxtext/configs}"/base.yml \
```

🟢 Good fix on the configuration path. The double slash `//` was likely a typo.
Suggested change
```diff
-python3 -m maxtext.checkpoint_conversion.to_maxtext "${MAXTEXT_CONFIGS_DIR:-${MAXTEXT_REPO_ROOT:-$PWD}/src/maxtext/configs}"/base.yml \
+python3 -m maxtext.checkpoint_conversion.to_maxtext "${MAXTEXT_CONFIGS_DIR:-${MAXTEXT_REPO_ROOT:-$PWD}/src/maxtext/configs}"/base.yml \
```
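The corrected command resolves its config path with nested `${VAR:-default}` fallbacks. A standalone sketch of how that expansion behaves — the `/tmp/maxtext` root below is hypothetical, standing in for `$PWD` in the real script:

```shell
# Demonstrates the nested ${VAR:-default} fallback used in the command above,
# and why the stray trailing slash produced "//" in the resolved path.
unset MAXTEXT_CONFIGS_DIR MAXTEXT_REPO_ROOT
REPO_ROOT=/tmp/maxtext  # hypothetical stand-in for $PWD

# Neither override is set, so the innermost default wins:
CONFIG="${MAXTEXT_CONFIGS_DIR:-${MAXTEXT_REPO_ROOT:-$REPO_ROOT}/src/maxtext/configs}"/base.yml
echo "$CONFIG"  # /tmp/maxtext/src/maxtext/configs/base.yml

# With the old trailing "//", the same expansion yielded a double slash:
BAD="${MAXTEXT_CONFIGS_DIR:-${MAXTEXT_REPO_ROOT:-$REPO_ROOT}/src/maxtext/configs}"//base.yml
echo "$BAD"     # /tmp/maxtext/src/maxtext/configs//base.yml
```

POSIX path resolution treats `//` the same as `/` in most contexts, so the double slash was cosmetic rather than fatal, but normalizing it keeps logs and path comparisons reliable.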

```diff
 TOKENIZER_PATH='google/gemma-4-26b-a4b'
 # To convert the multimodal model, make sure the use_multimodal is set to be true
-USE_MULTIMODAL=true
+USE_MULTIMODAL=false
```

🟡 Similar to the other 26b script, `USE_MULTIMODAL` is set to `false`. Please ensure this is the desired state for these end-to-end tests.
Suggested change
```diff
-USE_MULTIMODAL=false
+USE_MULTIMODAL=false
```

@copybara-service copybara-service Bot merged commit 447a940 into main Apr 2, 2026
48 checks passed
@copybara-service copybara-service Bot deleted the agagik-gemma4-moe branch April 2, 2026 21:38