
Fix ring of experts using TP #3351

Merged
copybara-service[bot] merged 1 commit into main from chengnuojin-fix-ring-tp
Mar 9, 2026

Conversation

Collaborator

@NuojCheng NuojCheng commented Mar 9, 2026

Description

When use_ring_of_experts=True and tensor parallelism (TP) is enabled, a sharding shape error occurs. This PR fixes it by adding a constraint on the unpermute shapes so they match the per-shard embedding dimension.

FIXES: b/490177598

Tests

CI tests

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

@codecov

codecov Bot commented Mar 9, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.


Comment thread src/maxtext/layers/moe.py

  # Sum up the partial outputs across the expert shards.
- output = jnp.reshape(output, (-1, sequence_length, self.config.emb_dim))
+ output = jnp.reshape(output, (-1, sequence_length, self.config.emb_dim // self.get_tensor_parallelism_size()))
Collaborator

Do we need to manually reduce-scatter (rs) the gradient output?

Collaborator Author
If we rely on JAX autodiff then no. If custom vjp then probably yes.
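Why the reshape in the thread above needs the per-shard width can be sketched with plain NumPy (`jnp.reshape` follows the same element-count rule). The sizes below are hypothetical, purely for illustration, and do not reflect MaxText defaults:

```python
import numpy as np

# Hypothetical sizes for illustration only.
sequence_length = 8
emb_dim = 16
tp_size = 4  # tensor-parallelism degree

# Under TP, each shard holds only emb_dim // tp_size of the embedding
# dimension, so the local buffer has that many columns.
local_output = np.zeros((2 * sequence_length, emb_dim // tp_size))

# Reshaping to the full emb_dim fails: the local element count is
# tp_size times too small for the requested shape.
try:
    np.reshape(local_output, (-1, sequence_length, emb_dim))
except ValueError as e:
    print("full emb_dim reshape fails:", e)

# Reshaping with the per-shard width succeeds, matching the PR's change.
fixed = np.reshape(local_output, (-1, sequence_length, emb_dim // tp_size))
print(fixed.shape)  # (2, 8, 4)
```

In the actual layer, `self.get_tensor_parallelism_size()` plays the role of `tp_size` here; the sketch only demonstrates the element-count mismatch that produced the sharding shape error.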

@NuojCheng NuojCheng force-pushed the chengnuojin-fix-ring-tp branch from efdffea to e3fa572 on March 9, 2026 17:10
@copybara-service copybara-service Bot merged commit e2f6b0e into main Mar 9, 2026
29 of 30 checks passed
@copybara-service copybara-service Bot deleted the chengnuojin-fix-ring-tp branch March 9, 2026 18:22

5 participants