
Conversation

@NicoGrande NicoGrande commented Jan 12, 2026

Description

This PR fixes support for running vLLM decode with dummy weights. Decode previously failed when load_parameters_config=None, because Pydantic expects a string for this field, with "" as the default.
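
Below is a minimal, hedged sketch of the failure mode. The DecodeConfig class is a hypothetical stand-in for the real config schema; only the load_parameters_config field name comes from the description above. A bare str field in Pydantic rejects None, so an empty string has to serve as the "don't load a checkpoint" default.

```python
# Minimal sketch of the described failure mode (hypothetical config class).
from pydantic import BaseModel, ValidationError


class DecodeConfig(BaseModel):
  # A bare `str` annotation is non-optional in Pydantic, so None is rejected;
  # "" is the expected "no checkpoint to load" default.
  load_parameters_config: str = ""


# Passing "" (or omitting the field) validates fine and signals dummy weights.
ok = DecodeConfig(load_parameters_config="")

# Passing None raises a ValidationError, which is why dummy-weight decode
# previously failed when None was forwarded instead of "".
try:
  DecodeConfig(load_parameters_config=None)
except ValidationError as err:
  print(err)
```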

Additionally, this PR adds the logical axis rules context manager needed to shard the model at initialization.
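
As a hedged illustration of this second change, the sketch below initializes a toy Flax model inside flax.linen.partitioning.axis_rules. TinyModel, the rule tuples, and the mesh axis names "data"/"model" are assumptions for the example, not MaxText's actual code.

```python
# Sketch: run model initialization inside a logical-axis-rules context so the
# logical axis names on the parameters can be resolved to mesh axes.
import jax
import jax.numpy as jnp
import flax.linen as nn
from flax.linen import partitioning as nn_partitioning

# Hypothetical mapping from logical axis names to mesh axis names.
LOGICAL_AXIS_RULES = (("batch", "data"), ("embed", "data"), ("mlp", "model"))


class TinyModel(nn.Module):
  """A stand-in model whose parameters carry logical axis annotations."""

  features: int = 128

  @nn.compact
  def __call__(self, x):
    # Tag the Dense kernel's dimensions with logical axis names.
    kernel_init = nn.with_logical_partitioning(
        nn.initializers.lecun_normal(), ("embed", "mlp"))
    return nn.Dense(self.features, kernel_init=kernel_init)(x)


model = TinyModel()
x = jnp.ones((8, 64))

# Inside this context the logical names above can be translated to mesh axes;
# in a real setup the init is typically jitted with shardings derived from
# these annotations, so parameters come up sharded rather than replicated.
with nn_partitioning.axis_rules(LOGICAL_AXIS_RULES):
  variables = model.init(jax.random.PRNGKey(0), x)
```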

Tests

Running vLLM decode with dummy weights.

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

codecov bot commented Jan 12, 2026

Codecov Report

❌ Patch coverage is 0% with 10 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/MaxText/vllm_decode.py | 0.00% | 10 Missing ⚠️ |

@richjames0 richjames0 left a comment

lgtm

@NicoGrande NicoGrande force-pushed the nicogrande/enable-dummy-weights-decode branch from c57d0ef to 7649164 on January 13, 2026 at 00:05
@copybara-service copybara-service bot merged commit 9d52020 into main Jan 13, 2026
26 checks passed
@copybara-service copybara-service bot deleted the nicogrande/enable-dummy-weights-decode branch January 13, 2026 00:54