fix: fall back to HF for Mistral3 VLMs with non-Mistral4 text backbone#1557
Merged
Conversation
The custom Mistral3ForConditionalGeneration model (added for Mistral4 MoE+MLA text backbones) was intercepting all models with that HF architecture, including Devstral-Small which uses a dense Ministral3 text backbone. This caused an AttributeError on `moe_intermediate_size`. Add a `supports_config` classmethod that custom model classes can define to opt out for incompatible configs. The registry's new `resolve_custom_model_cls` method checks this before returning a custom class, falling back to HF's native implementation when unsupported. Signed-off-by: HuiyingLi <willwin.lee@gmail.com> Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com> Signed-off-by: HuiyingLi <willwin.lee@gmail.com>
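The opt-out mechanism described above can be sketched as a `supports_config` classmethod. This is an illustrative reconstruction based only on the PR description; the actual attribute and model-type names checked in the repository may differ.

```python
import types

class Mistral3ForConditionalGeneration:
    """Sketch of a custom model class that can opt out of configs it
    does not support (names are assumptions, not the real code)."""

    @classmethod
    def supports_config(cls, config) -> bool:
        # Only claim configs whose text backbone is a Mistral4-style
        # MoE model; a dense Ministral3 backbone lacks the
        # moe_intermediate_size attribute this class relies on.
        text_config = getattr(config, "text_config", None)
        if text_config is None:
            return False
        return getattr(text_config, "model_type", None) == "mistral4"

# Example configs (stand-ins for real HF config objects):
moe_cfg = types.SimpleNamespace(
    text_config=types.SimpleNamespace(model_type="mistral4"))
dense_cfg = types.SimpleNamespace(
    text_config=types.SimpleNamespace(model_type="ministral3"))
```

With this in place, a Devstral-Small-style config (dense Ministral3 backbone) is rejected and the loader can fall back to HF's native implementation instead of raising an `AttributeError`.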
HuiyingLi (Contributor, Author) commented:
/ok to test a678fa5
akoumpa approved these changes on Mar 17, 2026
linnanwang pushed a commit that referenced this pull request on Apr 24, 2026:
fix: fall back to HF for Mistral3 VLMs with non-Mistral4 text backbone (#1557) The custom Mistral3ForConditionalGeneration model (added for Mistral4 MoE+MLA text backbones) was intercepting all models with that HF architecture, including Devstral-Small which uses a dense Ministral3 text backbone. This caused an AttributeError on `moe_intermediate_size`. Add a `supports_config` classmethod that custom model classes can define to opt out for incompatible configs. The registry's new `resolve_custom_model_cls` method checks this before returning a custom class, falling back to HF's native implementation when unsupported. Signed-off-by: HuiyingLi <willwin.lee@gmail.com> Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Summary

- The custom `Mistral3ForConditionalGeneration` model (for Mistral4 MoE+MLA) was intercepting all models with that HF architecture, including Devstral-Small which uses a dense Ministral3 text backbone, causing `AttributeError: 'Ministral3Config' object has no attribute 'moe_intermediate_size'`
- Add a `supports_config` classmethod to `Mistral3ForConditionalGeneration` that returns `False` for non-Mistral4 text configs
- Add `resolve_custom_model_cls` to `_ModelRegistry` that checks `supports_config` before returning a custom model class, falling back to HF's native implementation when unsupported
- Tests for `resolve_custom_model_cls` (5) and `supports_config` (4)

Test plan

- Tests cover `resolve_custom_model_cls` with: found/not-found/supports-true/supports-false/config-passthrough
- Tests cover `supports_config` with: mistral4-text/non-mistral4-text/no-text-config/no-model-type
- `ruff check` and `ruff format` pass on all changed files

🤖 Generated with Claude Code
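The registry-side fallback described in the summary could look like the following. This is a minimal sketch assuming a dict-backed registry; the real `_ModelRegistry` almost certainly has more structure, and the method signature here is an assumption.

```python
class _ModelRegistry:
    """Sketch of a registry whose resolve step lets custom model
    classes opt out of incompatible configs (assumed implementation)."""

    def __init__(self):
        self._custom = {}  # HF architecture name -> custom model class

    def register(self, arch: str, model_cls) -> None:
        self._custom[arch] = model_cls

    def resolve_custom_model_cls(self, arch: str, config):
        """Return the registered custom class, or None so the caller
        falls back to HF's native implementation."""
        model_cls = self._custom.get(arch)
        if model_cls is None:
            return None  # no custom class registered for this architecture
        # supports_config is optional: a class that does not define it
        # accepts every config for its architecture (the old behavior).
        supports = getattr(model_cls, "supports_config", None)
        if supports is not None and not supports(config):
            return None  # custom class opted out; fall back to HF
        return model_cls

# Example: a custom class that rejects every config.
class _AlwaysOptOut:
    @classmethod
    def supports_config(cls, config) -> bool:
        return False

registry = _ModelRegistry()
registry.register("Mistral3ForConditionalGeneration", _AlwaysOptOut)
```

The key design point from the PR is that the opt-out check lives in the resolve step, so existing custom classes without `supports_config` keep their current behavior unchanged.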