Ensure same dtype for subconfig when `_from_config` (#44629)
Merged: hmellor merged 4 commits into huggingface:main (Mar 13, 2026)
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
hmellor reviewed on Mar 12, 2026
```python
# Set the same `dtype` on all subconfigs to avoid dtype mismatch. When "auto" dtype
# with nested models, we can't dispatch different dtype per backbone module
for sub_config_key in config.sub_configs:
    if (sub_config := getattr(config, sub_config_key)) is not None:
```
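The propagation loop above can be sketched as a self-contained example. The `ToyConfig`/`ToySubConfig` classes below are hypothetical stand-ins for transformers' real `PreTrainedConfig`, kept only to show what the loop does:

```python
# Toy stand-ins for transformers' config objects (hypothetical, not the real API)
class ToySubConfig:
    def __init__(self, dtype=None):
        self.dtype = dtype

class ToyConfig:
    # like config.sub_configs: maps attribute names to sub-config classes
    sub_configs = {"text_config": ToySubConfig, "vision_config": ToySubConfig}

    def __init__(self, dtype, text_config=None, vision_config=None):
        self.dtype = dtype
        self.text_config = text_config
        self.vision_config = vision_config

config = ToyConfig(
    "float16",
    text_config=ToySubConfig("float32"),
    vision_config=ToySubConfig("bfloat16"),
)

# Set the same dtype on all subconfigs to avoid dtype mismatch
for sub_config_key in config.sub_configs:
    if (sub_config := getattr(config, sub_config_key)) is not None:
        sub_config.dtype = config.dtype

print(config.text_config.dtype, config.vision_config.dtype)  # float16 float16
```

The `is not None` guard matters because, as discussed below, some pure-vision models may leave a sub-config attribute unset.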
Member
Will this ever be None? The type hint suggests that it shouldn't be
Member (Author)
the subconfig? I remember it was possible in some backbone_configs in pure-vision models. In generative mllm it doesn't happen
Actually I started refactoring those backbones so maybe not needed anymore, then I can clean up everywhere
Member
Yeah, the values in `config.sub_configs`. I was just curious because it's hinted as `dict[str, type["PreTrainedConfig"]]`. Checking for None doesn't hurt though.
hmellor approved these changes on Mar 12, 2026
What does this PR do?
Fixes "auto" dtype when the model is initialized with `_from_config`. It was already fixed for `from_pretrained` in #42990, but vLLM creates models with `AutoModel._from_config`, which caused the same dtype mismatch error. Added a test as well. cc @hmellor
Note
Medium Risk
Touches core model initialization and could affect dtype override behavior for any model with `sub_configs`, though the change is narrow (dtype propagation) and covered by a targeted regression test.

Overview
Fixes dtype mismatches when instantiating nested/compound models via `PreTrainedModel._from_config` by forcing the chosen `dtype` onto all `config.sub_configs` (mirroring `from_pretrained` behavior), ensuring modules don't initialize with conflicting dtypes (notably for `dtype="auto"` paths used by vLLM/AutoModel).

Adds a regression test around `LlavaForConditionalGeneration._from_config` verifying that a parent `config.dtype` overrides differing sub-config dtypes and results in a consistent dtype across language/vision components and the top-level model.

Written by Cursor Bugbot for commit 5a895f4. This will update automatically on new commits.
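The shape of the described regression test can be sketched with toy classes. Everything here (`Config`, `SubConfig`, `Model`) is a hypothetical stand-in for `LlavaForConditionalGeneration` and the real transformers configs; only the dtype-propagation logic mirrors the PR:

```python
# Hypothetical sketch of the regression test described above, with toy classes
# in place of LlavaForConditionalGeneration and the real transformers configs.
class SubConfig:
    def __init__(self, dtype):
        self.dtype = dtype

class Config:
    sub_configs = {"text_config": SubConfig, "vision_config": SubConfig}

    def __init__(self, dtype, text_config, vision_config):
        self.dtype = dtype
        self.text_config = text_config
        self.vision_config = vision_config

class Model:
    @classmethod
    def _from_config(cls, config):
        # the fix: propagate the parent dtype onto every non-None sub-config
        for key in config.sub_configs:
            if (sub := getattr(config, key)) is not None:
                sub.dtype = config.dtype
        model = cls()
        model.config = config
        return model

# Parent dtype should override the differing sub-config dtypes,
# leaving a single consistent dtype across all components.
config = Config("float16", SubConfig("float32"), SubConfig("bfloat16"))
model = Model._from_config(config)
dtypes = {
    model.config.dtype,
    model.config.text_config.dtype,
    model.config.vision_config.dtype,
}
assert dtypes == {"float16"}
```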