Fix maybe_update_config signature to match updated vllm API #10

Draft

Copilot wants to merge 2 commits into main from copilot/fix-vllm-plugin-api-change

Conversation


Copilot AI commented Apr 23, 2026

vllm added a new hf_config parameter to QuantizationConfig.maybe_update_config, breaking the ParoQuantConfig override, which only accepted (model_name, revision).
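
For illustration, a sketch of why the old override broke (the call shape inside vllm is an assumption; it may pass hf_config positionally instead):

# Old override, before this PR:
def maybe_update_config(self, model_name: str, revision: str | None = None): ...

# If vllm now invokes the hook with the new argument, e.g.
#   quant_config.maybe_update_config(model_name, hf_config=hf_config)
# the old override raises:
#   TypeError: maybe_update_config() got an unexpected keyword argument 'hf_config'
# A positional call would instead silently bind the PretrainedConfig to revision.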

Changes

  • Added hf_config: PretrainedConfig | None = None between model_name and revision in ParoQuantConfig.maybe_update_config to match the new base-class signature
  • Added the from transformers import PretrainedConfig import

# Before
def maybe_update_config(self, model_name: str, revision: str | None = None):

# After
def maybe_update_config(
    self,
    model_name: str,
    hf_config: PretrainedConfig | None = None,
    revision: str | None = None,
):

The hf_config argument is accepted but unused — layer skipping continues to be auto-detected from safetensors metadata.
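
For context, a minimal sketch of the updated override (only the signature change is from this PR; the import path, the class scaffolding, and the _detect_skipped_layers helper name are assumptions standing in for the existing detection logic):

from transformers import PretrainedConfig
from vllm.model_executor.layers.quantization.base_config import QuantizationConfig


class ParoQuantConfig(QuantizationConfig):
    # Other abstract methods required by QuantizationConfig are omitted here.

    def maybe_update_config(
        self,
        model_name: str,
        hf_config: PretrainedConfig | None = None,  # new in the updated vllm API
        revision: str | None = None,
    ) -> None:
        # hf_config is accepted only to satisfy the base-class signature and is
        # intentionally unused; layers to skip are still auto-detected from
        # safetensors metadata (hypothetical helper name below).
        self.skip_layers = self._detect_skipped_layers(model_name, revision)

Placing hf_config between model_name and revision (rather than appending it) matters if vllm passes these arguments positionally, since appending would silently shift what lands on revision.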

Copilot AI linked an issue Apr 23, 2026 that may be closed by this pull request
Copilot AI changed the title from [WIP] Fix vllm plugin API change in maybe_update_config function to Fix maybe_update_config signature to match updated vllm API on Apr 23, 2026
Copilot AI requested a review from Readon on April 23, 2026 01:45

Development

Successfully merging this pull request may close these issues.

vllm plugin api changed.
