
fix wrong match of ignore_layers and use warning instead of error for mismatch #1553

Merged
xin3he merged 5 commits into main from xinhe/3-16 on Mar 17, 2026

Conversation

@xin3he
Contributor

@xin3he xin3he commented Mar 16, 2026

Description

layers.1 now matches layers.1x, which is not expected.
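A minimal sketch of the unwanted match and the digit-suffix fix, using illustrative layer names (not taken from a real model):

```python
# Illustrative layer names (hypothetical, not from a real model).
layer_names = ["model.layers.1.self_attn", "model.layers.1x.self_attn"]

# A plain substring test matches both names: this is the bug.
naive = [n for n in layer_names if "layers.1" in n]
assert naive == layer_names  # "layers.1x" is wrongly matched

# Appending "." to a digit-suffixed pattern restricts the match.
fixed = [n for n in layer_names if "layers.1." in n]
assert fixed == ["model.layers.1.self_attn"]
```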

Type of Change

  • Bug fix
  • New feature
  • Documentation update
  • Performance improvement
  • Code refactoring
  • Other (please specify):

Related Issues

Fixes or relates to #

Checklist Before Submitting

  • My code has been tested locally.
  • Documentation has been updated as needed.
  • New or updated tests are included where applicable.

Signed-off-by: Xin He <xin3.he@intel.com>
Copilot AI review requested due to automatic review settings March 16, 2026 11:50
Contributor

Copilot AI left a comment


Pull request overview

This PR fixes overly broad matching for ignore_layers (e.g., layers.1 unintentionally matching layers.1x), so that layer exclusion behaves as users expect during quantization config expansion.

Changes:

  • Normalizes ignore_layers earlier in set_layer_config (splitting by comma and appending . for digit-suffixed names).
  • Changes unmatched layer_config entries from raising an error to logging a warning.
  • Removes string normalization inside get_fp_layer_names.
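The earlier normalization described in the first bullet might look roughly like the sketch below. This is a hypothetical standalone helper, not the actual set_layer_config code; the function name is invented for illustration.

```python
def normalize_ignore_layers(ignore_layers):
    """Hypothetical sketch of the normalization this PR moves earlier:
    split a comma-separated string into names and append "." to
    digit-suffixed names so "layers.1" can only match "layers.1." prefixes.
    """
    names = ignore_layers.split(",") if isinstance(ignore_layers, str) else ignore_layers
    normalized = []
    for name in names:
        name = name.strip()
        if not name:
            continue
        if name[-1].isdigit():
            name += "."  # "layers.1" -> "layers.1."
        normalized.append(name)
    return normalized

assert normalize_ignore_layers("lm_head, layers.1") == ["lm_head", "layers.1."]
```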
Comments suppressed due to low confidence (1)

auto_round/compressors/utils.py:966

  • get_fp_layer_names is now iterating over ignore_layers without normalizing when a string is passed. Since strings are iterable, callers passing the documented comma-separated string (e.g. existing tests) will iterate character-by-character and return incorrect results. Either restore string parsing inside this function (accept str | Sequence[str]) or update the signature/docstring and all call sites accordingly.
def get_fp_layer_names(model: torch.nn.Module, ignore_layers: str):
    """Identifies and returns layers in the model to exclude from quantization.

    This function processes a comma-separated list of full-precision (FP) layers,
    matches them to the names of layers in the model, and returns a list of such
    layers to exclude from quantization.

    Args:
        model (torch.nn.Module): The model whose layers will be inspected.
        ignore_layers (str): A comma-separated string of layer names to be excluded
            from quantization. Whitespace is ignored in this string.

    Returns:
        list: A list of layer names that match the specified FP layers or are
        subcomponents of those layers.
    """
    from auto_round.utils import SUPPORTED_LAYER_TYPES

    if not ignore_layers:
        return []

    all_layer_names = []
    for n, m in model.named_modules():
        if type(m) in SUPPORTED_LAYER_TYPES:
            all_layer_names.append(n)
    not_to_quantized_layers = []

    for fp_layer in ignore_layers:


@xin3he xin3he requested a review from wenhuach21 March 16, 2026 12:09
xin3he added 2 commits March 16, 2026 20:10
Signed-off-by: Xin He <xin3.he@intel.com>
Signed-off-by: Xin He <xin3.he@intel.com>
@xin3he xin3he changed the title from "fix wrong match of ignore_layers" to "fix wrong match of ignore_layers and use warning instead of raise Error for mismatch" Mar 16, 2026
@xin3he xin3he changed the title from "fix wrong match of ignore_layers and use warning instead of raise Error for mismatch" to "fix wrong match of ignore_layers and use warning instead of error for mismatch" Mar 16, 2026
@wenhuach21 wenhuach21 self-requested a review March 16, 2026 12:16
xin3he added 2 commits March 16, 2026 20:21
Signed-off-by: Xin He <xin3.he@intel.com>
Signed-off-by: Xin He <xin3.he@intel.com>
@xin3he xin3he merged commit d8ee85c into main Mar 17, 2026
29 checks passed
@xin3he xin3he deleted the xinhe/3-16 branch March 17, 2026 05:25

3 participants