Conversation

@ConvolutedDog (Contributor) commented Nov 20, 2025

This commit fixes the squeeze operator to behave consistently with PyTorch
by implementing no-op behavior when squeezing dimensions that are not of size 1.

Previously:
squeeze(x, [1]) on tensor with shape [32, 10, 5] would fail

Now:
squeeze(x, [1]) on tensor with shape [32, 10, 5] returns the original tensor
without modification, matching PyTorch's behavior

This fixes compatibility issues when converting PyTorch models that use
squeeze with dimensions that may not always be 1 during inference.
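The PyTorch rule being matched can be sketched in plain Python. The `squeeze` helper below is hypothetical (it operates on shape tuples, not on TVM's actual implementation) and only illustrates the dimension check described above:

```python
def squeeze(shape, dims=None):
    """PyTorch-style squeeze on a shape tuple: drop only size-1 dims.

    When dims is given, any requested dim whose extent is not 1 is
    silently skipped (the no-op behavior this PR implements) instead
    of raising an error.
    """
    if dims is None:
        return tuple(d for d in shape if d != 1)
    axes = {d % len(shape) for d in dims}  # normalize negative axes
    return tuple(
        d for i, d in enumerate(shape)
        if not (i in axes and d == 1)
    )

# Previously squeeze(x, [1]) on shape [32, 10, 5] failed; now it is a no-op:
print(squeeze((32, 10, 5), [1]))  # (32, 10, 5)
print(squeeze((32, 1, 5), [1]))   # (32, 5)
```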

This work was done in collaboration with guan404ming's commit d87841d.

  • Modify the dimension check to skip non-1 dimensions.
  • Add a new Squeeze3 module and its expected output for testing.
  • Update the squeeze function to skip non-unit dimensions.
@gemini-code-assist (Contributor)

Summary of Changes

Hello @ConvolutedDog, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request updates the squeeze operator to align its behavior with PyTorch, specifically by making it a no-operation when an attempt is made to remove a dimension that does not have a size of 1. This adjustment eliminates previous errors and enhances the seamless conversion of models from PyTorch, ensuring robust handling of tensor manipulations.

Highlights

  • PyTorch Consistency: The squeeze operator now behaves consistently with PyTorch, performing a no-op when attempting to squeeze dimensions not of size 1.
  • Error Prevention: Previously, squeezing a non-unit dimension would cause a failure; this is now prevented.
  • Model Compatibility: This change improves compatibility for converting PyTorch models that use squeeze with potentially non-unit dimensions.

@gemini-code-assist bot left a comment

Code Review

This pull request correctly updates the behavior of the squeeze operator in both TOPI and Relax to align with PyTorch, where attempting to squeeze a dimension not of size 1 is a no-op. The changes in include/tvm/topi/transform.h and src/relax/op/tensor/manipulate.cc correctly implement this logic, and the new test case in tests/python/relax/test_frontend_from_exported_program.py validates it. However, the docstrings for the squeeze function in python/tvm/relax/op/manipulate.py, python/tvm/topi/transform.py, and for SqueezeAttrs in include/tvm/relax/attrs/manipulate.h have not been updated to reflect this new behavior, which could cause confusion. I've added comments to address these documentation inconsistencies.

@ConvolutedDog (Contributor, Author)

@mshr-h @tlopex Could you please re-run the tests?

@tlopex (Member) commented Nov 20, 2025

@tvm-bot rerun

@mshr-h (Contributor) left a comment

Thanks. Please include a testcase for that in the relax op test:
https://github.com/apache/tvm/blob/main/tests/python/relax/test_op_manipulate.py#L840

@ConvolutedDog ConvolutedDog marked this pull request as draft November 24, 2025 03:13
@ConvolutedDog ConvolutedDog marked this pull request as ready for review November 24, 2025 06:14
@ConvolutedDog ConvolutedDog requested a review from mshr-h November 24, 2025 06:15
@tlopex (Member) left a comment

LGTM! Thank you

@mshr-h (Contributor) left a comment

LGTM. Thanks!

@mshr-h mshr-h merged commit faab2e7 into apache:main Nov 24, 2025
11 checks passed