
[BugFix][Relax][Torch] Honor multi-axis dims in torch.flip converter#19511

Merged
tlopex merged 1 commit into apache:main from swjng:fix/torch-flip-multi-axis
May 6, 2026

Conversation

@swjng (Contributor) commented May 6, 2026

Motivation

PyTorch's torch.flip(x, dims=[...]) reverses every listed axis. The
Relax converter _flip (base_fx_graph_translator.py) instead coerces
the list to a single integer:

if isinstance(dims, list | tuple) and len(dims) > 0:
    dims = dims[0]

Only the first axis is forwarded to relax.op.flip, which is itself
single-axis. The remaining axes are silently dropped.

Minimal repro (vs PyTorch eager) on a (3, 4) input with
dims=[-1, -2]:

ref: [11, 10,  9,  8,  7,  6,  5,  4, ...]   # both axes flipped
tvm: [ 3,  2,  1,  0,  7,  6,  5,  4, ...]   # only last axis flipped

max_abs_diff = 8.0. Both the torch.export and legacy fx paths share
this converter, so both are affected.
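The discrepancy can be modeled without TVM or PyTorch at all. The sketch below (a pure-Python illustration, not the actual converter code) flips a flat row-major (3, 4) buffer along the requested axes and reproduces the ref/tvm rows shown above:

```python
# Pure-Python model of the bug on a (3, 4) row-major buffer with values 0..11.
x = list(range(12))
ROWS, COLS = 3, 4

def flip2d(flat, flip_rows, flip_cols):
    """Flip a flat row-major (ROWS x COLS) buffer along the requested axes."""
    out = []
    for r in range(ROWS):
        rr = ROWS - 1 - r if flip_rows else r
        for c in range(COLS):
            cc = COLS - 1 - c if flip_cols else c
            out.append(flat[rr * COLS + cc])
    return out

ref = flip2d(x, True, True)    # torch.flip semantics for dims=[-1, -2]: both axes
tvm = flip2d(x, False, True)   # buggy converter: only dims[0] == -1 is honored
print(ref[:8])  # [11, 10, 9, 8, 7, 6, 5, 4]
print(tvm[:8])  # [3, 2, 1, 0, 7, 6, 5, 4]
print(max(abs(a - b) for a, b in zip(ref, tvm)))  # 8
```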

Fix

Iterate over dims in the converter and emit one relax.op.flip per
axis (flips along distinct axes commute, so the order is irrelevant).
A scalar dims is wrapped to a single-element list; non-int /
non-sequence arguments still raise TypeError.

relax.op.flip itself is unchanged: it is used elsewhere as a
single-axis op, and widening its signature would expand the scope of
this fix beyond the PyTorch frontend.
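The shape of the fix can be sketched in plain Python. Here `flip_axis` is a hypothetical stand-in for single-axis `relax.op.flip` operating on nested lists; the loop mirrors the converter logic (scalar wrapping, per-axis emission, `TypeError` on bad input), not the actual Relax implementation:

```python
def flip_axis(data, axis):
    """Reverse a nested list along one axis (stand-in for single-axis relax.op.flip)."""
    if axis == 0:
        return data[::-1]
    return [flip_axis(row, axis - 1) for row in data]

def flip(data, dims, ndim):
    """Sketch of the fixed converter: wrap scalar dims, then emit one flip per axis."""
    if isinstance(dims, int):
        dims = [dims]
    elif not isinstance(dims, (list, tuple)):
        raise TypeError(f"Unsupported dims type: {type(dims)}")
    for d in dims:
        data = flip_axis(data, d % ndim)  # normalize negative axes
    return data

x = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]]  # shape (3, 4)
print(flip(x, [-1, -2], ndim=2))  # [[11, 10, 9, 8], [7, 6, 5, 4], [3, 2, 1, 0]]
```

Because flips along distinct axes commute, iterating `dims` in any order produces the same result as a single simultaneous multi-axis flip.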

Contributor

@gemini-code-assist gemini-code-assist Bot left a comment

Code Review

This pull request updates the _flip translator in the Relax Torch frontend to support multi-axis flipping, aligning with torch.flip semantics. The implementation now iterates through the provided dimensions and emits a relax.op.flip operation for each axis, whereas it previously only supported a single axis. Additionally, new test cases have been added to the FX and exported program frontend test suites to verify multi-axis and negative axis support. I have no feedback to provide as there are no review comments.

Member

@tlopex tlopex left a comment


LGTM

@tlopex tlopex merged commit 61b49bb into apache:main May 6, 2026
10 checks passed