
Fix #8462: embed patch sizes in einops pattern for einops >= 0.8 compatibility #8834

Open
williams145 wants to merge 3 commits into Project-MONAI:dev from williams145:fix/issue-8462-patchembedding-einops-compat

Conversation

@williams145
Contributor

Fixes #8462

Description

PatchEmbeddingBlock with proj_type="perceptron" raises TypeError on einops >= 0.8.x because Rearrange.__init__() no longer accepts arbitrary keyword arguments.

The current code builds an axes_len dict and passes it as **axes_len to Rearrange(pattern, **axes_len). einops 0.8.x removed support for this.

Fix: embed the patch size values directly as integer literals in the einops pattern string — e.g. "b c (h 16) (w 16) (d 16)" instead of using kwargs. einops supports integer literals in patterns across all versions including 0.8.x. The change is semantically identical.
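The kwargs-based construction being replaced can be sketched in isolation (stdlib-only; the variable names mirror the MONAI code quoted later in this review, but the snippet is illustrative, not the actual module):

```python
# Rebuild the einops pattern the way PatchEmbeddingBlock does for
# proj_type="perceptron" (3D case shown; names follow the PR's code).
spatial_dims = 3
patch_size = (16, 16, 16)

chars = (("h", "p1"), ("w", "p2"), ("d", "p3"))[:spatial_dims]
from_chars = "b c " + " ".join(f"({k} {v})" for k, v in chars)
to_chars = f"b ({' '.join(c[0] for c in chars)}) ({' '.join(c[1] for c in chars)} c)"
axes_len = {f"p{i + 1}": p for i, p in enumerate(patch_size)}

pattern = f"{from_chars} -> {to_chars}"
print(pattern)   # b c (h p1) (w p2) (d p3) -> b (h w d) (p1 p2 p3 c)
print(axes_len)  # {'p1': 16, 'p2': 16, 'p3': 16}
```

On older einops the dict is unpacked as `Rearrange(pattern, **axes_len)`; the reported TypeError occurs at exactly that call.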

Types of changes

  • Non-breaking change (fix or new feature that would not break existing functionality).
  • Breaking change
  • New tests added
  • Quick tests passed locally by running ./runtests.sh --quick --unittests --disttests.
  • In-line docstrings updated.

…s >= 0.8 compatibility

einops 0.8.x removed support for arbitrary kwargs in Rearrange.__init__().
Replace axes_len dict and **axes_len kwarg with integer literals embedded
directly in the pattern string. Semantically identical, compatible with all
einops versions.

Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
@coderabbitai
Copy link
Copy Markdown
Contributor

coderabbitai bot commented Apr 18, 2026

📝 Walkthrough

Walkthrough

Introduces a new PyTorch module _PatchRearrange that converts inputs shaped (B, C, ...spatial...) into a sequence (B, n_patches, patch_dim) using explicit view/permute/reshape. In PatchEmbeddingBlock.__init__ for proj_type == "perceptron", construction of the Rearrange layer is wrapped in a try block; if einops.Rearrange raises a TypeError, code falls back to _PatchRearrange(spatial_dims, patch_size) before applying nn.Linear(self.patch_dim, hidden_size). The original Rearrange -> Linear path is preserved when no error occurs.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

  • Docstring Coverage (⚠️ Warning): Docstring coverage is 33.33%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.

✅ Passed checks (4 passed)

  • Title check (✅ Passed): The title clearly and specifically describes the main change: fixing einops compatibility by embedding patch sizes in the einops pattern.
  • Description check (✅ Passed): The description covers the problem, the solution approach, and the required checklist items. All key details are present.
  • Linked Issues check (✅ Passed): The PR fully addresses issue #8462 by replacing einops kwargs with embedded integer literals in the pattern, fixing the TypeError with einops >= 0.8.
  • Out of Scope Changes check (✅ Passed): The changes are narrowly scoped to fixing PatchEmbeddingBlock's einops compatibility. No out-of-scope modifications detected.


Contributor

@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
monai/networks/blocks/patchembedding.py (1)

100-106: Fix looks correct and semantically equivalent.

Embedding integer literals in the einops pattern cleanly sidesteps the removed kwargs path in einops ≥ 0.8. Existing shape tests parameterized over spatial_dims ∈ {2, 3} and patch_size ∈ {8, 16} cover this path.

Minor nit (ruff B905): the zip on Line 102 could use strict=True since dim_names and patch_size are both length spatial_dims by construction — would also surface any future spatial_dims > 3 misuse (currently silently truncates via ("h","w","d")[:spatial_dims]).

♻️ Optional diff
-            from_chars = "b c " + " ".join(f"({name} {psize})" for name, psize in zip(dim_names, patch_size))
+            from_chars = "b c " + " ".join(
+                f"({name} {psize})" for name, psize in zip(dim_names, patch_size, strict=True)
+            )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/networks/blocks/patchembedding.py` around lines 100 - 106, Change the
non-strict zip used when building from_chars to a strict zip so mismatched
lengths between dim_names and patch_size are caught early; specifically update
the expression that constructs from_chars (where dim_names, patch_size and
spatial_dims are used) to use zip(..., strict=True) so any future misuse (e.g.,
spatial_dims > 3) surfaces immediately.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Nitpick comments:
In `@monai/networks/blocks/patchembedding.py`:
- Around line 100-106: Change the non-strict zip used when building from_chars
to a strict zip so mismatched lengths between dim_names and patch_size are
caught early; specifically update the expression that constructs from_chars
(where dim_names, patch_size and spatial_dims are used) to use zip(...,
strict=True) so any future misuse (e.g., spatial_dims > 3) surfaces immediately.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: b56cc03c-93bf-453e-a791-3e7817fd215c

📥 Commits

Reviewing files that changed from the base of the PR and between 851054c and 78756f8.

📒 Files selected for processing (1)
  • monai/networks/blocks/patchembedding.py

In some einops 0.8.x builds, Rearrange.__init__() does not accept
**kwargs for axis sizes, raising TypeError. Add _PatchRearrange as a
pure-PyTorch fallback that produces identical output. The primary path
still uses Rearrange with named axes (**axes_len); the fallback only
activates if that call raises TypeError.

Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@monai/networks/blocks/patchembedding.py`:
- Around line 131-135: Tests don't deterministically exercise the fallback
_PatchRearrange because current tests run whichever einops version is installed;
add a unit test that forces the fallback by monkey-patching the Rearrange symbol
to raise TypeError (or by directly instantiating _PatchRearrange) and assert
that outputs of the fallback match the normal Rearrange path for a
representative input shape; target names to modify/assert are _PatchRearrange,
Rearrange and the patch embedding behavior (e.g., the patch_embeddings sequence)
in the existing tests/networks/blocks/test_patchembedding.py so both branches
are covered and compared.
- Around line 32-55: Update the _PatchRearrange class: narrow the type of
patch_size in __init__ to tuple[int, ...], replace the view call in forward
(x.view(*v)) with reshape to handle non-contiguous tensors safely, and add
Google-style docstrings for the class __init__ and forward methods describing
Args and Returns (and any raised exceptions) per project guidelines; reference
the _PatchRearrange class, its __init__ signature, and the forward method when
making these changes.
- Around line 126-135: The _PatchRearrange fallback needs fixes: add
Google-style docstrings to the class methods __init__ and forward describing
arguments, return values, and behavior; change the type hint patch_size: tuple
to patch_size: tuple[int, ...]; replace any use of x.view(*v) with x.reshape(*v)
to avoid errors on non-contiguous tensors; and add a deterministic unit test
that directly instantiates and exercises _PatchRearrange (independent of
einops/Rearrange) to validate its behavior for representative
spatial_dims/patch_size combinations.
🪄 Autofix (Beta)

Fix all unresolved CodeRabbit comments on this PR:

  • Push a commit to this branch (recommended)
  • Create a new PR with the fixes

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 8833ef1f-b19d-4c70-a2ad-3f4bd3d6a4a8

📥 Commits

Reviewing files that changed from the base of the PR and between 78756f8 and 093f0b0.

📒 Files selected for processing (1)
  • monai/networks/blocks/patchembedding.py

Comment thread: monai/networks/blocks/patchembedding.py (Outdated)
Comment on lines +32 to +55
class _PatchRearrange(nn.Module):
    """Fallback patch rearrangement using pure PyTorch, for einops compatibility."""

    def __init__(self, spatial_dims: int, patch_size: tuple) -> None:
        super().__init__()
        self.spatial_dims = spatial_dims
        self.patch_size = patch_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, C = x.shape[0], x.shape[1]
        sp = x.shape[2:]
        g = tuple(s // p for s, p in zip(sp, self.patch_size))
        v: list[int] = [B, C]
        for gi, pi in zip(g, self.patch_size):
            v += [gi, pi]
        x = x.view(*v)
        n = self.spatial_dims
        gdims = list(range(2, 2 + 2 * n, 2))
        pdims = list(range(3, 3 + 2 * n, 2))
        x = x.permute(0, *gdims, *pdims, 1).contiguous()
        n_patches = 1
        for gi in g:
            n_patches *= gi
        return x.reshape(B, n_patches, -1)
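To sanity-check the flow above without torch or einops installed, the same view/permute/reshape steps can be transcribed to NumPy (a sketch; `patch_rearrange` is a hypothetical helper, not MONAI API, with `np.transpose`/`reshape` standing in for `permute`/`view`):

```python
import numpy as np

def patch_rearrange(x: np.ndarray, patch_size: tuple) -> np.ndarray:
    # NumPy transcription of the _PatchRearrange.forward logic above.
    B, C = x.shape[0], x.shape[1]
    sp = x.shape[2:]
    n = len(sp)
    g = tuple(s // p for s, p in zip(sp, patch_size))
    v = [B, C]
    for gi, pi in zip(g, patch_size):
        v += [gi, pi]
    x = x.reshape(v)                      # (B, C, g1, p1, g2, p2, ...)
    gdims = list(range(2, 2 + 2 * n, 2))  # patch-grid axes
    pdims = list(range(3, 3 + 2 * n, 2))  # within-patch axes
    x = np.transpose(x, (0, *gdims, *pdims, 1))
    n_patches = 1
    for gi in g:
        n_patches *= gi
    return x.reshape(B, n_patches, -1)

x = np.arange(1 * 3 * 32 * 32, dtype=np.float32).reshape(1, 3, 32, 32)
out = patch_rearrange(x, (16, 16))
print(out.shape)  # (1, 4, 768): 4 patches of 16*16*3 values each
```

The first channel values of the top-left patch land at `out[0, 0, :3]` and equal `x[0, :, 0, 0]`, matching the einops output order `(p1 p2 c)`.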
Contributor


🛠️ Refactor suggestion | 🟠 Major

If the fallback is kept: tighten type hint, prefer reshape, add Google-style docstrings.

Three points on _PatchRearrange:

  1. x.view(*v) at line 47 will raise on non-contiguous inputs. reshape is safer and no slower here.
  2. patch_size: tuple is too loose — tuple[int, ...].
  3. Per coding guidelines, Google-style docstrings with Args:/Returns: are expected on __init__ and forward; the current one-line class docstring doesn't cover them.

As per coding guidelines: "Docstrings should be present for all definition which describe each variable, return value, and raised exception in the appropriate section of the Google-style of docstrings."

♻️ Proposed patch
 class _PatchRearrange(nn.Module):
-    """Fallback patch rearrangement using pure PyTorch, for einops compatibility."""
-
-    def __init__(self, spatial_dims: int, patch_size: tuple) -> None:
+    """Fallback patch rearrangement in pure PyTorch, equivalent to the einops pattern
+    ``"b c (h p1) (w p2) (d p3) -> b (h w d) (p1 p2 p3 c)"`` (trimmed to ``spatial_dims``).
+    """
+
+    def __init__(self, spatial_dims: int, patch_size: tuple[int, ...]) -> None:
+        """
+        Args:
+            spatial_dims: number of spatial dimensions (2 or 3).
+            patch_size: per-axis patch size; length must equal ``spatial_dims``.
+        """
         super().__init__()
         self.spatial_dims = spatial_dims
         self.patch_size = patch_size
 
     def forward(self, x: torch.Tensor) -> torch.Tensor:
+        """
+        Args:
+            x: input tensor of shape ``(B, C, *spatial)`` where each spatial dim is
+                divisible by the corresponding ``patch_size`` entry.
+
+        Returns:
+            Tensor of shape ``(B, n_patches, prod(patch_size) * C)``.
+        """
         B, C = x.shape[0], x.shape[1]
         sp = x.shape[2:]
         g = tuple(s // p for s, p in zip(sp, self.patch_size))
         v: list[int] = [B, C]
         for gi, pi in zip(g, self.patch_size):
             v += [gi, pi]
-        x = x.view(*v)
+        x = x.reshape(*v)
         n = self.spatial_dims
         gdims = list(range(2, 2 + 2 * n, 2))
         pdims = list(range(3, 3 + 2 * n, 2))
         x = x.permute(0, *gdims, *pdims, 1).contiguous()
         n_patches = 1
         for gi in g:
             n_patches *= gi
         return x.reshape(B, n_patches, -1)
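The `view` vs `reshape` point generalizes beyond this module; a minimal demonstration, assuming PyTorch is available (the shapes are arbitrary):

```python
import torch

x = torch.arange(24).reshape(2, 3, 4)
y = x.permute(0, 2, 1)  # permute returns a non-contiguous view
try:
    y.view(2, 12)       # view requires stride-compatible (contiguous) memory
    view_ok = True
except RuntimeError:
    view_ok = False
print(view_ok)                 # False
print(y.reshape(2, 12).shape)  # reshape succeeds, copying if needed
```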
🧰 Tools
🪛 Ruff (0.15.10)

[warning] 43-43: zip() without an explicit strict= parameter

Add explicit value for parameter strict=

(B905)


[warning] 45-45: zip() without an explicit strict= parameter

Add explicit value for parameter strict=

(B905)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/networks/blocks/patchembedding.py` around lines 32 - 55, Update the
_PatchRearrange class: narrow the type of patch_size in __init__ to tuple[int,
...], replace the view call in forward (x.view(*v)) with reshape to handle
non-contiguous tensors safely, and add Google-style docstrings for the class
__init__ and forward methods describing Args and Returns (and any raised
exceptions) per project guidelines; reference the _PatchRearrange class, its
__init__ signature, and the forward method when making these changes.

Comment on lines +126 to +135
 # for 3d: "b c (h p1) (w p2) (d p3) -> b (h w d) (p1 p2 p3 c)"
 chars = (("h", "p1"), ("w", "p2"), ("d", "p3"))[:spatial_dims]
 from_chars = "b c " + " ".join(f"({k} {v})" for k, v in chars)
 to_chars = f"b ({' '.join([c[0] for c in chars])}) ({' '.join([c[1] for c in chars])} c)"
 axes_len = {f"p{i + 1}": p for i, p in enumerate(patch_size)}
-self.patch_embeddings = nn.Sequential(
-    Rearrange(f"{from_chars} -> {to_chars}", **axes_len), nn.Linear(self.patch_dim, hidden_size)
-)
+try:
+    rearrange_layer: nn.Module = Rearrange(f"{from_chars} -> {to_chars}", **axes_len)
+except TypeError:
+    rearrange_layer = _PatchRearrange(spatial_dims, tuple(int(p) for p in patch_size))
+self.patch_embeddings = nn.Sequential(rearrange_layer, nn.Linear(self.patch_dim, hidden_size))
Contributor


⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

einops Rearrange pattern integer literals support 0.8

💡 Result:

No, einops Rearrange patterns do not support standalone integer literals (e.g., 'b 2 h w') as axis specifications. Patterns use symbolic axis names (letters like 'b', 'h'), anonymous axes (numbers in parentheses like '(b1 b2)'), ellipsis (...), or underscores (_ for skipping). Integer values are provided via the axes_lengths keyword argument (e.g., rearrange(x, 'b (h1 h) w c -> ...', h1=2)) or parsed from tensor shapes via parse_shape. This syntax has been consistent across versions, including 0.8.0+ (released 2024-04-28), with no changes introducing integer literal support in patterns per release notes, changelogs, and docs. Examples in official documentation and tutorials confirm named axes only.



🏁 Script executed:

# Get the file and examine _PatchRearrange class and test coverage
head -150 monai/networks/blocks/patchembedding.py | tail -100

Repository: Project-MONAI/MONAI

Length of output: 4640


🏁 Script executed:

# Check for existing tests covering the fallback path
find . -name "*.py" -path "*/test*" -type f | xargs grep -l "patchembedding" | head -5

Repository: Project-MONAI/MONAI

Length of output: 130


🏁 Script executed:

# Get the _PatchRearrange class definition (around lines 32-55)
sed -n '32,56p' monai/networks/blocks/patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 958


🏁 Script executed:

# Check the test file for coverage of _PatchRearrange and the fallback path
cat monai/tests/networks/blocks/test_patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 145


🏁 Script executed:

# Find the test file with correct path
find . -name "test_patchembedding.py" -type f 2>/dev/null

Repository: Project-MONAI/MONAI

Length of output: 109


🏁 Script executed:

# Also check for any tests that import or test _PatchRearrange
rg "_PatchRearrange" --type py

Repository: Project-MONAI/MONAI

Length of output: 275


🏁 Script executed:

# Read the test file
cat ./tests/networks/blocks/test_patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 7510


🏁 Script executed:

# Check einops version constraints and any comments about the fallback logic
head -30 monai/networks/blocks/patchembedding.py

Repository: Project-MONAI/MONAI

Length of output: 1308


🏁 Script executed:

# Verify if x.view() in _PatchRearrange could fail on non-contiguous tensors
rg "x\.view\(" monai/networks/blocks/patchembedding.py -A 2 -B 2

Repository: Project-MONAI/MONAI

Length of output: 233


Retract the simplification suggestion; the current try/except approach is necessary.

einops Rearrange does not support integer literals in patterns—numeric axis values must be passed via axes_lengths (the current axes_len dict). The suggested approach of embedding integers as literals (e.g., "b c (h 16) (w 16) (d 16)") is not feasible. The fallback with _PatchRearrange is the correct strategy for version compatibility.

However, address these remaining issues in _PatchRearrange:

  1. Missing Google-style docstrings: Add docstrings to __init__ and forward methods describing arguments, return values, and behavior per coding guidelines.
  2. Type hint specificity: Change patch_size: tuple to patch_size: tuple[int, ...].
  3. Use reshape() instead of view(): Line 47 uses x.view(*v), which fails on non-contiguous tensors; reshape() is safer.
  4. Incomplete test coverage: The test suite only exercises the Rearrange path (since einops is installed). The fallback is never deterministically validated. Add a test that directly instantiates and tests _PatchRearrange independently.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/networks/blocks/patchembedding.py` around lines 126 - 135, The
_PatchRearrange fallback needs fixes: add Google-style docstrings to the class
methods __init__ and forward describing arguments, return values, and behavior;
change the type hint patch_size: tuple to patch_size: tuple[int, ...]; replace
any use of x.view(*v) with x.reshape(*v) to avoid errors on non-contiguous
tensors; and add a deterministic unit test that directly instantiates and
exercises _PatchRearrange (independent of einops/Rearrange) to validate its
behavior for representative spatial_dims/patch_size combinations.

Comment on lines +131 to +135
try:
    rearrange_layer: nn.Module = Rearrange(f"{from_chars} -> {to_chars}", **axes_len)
except TypeError:
    rearrange_layer = _PatchRearrange(spatial_dims, tuple(int(p) for p in patch_size))
self.patch_embeddings = nn.Sequential(rearrange_layer, nn.Linear(self.patch_dim, hidden_size))
Contributor


⚠️ Potential issue | 🟡 Minor

Fallback path isn't deterministically covered by tests.

_PatchRearrange only executes when Rearrange(..., **axes_len) raises TypeError, i.e. only on einops ≥ 0.8. Existing test_shape cases (tests/networks/blocks/test_patchembedding.py) exercise whichever branch the installed einops version selects, so CI never compares both paths in the same run. Suggest a targeted test that forces the fallback — e.g. monkey-patch Rearrange to raise TypeError, or instantiate _PatchRearrange directly and compare its output against the Rearrange path for a known input.

As per coding guidelines: "Ensure new or modified definitions will be covered by existing or new unit tests."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/networks/blocks/patchembedding.py` around lines 131 - 135, Tests don't
deterministically exercise the fallback _PatchRearrange because current tests
run whichever einops version is installed; add a unit test that forces the
fallback by monkey-patching the Rearrange symbol to raise TypeError (or by
directly instantiating _PatchRearrange) and assert that outputs of the fallback
match the normal Rearrange path for a representative input shape; target names
to modify/assert are _PatchRearrange, Rearrange and the patch embedding behavior
(e.g., the patch_embeddings sequence) in the existing
tests/networks/blocks/test_patchembedding.py so both branches are covered and
compared.
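The suggested test can be sketched without MONAI or einops at all; everything below (`PrimaryLayer`, `FallbackLayer`, `strict_primary`, `build_layer`) is a hypothetical stand-in for `Rearrange`, the einops ≥ 0.8 behavior, and the try/except selection in `PatchEmbeddingBlock.__init__`:

```python
class PrimaryLayer:
    """Stand-in for einops Rearrange on versions that accept **axes_len."""
    def __init__(self, pattern, **axes_len):
        self.kind = "primary"

class FallbackLayer:
    """Stand-in for _PatchRearrange."""
    def __init__(self, spatial_dims, patch_size):
        self.kind = "fallback"

def strict_primary(pattern):
    # No **axes_len in the signature, so passing axis kwargs raises
    # TypeError naturally -- mimicking the behavior reported in #8462.
    return PrimaryLayer(pattern)

def build_layer(pattern, axes_len, spatial_dims, patch_size, primary=PrimaryLayer):
    # Mirrors the PR's selection logic: fall back only on TypeError.
    try:
        return primary(pattern, **axes_len)
    except TypeError:
        return FallbackLayer(spatial_dims, patch_size)

print(build_layer("b c (h p1) -> b h (p1 c)", {"p1": 16}, 1, (16,)).kind)
# primary
print(build_layer("b c (h p1) -> b h (p1 c)", {"p1": 16}, 1, (16,), primary=strict_primary).kind)
# fallback
```

A real test would additionally assert that both branches produce identical tensors for the same input, as the review suggests.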

Signed-off-by: UGBOMEH OGOCHUKWU WILLIAMS <williamsugbomeh@gmail.com>
Contributor

@coderabbitai coderabbitai bot left a comment


♻️ Duplicate comments (2)
monai/networks/blocks/patchembedding.py (2)

131-135: ⚠️ Potential issue | 🟡 Minor

Add deterministic coverage for the fallback branch.

The fallback only runs when Rearrange(...) raises TypeError, so normal shape tests cover whichever branch the installed dependency happens to take. Add a direct _PatchRearrange test or monkey-patch Rearrange to force this path.

As per coding guidelines: "Ensure new or modified definitions will be covered by existing or new unit tests."

#!/bin/bash
# Description: Check whether tests directly exercise _PatchRearrange or force the Rearrange fallback.

fd -i '^test_patchembedding\.py$' -x sh -c '
  printf "\n== %s ==\n" "$1"
  rg -n -C3 "_PatchRearrange|monkeypatch|Rearrange|patch_embeddings" "$1" || true
' sh {}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/networks/blocks/patchembedding.py` around lines 131 - 135, Add a
deterministic unit test that exercises the fallback branch by forcing the
_PatchRearrange path: either write a test that instantiates and validates
_PatchRearrange directly (ensuring it produces the same output/shape as the
normal path) or monkey-patch the Rearrange symbol used in patchembedding.py to
raise TypeError before importing/constructing the module so the try/except in
patch_embeddings triggers; reference _PatchRearrange, Rearrange and
patch_embeddings in the test to make the coverage explicit and assert expected
tensor shapes/behaviour.

32-55: ⚠️ Potential issue | 🟡 Minor

Carry forward the fallback hardening fixes.

forward() still uses view() on user tensors, so non-contiguous inputs can fail. Also tighten patch_size, add Google-style method docstrings, and address Ruff B905 with explicit zip(strict=...) if the Python target supports it.

As per coding guidelines: "Docstrings should be present for all definition which describe each variable, return value, and raised exception in the appropriate section of the Google-style of docstrings."

Proposed patch
 class _PatchRearrange(nn.Module):
-    """Fallback patch rearrangement using pure PyTorch, for einops compatibility."""
+    """Fallback patch rearrangement using pure PyTorch, for einops compatibility."""
 
-    def __init__(self, spatial_dims: int, patch_size: tuple) -> None:
+    def __init__(self, spatial_dims: int, patch_size: tuple[int, ...]) -> None:
+        """Initialize the patch rearrangement layer.
+
+        Args:
+            spatial_dims: Number of spatial dimensions.
+            patch_size: Patch size for each spatial dimension.
+        """
         super().__init__()
         self.spatial_dims = spatial_dims
         self.patch_size = patch_size
 
     def forward(self, x: torch.Tensor) -> torch.Tensor:
+        """Rearrange image patches into a patch sequence.
+
+        Args:
+            x: Input tensor with shape ``(batch, channels, *spatial)``.
+
+        Returns:
+            Tensor with shape ``(batch, n_patches, patch_dim)``.
+
+        Raises:
+            RuntimeError: If ``x`` cannot be reshaped into the configured patch grid.
+        """
         batch, channels = x.shape[0], x.shape[1]
         sp = x.shape[2:]
-        g = tuple(s // p for s, p in zip(sp, self.patch_size))
+        g = tuple(s // p for s, p in zip(sp, self.patch_size, strict=True))
         v: list[int] = [batch, channels]
-        for gi, pi in zip(g, self.patch_size):
+        for gi, pi in zip(g, self.patch_size, strict=True):
             v += [gi, pi]
-        x = x.view(*v)
+        x = x.reshape(*v)

Verify the Python target before applying zip(strict=True):

#!/bin/bash
# Description: Check project Python target metadata for zip(strict=True) support.

fd '^(pyproject\.toml|setup\.cfg|setup\.py)$' --max-depth 2 -x sh -c '
  printf "\n== %s ==\n" "$1"
  grep -nEi "requires-python|python_requires|target-version" "$1" || true
' sh {}

python - <<'PY'
import sys
print("sandbox_python:", sys.version)
print("zip_strict_supported:", sys.version_info >= (3, 10))
PY
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/networks/blocks/patchembedding.py` around lines 32 - 55, The forward()
in class _PatchRearrange uses tensor.view on potentially non-contiguous inputs,
lacks a Google-style docstring, doesn't validate/tighten patch_size, and
triggers Ruff B905 for zip(strict=...). Fix by: 1) validate and canonicalize
self.patch_size in __init__ (ensure tuple of ints of length spatial_dims and
values > 0); 2) add a Google-style docstring to forward describing Args and
Returns and any raised exceptions; 3) avoid view on non-contiguous tensors by
using torch.reshape (or call .contiguous() before view) for the initial reshape
and the final reshape, and call .contiguous() after permute() before reshaping;
4) replace unsafe zip(...) usage with a conditional strict zip: if
sys.version_info >= (3,10) use zip(..., strict=True) else fallback to regular
zip (use unique symbols: class _PatchRearrange, method forward, attribute
self.patch_size, local vars g, v, gdims, pdims) so the change is robust across
Python targets.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Duplicate comments:
In `@monai/networks/blocks/patchembedding.py`:
- Around line 131-135: Add a deterministic unit test that exercises the fallback
branch by forcing the _PatchRearrange path: either write a test that
instantiates and validates _PatchRearrange directly (ensuring it produces the
same output/shape as the normal path) or monkey-patch the Rearrange symbol used
in patchembedding.py to raise TypeError before importing/constructing the module
so the try/except in patch_embeddings triggers; reference _PatchRearrange,
Rearrange and patch_embeddings in the test to make the coverage explicit and
assert expected tensor shapes/behaviour.
- Around line 32-55: The forward() in class _PatchRearrange uses tensor.view on
potentially non-contiguous inputs, lacks a Google-style docstring, doesn't
validate/tighten patch_size, and triggers Ruff B905 for zip(strict=...). Fix by:
1) validate and canonicalize self.patch_size in __init__ (ensure tuple of ints
of length spatial_dims and values > 0); 2) add a Google-style docstring to
forward describing Args and Returns and any raised exceptions; 3) avoid view on
non-contiguous tensors by using torch.reshape (or call .contiguous() before
view) for the initial reshape and the final reshape, and call .contiguous()
after permute() before reshaping; 4) replace unsafe zip(...) usage with a
conditional strict zip: if sys.version_info >= (3,10) use zip(..., strict=True)
else fallback to regular zip (use unique symbols: class _PatchRearrange, method
forward, attribute self.patch_size, local vars g, v, gdims, pdims) so the change
is robust across Python targets.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 22291e60-3ff7-46af-ac07-515fd865b4ec

📥 Commits

Reviewing files that changed from the base of the PR and between 093f0b0 and f1cffe8.

📒 Files selected for processing (1)
  • monai/networks/blocks/patchembedding.py

@williams145
Contributor Author

@ericspod: CI is green and the branch is up to date. Happy to answer any questions if helpful.


Development

Successfully merging this pull request may close these issues.

[BUG]einops.layer.torch.Rearrange got an unexpected keyword argument

1 participant