
[torchlib] Fix aten.embedding_bag.padding_idx to support None values #2384


Open
wants to merge 2 commits into main
Conversation

@Copilot Copilot AI commented Jun 14, 2025

The aten.embedding_bag.padding_idx function was failing when padding_idx is None, a valid value in PyTorch indicating that no padding index should be ignored. This caused DLRM model exports to fail with the error:

padding_idx must not be None. This is likely a dispatcher error
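
For context, a hypothetical minimal repro (illustrative only, not taken from this PR): a module that uses nn.EmbeddingBag without a padding index reaches aten.embedding_bag.padding_idx with padding_idx=None during export. The model and export call below are assumptions, and the sketch assumes a recent PyTorch with the dynamo-based ONNX exporter.

# Hypothetical repro sketch (not from this PR): exporting a model that uses
# nn.EmbeddingBag with no padding index hits aten.embedding_bag.padding_idx
# with padding_idx=None, which previously raised the assertion above.
import torch

class TinyBagModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.bag = torch.nn.EmbeddingBag(10, 4, mode="sum")  # no padding_idx set

    def forward(self, indices, offsets):
        return self.bag(indices, offsets)

indices = torch.tensor([1, 2, 4, 5, 4, 3, 2, 9])
offsets = torch.tensor([0, 4])
# The dynamo-based exporter lowers aten ops through torchlib.
onnx_program = torch.onnx.export(TinyBagModel(), (indices, offsets), dynamo=True)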

Changes Made

  1. Updated function signature: Changed padding_idx: int = -1 to padding_idx: Optional[int] = None
  2. Removed problematic assertion: Eliminated the assertion that was throwing the error when padding_idx is None
  3. Added conditional logic (see the sketch after this list):
    • When padding_idx is None: delegate to _aten_embedding_bag_onnx() (regular embedding_bag without padding filtering)
    • When padding_idx is not None: use existing _aten_embedding_bag_1d_padding_idx_onnx() implementation
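
A minimal sketch of the dispatch described above. The parameter list and the arguments passed to the private helpers are illustrative; the two helpers are the existing torchlib implementations named in the list, not new functions.

from typing import Optional

# Sketch of the conditional dispatch; not the exact torchlib definition.
def aten_embedding_bag_padding_idx(
    weight,
    indices,
    offsets,
    scale_grad_by_freq: bool = False,
    mode: int = 0,
    sparse: bool = False,
    per_sample_weights=None,
    include_last_offset: bool = False,
    padding_idx: Optional[int] = None,
):
    if padding_idx is None:
        # No padding index to ignore: use the regular embedding_bag lowering.
        return _aten_embedding_bag_onnx(
            weight, indices, offsets, mode, per_sample_weights, include_last_offset
        )
    # Otherwise keep filtering out contributions from padding_idx, as before.
    return _aten_embedding_bag_1d_padding_idx_onnx(
        weight, indices, offsets, mode, per_sample_weights, include_last_offset, padding_idx
    )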

Behavior

# This now works correctly (previously threw assertion error)
result = aten_embedding_bag_padding_idx(
    weight, indices, offsets, 
    padding_idx=None  # No padding index to ignore
)

# This continues to work as before
result = aten_embedding_bag_padding_idx(
    weight, indices, offsets, 
    padding_idx=2  # Filter out index 2
)

The fix maintains full backward compatibility while correctly handling the None case, matching PyTorch's behavior.

Fixes #2219.



Co-authored-by: justinchuby <11205048+justinchuby@users.noreply.github.com>
@Copilot Copilot AI changed the title from "[WIP] [torchlib] Implement aten.embedding_bag.padding_idx" to "[torchlib] Fix aten.embedding_bag.padding_idx to support None values" on Jun 14, 2025
@Copilot Copilot AI requested a review from justinchuby June 14, 2025 15:43
Copilot finished work on behalf of justinchuby June 14, 2025 15:44
@justinchuby justinchuby marked this pull request as ready for review June 14, 2025 16:05

codecov bot commented Jun 14, 2025

❌ 4 Tests Failed:

Tests completed: 15543 · Failed: 4 · Passed: 15539 · Skipped: 1879
View the top 3 failed test(s) by shortest run time
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0637_test_max_one_input
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_max_one_input'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_max_one_input' (e=No module named 'tests.onnx_backend_test_code.test_max_one_input') (file: 'C:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_max_one_input.py', absolute path: 'C:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_max_one_input.py', current folder: C:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset13
E   
E   @script()
E   def bck_test_max_one_input(data_0: FLOAT[3]) -> (FLOAT[3]):
E       result = opset13.Max(data_0)
E       return result
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0914_test_reduce_sum_square_default_axes_keepdims_example
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_reduce_sum_square_default_axes_keepdims_example'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_reduce_sum_square_default_axes_keepdims_example' (e=No module named 'tests.onnx_backend_test_code.test_reduce_sum_square_default_axes_keepdims_example') (file: 'C:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_sum_square_default_axes_keepdims_example.py', absolute path: 'C:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_reduce_sum_square_default_axes_keepdims_example.py', current folder: C:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT, INT64
E   from onnxscript.onnx_opset import opset18
E   
E   @script()
E   def bck_test_reduce_sum_square_default_axes_keepdims_example(data: FLOAT[3,2,2], axes: INT64[0]) -> (FLOAT[1,1,1]):
E       reduced = opset18.ReduceSumSquare(data, axes, keepdims=1)
E       return reduced
onnxscript.backend.onnx_export_test.TestOnnxBackEnd::test_export2python_produces_correct_onnx_script_model_0955_test_resize_downsample_scales_linear_half_pixel_symmetric
Stack Traces | 0.004s run time
onnxscript\backend\onnx_export_test.py:137: in extract_functions
    mod = importlib.import_module(import_name)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
C:\hostedtoolcache\windows\Python\3.11.9\x64\Lib\importlib\__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
E   ModuleNotFoundError: No module named 'tests.onnx_backend_test_code.test_resize_downsample_scales_linear_half_pixel_symmetric'

The above exception was the direct cause of the following exception:
.nox\test\Lib\site-packages\parameterized\parameterized.py:620: in standalone_func
    return func(*(a + p.args), **p.kwargs, **kw)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:271: in test_export2python_produces_correct_onnx_script_model
    functions = extract_functions(backend_test.name, code, self.test_folder)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
onnxscript\backend\onnx_export_test.py:139: in extract_functions
    raise AssertionError(
E   AssertionError: Unable to import 'tests.onnx_backend_test_code.test_resize_downsample_scales_linear_half_pixel_symmetric' (e=No module named 'tests.onnx_backend_test_code.test_resize_downsample_scales_linear_half_pixel_symmetric') (file: 'C:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_downsample_scales_linear_half_pixel_symmetric.py', absolute path: 'C:\\a\\onnxscript\\onnxscript\\tests\\onnx_backend_test_code\\test_resize_downsample_scales_linear_half_pixel_symmetric.py', current folder: C:\a\onnxscript\onnxscript
E   ---- CONTENT --
E   import numpy
E   from onnx import TensorProto
E   from onnx.helper import make_tensor
E   from onnxscript import script, external_tensor
E   from onnxscript.values import Opset
E   from onnxscript.onnx_types import FLOAT
E   from onnxscript.onnx_opset import opset19
E   
E   @script()
E   def bck_test_resize_downsample_scales_linear_half_pixel_symmetric(X: FLOAT[1,1,1,4], scales: FLOAT[4]) -> (FLOAT[1,1,1,2]):
E       Y = opset19.Resize(X, None, scales, coordinate_transformation_mode='half_pixel_symmetric', mode='linear')
E       return Y

To view more test analytics, go to the Test Analytics Dashboard
