
Conversation

Copilot
Contributor

@Copilot Copilot AI commented Sep 24, 2025

This PR implements the aten_bilinear function, which previously raised NotImplementedError. The bilinear transformation computes y = x1^T A x2 + b, where:

  • input1 has shape (..., in1_features)
  • input2 has shape (..., in2_features)
  • weight has shape (out_features, in1_features, in2_features)
  • bias has shape (out_features) (optional)
  • Output has shape (..., out_features)

Implementation Details

The implementation uses a single Einsum operation to contract both feature dimensions at once.
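The shapes listed above pin down the contraction. As a sketch (not the actual torchlib code, which emits an ONNX Einsum node), the equivalent NumPy einsum is:

```python
import numpy as np

def bilinear(x1, x2, weight, bias=None):
    """Reference sketch of a bilinear transformation via einsum.

    x1:     (..., in1_features)
    x2:     (..., in2_features)
    weight: (out_features, in1_features, in2_features)
    bias:   (out_features,) or None
    returns (..., out_features)
    """
    # y[..., o] = sum over i, j of x1[..., i] * weight[o, i, j] * x2[..., j]
    out = np.einsum("...i,oij,...j->...o", x1, weight, x2)
    if bias is not None:
        out = out + bias
    return out
```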

Co-authored-by: justinchuby <11205048+justinchuby@users.noreply.github.com>
@Copilot Copilot AI changed the title [WIP] Implement aten_binear Implement aten_bilinear function using Einsum operation Sep 24, 2025
@Copilot Copilot AI requested a review from justinchuby September 24, 2025 03:54
Copilot finished work on behalf of justinchuby September 24, 2025 03:54

codecov bot commented Sep 24, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 70.10%. Comparing base (168fd8a) to head (462d155).
✅ All tests successful. No failed tests found.

Additional details and impacted files
@@           Coverage Diff           @@
##             main    #2574   +/-   ##
=======================================
  Coverage   70.10%   70.10%           
=======================================
  Files         222      222           
  Lines       26184    26188    +4     
  Branches     2581     2582    +1     
=======================================
+ Hits        18355    18360    +5     
+ Misses       6929     6928    -1     
  Partials      900      900           


@justinchuby
Collaborator

Can you re-express it using Transpose and Matmul? That may produce a graph that is easier to optimize. @copilot

@justinchuby justinchuby changed the title Implement aten_bilinear function using Einsum operation Implement aten_bilinear function Sep 24, 2025
@justinchuby
Collaborator

You may not need to flatten for MatMul. It supports N-dim inputs @copilot

Contributor Author

Copilot AI commented Sep 24, 2025

Can you re-express it using Transpose and Matmul? That may produce a graph that is easier to optimize. @copilot

Done! I've re-implemented the function using MatMul and Transpose operations instead of Einsum. The new implementation decomposes the bilinear transformation into: outer product creation → reshaping → matrix multiplication with transposed weights. This should be more optimization-friendly. See commit 22c4c0f.
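That decomposition (outer product → reshape → MatMul with a transposed, flattened weight) can be sketched as follows. This is an illustrative NumPy reconstruction of the approach described in the comment, not the committed torchlib code, which would use ONNX Mul, Reshape, Transpose, and MatMul nodes:

```python
import numpy as np

def bilinear_outer_matmul(x1, x2, weight, bias=None):
    """Sketch: bilinear via outer product + reshape + MatMul."""
    out_features, in1, in2 = weight.shape
    # Outer product: outer[..., i, j] = x1[..., i] * x2[..., j]
    outer = x1[..., :, None] * x2[..., None, :]
    # Flatten the two feature axes into one of size in1 * in2.
    flat = outer.reshape(*outer.shape[:-2], in1 * in2)
    # Flatten and transpose the weight to (in1 * in2, out_features).
    w2d = weight.reshape(out_features, in1 * in2).T
    out = flat @ w2d  # (..., out_features)
    if bias is not None:
        out = out + bias
    return out
```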

@Copilot Copilot AI changed the title Implement aten_bilinear function Implement aten_bilinear function using MatMul and Transpose operations Sep 24, 2025
@Copilot Copilot AI requested a review from justinchuby September 24, 2025 18:36
Copilot finished work on behalf of justinchuby September 24, 2025 18:36
Contributor Author

Copilot AI commented Sep 24, 2025

You may not need to flatten for MatMul. It supports N-dim inputs @copilot

You're absolutely right! I've simplified the implementation by removing the unnecessary flattening and leveraging N-dimensional MatMul support. The new approach uses two clean MatMul operations without creating outer products or flattening tensors, making it much more efficient and optimization-friendly. See commit 095967e.
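The flattening-free variant leans on the fact that MatMul (in ONNX, as in NumPy) broadcasts over leading batch dimensions. A hedged NumPy sketch of the two-MatMul approach, with unsqueeze axes chosen so the batch dims of the inputs broadcast against the out_features dim of the weight:

```python
import numpy as np

def bilinear_two_matmuls(x1, x2, weight, bias=None):
    """Sketch: bilinear via two broadcasting N-dim MatMuls, no flattening."""
    # First MatMul: (..., 1, 1, in1) @ (out, in1, in2) -> (..., out, 1, in2).
    # The batch dims (..., 1) broadcast against (out,).
    tmp = x1[..., None, None, :] @ weight
    # Second MatMul: (..., out, 1, in2) @ (..., 1, in2, 1) -> (..., out, 1, 1),
    # then drop the two trailing singleton axes.
    out = (tmp @ x2[..., None, :, None])[..., 0, 0]
    if bias is not None:
        out = out + bias
    return out
```

Because no Reshape of the batch dims is needed, the same graph handles any number of leading dimensions.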

@Copilot Copilot AI changed the title Implement aten_bilinear function using MatMul and Transpose operations Implement aten_bilinear function using efficient N-dimensional MatMul operations Sep 24, 2025
Copilot finished work on behalf of justinchuby September 24, 2025 18:45
@justinchuby
Collaborator

Yeah, the implementation is too complex. I will re-do this.

Signed-off-by: Justin Chu <justinchuby@users.noreply.github.com>
@justinchuby justinchuby marked this pull request as ready for review September 25, 2025 16:03
@justinchuby
Collaborator

I reverted to the original Einsum implementation. It seems clean.

@justinchuby justinchuby changed the title Implement aten_bilinear function using efficient N-dimensional MatMul operations [torchlib] Implement aten_bilinear function using Einsum Sep 25, 2025
@justinchuby justinchuby requested a review from xadupre September 25, 2025 16:09
Labels: module: torchlib (Related to the torch/aten function lib in development)