
feat: Add einsum ops #2

Merged
vgene merged 18 commits into aws-neuron:feat/einsum from jlonge4:feature/einsum
Feb 2, 2026

Conversation


@jlonge4 jlonge4 commented Jan 21, 2026

Issue #, if available:
N/A
Description of changes:

Adds support for core Einstein summation patterns and verification utilities.

Added

  • Core Patterns: Matrix multiplication, dot products, transposes, and outer products.

Verification:

  • nkipy_einsum.py
  • pytest kernels/einsum.py
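The core patterns listed above can be sanity-checked against NumPy's reference `np.einsum`, which is what the verification scripts compare against. A minimal, self-contained sketch (not one of the PR's own scripts):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((3, 4))
b = rng.standard_normal((4, 5))
m = rng.standard_normal((4, 4))
v = rng.standard_normal(4)
w = rng.standard_normal(5)

# Matrix multiplication: ij,jk->ik
assert np.allclose(np.einsum("ij,jk->ik", a, b), a @ b)
# Dot product: i,i->
assert np.isclose(np.einsum("i,i->", v, v), v @ v)
# Transpose: ij->ji
assert np.array_equal(np.einsum("ij->ji", a), a.T)
# Outer product: i,j->ij
assert np.allclose(np.einsum("i,j->ij", v, w), np.outer(v, w))
# Trace: ii->  (the pattern the PR notes is blocked on the HLO backend)
assert np.isclose(np.einsum("ii->", m), np.trace(m))
```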

Limitations

  • Trace: Logic added but blocked by missing iota_dimension support in the HLO backend.
  • Bilinear: Currently disabled and untested.

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

jlonge4 and others added 18 commits January 17, 2026 22:30
Implements Einstein summation notation (einsum) for NKIPy, supporting:
- Matrix multiplication and batch operations (ij,jk->ik, bij,bjk->bik)
- Transpose and dimension permutation (ij->ji, ijk->kji)
- Reductions and trace operations (ij->, ii->)
- Outer products (i,j->ij)
- Broadcasting patterns (ij,j->ij)
- Complex tensor contractions (ijk,jkl->il)
- N-ary operations (i,ij,j->)

Implementation decomposes einsum patterns into HLO primitives:
- dot_general for contractions
- transpose for dimension reordering
- reduce for summations
- broadcast/multiply for outer products

Includes comprehensive tests covering all major einsum patterns
and examples demonstrating real-world usage including simplified
attention mechanisms.
This script tests various einsum operations using NumPy and NKIPy. It includes matrix multiplication, batch matrix multiplication, dot product, outer product, and more, verifying results against NumPy outputs.
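The dot_general-based lowering the commit message describes (flattening the contracted dimensions so a contraction becomes a single matmul) can be illustrated in plain NumPy; this is a sketch of the idea, not the PR's actual lowering code:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((2, 3, 4))  # indices i, j, k
b = rng.standard_normal((3, 4, 5))  # indices j, k, l

# "ijk,jkl->il" contracts over j and k. A dot_general-style lowering
# merges the contracted axes into one and issues a single matmul:
lhs = a.reshape(2, 12)   # (i, jk)
rhs = b.reshape(12, 5)   # (jk, l)
out = lhs @ rhs          # (i, l)

assert np.allclose(out, np.einsum("ijk,jkl->il", a, b))
```

This works because both operands flatten j and k in the same (row-major) order, so the merged axis sums over identical index pairs; real lowerings must first transpose operands so the contracted dims are adjacent.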
@jlonge4 jlonge4 changed the title Feature/einsum feat: Add einsum ops Jan 21, 2026
@jlonge4 jlonge4 marked this pull request as ready for review January 21, 2026 22:28
@jlonge4 jlonge4 requested a review from a team January 21, 2026 22:28
@vgene vgene self-assigned this Jan 26, 2026

vgene commented Jan 29, 2026

Thanks for submitting this PR! Einsum support is a long-wanted feature!

We are reviewing this now and will have feedback by EOW.

@vgene vgene changed the base branch from main to feat/einsum February 2, 2026 11:57

@vgene vgene left a comment


This overall looks very promising. I'm merging it to a separate branch in the main repo first. It needs a few rounds of refactoring + adding tests.

@vgene vgene merged commit ed64c29 into aws-neuron:feat/einsum Feb 2, 2026

vgene commented Feb 2, 2026

Hi @jlonge4,
The changes are merged into the feat/einsum branch.

We will need to

  • add unit tests in tensor_apis to cover different scenarios and shapes
  • document supported einsum categories and discuss the reasons for unsupported ones
  • refactor the lowering to HLO path

Absolutely feel free to submit more PRs to the feat/einsum branch to address any of the above. It will eventually be merged into main once all the issues are addressed. We will work on this as part of the NumPy API coverage improvement, with the milestone set to the end of February.
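The first item above (unit tests covering different scenarios and shapes) could take a shape-parameterized form. A hedged pytest sketch: `einsum_under_test` is a hypothetical stand-in (here just `np.einsum` so the sketch runs), which the real suite would replace with the NKIPy op:

```python
import numpy as np
import pytest

def einsum_under_test(subscripts, *operands):
    # Hypothetical placeholder for the NKIPy einsum kernel; np.einsum
    # stands in so this sketch is runnable on its own.
    return np.einsum(subscripts, *operands)

@pytest.mark.parametrize("subscripts,shapes", [
    ("ij,jk->ik", [(3, 4), (4, 5)]),        # matmul
    ("bij,bjk->bik", [(2, 3, 4), (2, 4, 5)]),  # batch matmul
    ("ij->ji", [(3, 4)]),                    # transpose
    ("i,j->ij", [(3,), (4,)]),               # outer product
])
def test_einsum_matches_numpy(subscripts, shapes):
    rng = np.random.default_rng(0)
    operands = [rng.standard_normal(s) for s in shapes]
    expected = np.einsum(subscripts, *operands)
    np.testing.assert_allclose(
        einsum_under_test(subscripts, *operands), expected)
```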


jlonge4 commented Feb 3, 2026

@vgene Amazing, thanks so much! I look forward to contributing further.
