[NDTensors] NamedDimsArrays module #1267

Merged: 39 commits merged into main from NDTensors_NamedDimsArrays on Nov 19, 2023

Conversation

@mtfishman (Member) commented Nov 18, 2023

This adds a new NamedDimsArrays module, which defines a standalone Julia array wrapper type that attaches names to dimensions.

The design is a hybrid of the current ITensor Index system, Julia packages like NamedDims.jl/NamedPlus.jl, DimensionalData.jl, and NamedArrays.jl, and PyTorch's Named Tensors.

The idea would be to replace the current Tensor type with a NamedDimsArray (or some subtype of AbstractNamedDimsArray), which would have name-aware indexing and tensor operation syntax (i.e., smart addition, contraction, and factorization like ITensors have right now). ITensors could then be simple opaque mutable wrappers around NamedDimsArrays.
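
As a very rough sketch of that layering (illustrative only; the type names, fields, and methods below are simplified stand-ins, not the actual definitions in the NamedDimsArrays module), the wrapper could look something like:

# Illustrative sketch only, not the NDTensors.NamedDimsArrays implementation:
# pair a parent array with one name per dimension.
struct NamedDimsArray{T,N,Parent<:AbstractArray{T,N},Names} <: AbstractArray{T,N}
  parent::Parent   # the unnamed data
  dimnames::Names  # one name per dimension, e.g. ("i", "j")
end

Base.size(na::NamedDimsArray) = size(na.parent)
Base.getindex(na::NamedDimsArray, I::Int...) = na.parent[I...]
dimnames(na::NamedDimsArray) = na.dimnames
unname(na::NamedDimsArray) = na.parent

# An ITensor could then be a simple opaque mutable wrapper around the named array.
mutable struct ITensor
  nameddimsarray::NamedDimsArray
end

na = NamedDimsArray(randn(2, 3), ("i", "j"))
@show dimnames(na)  # ("i", "j")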

mtfishman and others added 30 commits November 10, 2023 12:21

@codecov-commenter commented Nov 19, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison: base (c47eb7c) is at 85.44% coverage, head (489b534) is at 54.80%.

❗ The current head 489b534 differs from the pull request's most recent head 28775ef. Consider uploading reports for commit 28775ef to get more accurate results.

Additional details and impacted files
@@             Coverage Diff             @@
##             main    #1267       +/-   ##
===========================================
- Coverage   85.44%   54.80%   -30.64%     
===========================================
  Files          89       88        -1     
  Lines        8402     8349       -53     
===========================================
- Hits         7179     4576     -2603     
- Misses       1223     3773     +2550     

@mtfishman (Member, Author) commented Nov 19, 2023

In the latest commits I started adding tensor algebra definitions like contract, which are simple wrappers around TensorAlgebra.contract introduced in #1265 (a rough sketch of such a wrapper follows the demo below).

Here is a demonstration of some functionality:

using NDTensors.NamedDimsArrays: align, dimnames, named, unname
using NDTensors.TensorAlgebra: TensorAlgebra

# Named dimensions
i = named(2, "i")
j = named(2, "j")
k = named(2, "k")

# Arrays with named dimensions
na1 = randn(i, j)
na2 = randn(j, k)

@show dimnames(na1) == ("i", "j")

# Indexing
@show na1[j => 2, i => 1] == na1[1, 2]

# Tensor contraction
na_dest = TensorAlgebra.contract(na1, na2)

@show issetequal(dimnames(na_dest), ("i", "k"))
# `unname` removes the names and returns an `Array`
@show unname(na_dest, (i, k)) ≈ unname(na1) * unname(na2)

# Permute dimensions (like `ITensors.permute`)
na1 = align(na1, (j, i))
@show na1[i => 1, j => 2] == na1[2, 1]

So it is very similar to the interface of ITensors.ITensor.
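
For context on the contract wrapper mentioned above, a name-aware contraction could be a thin layer over TensorAlgebra.contract, roughly as in the sketch below. The function name contract_named is hypothetical, and the assumed TensorAlgebra.contract signature (unnamed arrays plus per-dimension labels, returning the destination array together with its output labels) and the named(array, names) call are assumptions, not verified against this PR:

using NDTensors.NamedDimsArrays: AbstractNamedDimsArray, dimnames, named, unname
using NDTensors.TensorAlgebra: TensorAlgebra

function contract_named(na1::AbstractNamedDimsArray, na2::AbstractNamedDimsArray)
  # Assumed signature: contract(a1, labels1, a2, labels2) -> (a_dest, labels_dest).
  a_dest, labels_dest = TensorAlgebra.contract(
    unname(na1), dimnames(na1), unname(na2), dimnames(na2)
  )
  # Assumes `named(array, names)` reattaches dimension names to a plain array.
  return named(a_dest, labels_dest)
end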

mtfishman merged commit 4c5c991 into main on Nov 19, 2023 (9 checks passed).
mtfishman deleted the NDTensors_NamedDimsArrays branch on November 19, 2023 at 23:25.
@emstoudenmire (Collaborator) commented:

Interesting to see how short the code for implementing this ended up being. And I can see how it would simplify the ITensor layer on top.
