
How can @tensor macro be used in variable length systems? #125

Open
billysrh opened this issue Nov 1, 2022 · 2 comments

Comments

billysrh commented Nov 1, 2022

I'm used to writing code in ncon fashion, and it's necessary because I have to deal with an arbitrary number of tensors. But I know @ncon is slower than @tensor. How can I keep the notation of ncon and get the speed of @tensor at the same time?

Jutho (Owner) commented Nov 1, 2022

There is some speed advantage to using @tensor, but it is not that substantial. It arises specifically from the fact that @tensor only works if the contraction is known exactly at compile time, so the individual steps can be hardcoded. That's what the macro does: it rewrites the code at compile time.

If the contraction requires dynamic information that is not known at compile time, you cannot use @tensor for it.
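To make the distinction concrete (a minimal sketch, not from the original comment): when the index pattern is a literal in the source, @tensor can hardcode the contraction at macro-expansion time; when the network is only known at runtime, the ncon function is the way to go:

```julia
using TensorOperations

A = rand(3, 4)
B = rand(4, 5)

# Static contraction: the index labels are literals in the source,
# so the macro can generate the contraction steps at compile time.
@tensor C[i, j] := A[i, k] * B[k, j]

# Dynamic contraction: the network is ordinary runtime data, so it
# can be built on the fly, at the cost of runtime dispatch.
network = [[-1, 1], [1, -2]]
D = ncon([A, B], network)

C ≈ D  # both compute the matrix product A * B
```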

sents commented Jan 19, 2023

If you really need the speed and you don't have a lot of different contractions, you can call the macro from a generated function. The generated function can dispatch dynamically by being handed a Val type.

Generated functions compile a new method for every distinct type they are called with. This means that each contraction with a different signature has to be compiled separately, so using this approach with rapidly changing contraction orders will introduce a lot of compilation overhead.

Here is an example, handing the function the indices (as Tuples of Ints):

using TensorOperations

# The index tuples are carried in the Val type parameter K, so they are
# available at compile time: the contraction expression is built there
# and handed to @tensor.
@generated function contract_operator!(S,
                                       A,
                                       B,
                                       op,
                                       order_A::Val{K}) where {K}
    leftside = Expr(:call, :*,
                    :(A[$(K.a...)]),
                    :(B[$(K.b...)]),
                    :(op[$(K.op...)]))
    return :(@tensor S[:] = $leftside)
end

S = rand(5, 5, 5, 2, 5, 5, 5, 2)
a = rand(5, 5, 5, 5, 2)
b = rand(5, 5, 5, 5, 2)
op = rand(2, 2, 2, 2)
# NCON convention: positive labels are contracted, negative labels are open.
contraction_order = Val((a = (1, -1, -2, -3, 2),
                         b = (-5, -6, 1, -7, 3),
                         op = (2, 3, -4, -8)))
contract_operator!(S, a, b, op, contraction_order);
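As a quick sanity check (my addition, assuming the tensors defined above): the generated function should agree with a plain ncon call for the same network, since both use the NCON labelling convention for ordering the open indices:

```julia
# Same network written as an ncon call; output indices are ordered by
# the open (negative) labels -1, -2, ..., matching the layout of S.
S2 = ncon([a, b, op],
          [[1, -1, -2, -3, 2],
           [-5, -6, 1, -7, 3],
           [2, 3, -4, -8]])
S ≈ S2  # should hold up to floating-point error
```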
