InfiniteMPOMatrix -> InfiniteMPO conversion #76
Conversation
Set up unit tests to confirm InfiniteSum{MPO} == l*InfiniteMPO*r (l* and *r mean terminated on the left and right; see the sketch after this list)
matrixITensorToITensor --> cat_to_itensor
inds(H; tags="Link") --> inds(H, tags="Link")
Use network connectivity and set operations.
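As a rough illustration of the first item above, here is a hedged sketch of what "l* ... *r terminated" means for one unit-cell tensor. The tensors, index names, and the choice of terminating row/column are stand-ins for illustration, not the PR's actual test code:

```julia
using ITensors

# Stand-in link (virtual) indices for one unit cell of the InfiniteMPO,
# plus a physical site index.
llink = Index(3, "Link,c=0")
rlink = Index(3, "Link,c=1")
s = siteind("S=1/2")

# A single MPO tensor for the cell (a random stand-in here).
W = randomITensor(llink, s', dag(s), rlink)

# Termination vectors that close off the left and right link indices.
# For a regular (lower-triangular) MPO the left vector typically picks the
# last row and the right vector the first column (an assumption here).
l = onehot(llink => dim(llink))
r = onehot(rlink => 1)

# l * W * r is a plain on-site operator that a unit test can compare
# against the corresponding term of InfiniteSum{MPO}.
H_term = l * W * r
```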
Looks great, thanks @JanReimers! Glad to see it ended up so simple. I recall that you said something about having to make use of some slicing functionality from your package https://github.com/JanReimers/ITensorMPOCompression.jl to implement this function. I don't see that here, is that no longer necessary?
Yes, directsum made slicing unnecessary.
Great, thanks for clarifying. I imagine that slicing could be faster, at least the way
Yes, my anecdotal example for that is that early versions of the compression module did all the slicing at the ITensors level. When I moved it down to the NDTensors level (with subtensors.jl), the performance improvement was stunning.
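For context on the directsum point above, a minimal sketch of the call pattern (illustrative indices and tensors only, not the code in this PR): directsum stacks two blocks along a combined link index, which is what removes the need for explicit slicing and assignment.

```julia
using ITensors

# Two operator-valued blocks that share the physical site index s but carry
# different link indices.
s  = siteind("S=1/2")
l1 = Index(2, "Link,a")
l2 = Index(3, "Link,b")
A  = randomITensor(s', dag(s), l1)
B  = randomITensor(s', dag(s), l2)

# C has indices (s', dag(s), l12) with dim(l12) == dim(l1) + dim(l2):
# A occupies the first block of the new link index and B the second.
C, l12 = directsum(A => l1, B => l2; tags="Link,ab")
```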
Started with Loïc Herviou's code. Uses directsum() to build the InfiniteMPO tensors. Supports dense and block-sparse storage under the hood (no if hasqns() code blocks). Tested with Heisenberg/Hubbard models with 1-5 nearest-neighbor (NN) interactions and Loïc's FQHE model with Ncell=6 and Ly=3.
This is not the most efficient way to build an InfiniteMPO: the output has extra rows and columns, containing only identities and zeros, that should subsequently be removed with block-respecting orthogonalization and/or compression. We should be able to build a more efficient InfiniteMPO by using a block-respecting directsum, as described in Eq. E3 of the Parker paper. But if users are going to compress anyway, maybe this is not useful.
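For reference, a hedged sketch of what I understand by a block-respecting sum (written here in a lower-triangular regular-form convention; the paper's Eq. E3 may use the transposed layout): only the interior A, B, C blocks are direct-summed while the identity corners are shared, so the duplicate identity rows and columns produced by a naive directsum never appear.

$$
W_a=\begin{pmatrix}1&0&0\\ C_a&A_a&0\\ D_a&B_a&1\end{pmatrix},
\qquad
W_1 \oplus_{\text{block}} W_2=
\begin{pmatrix}
1&0&0&0\\
C_1&A_1&0&0\\
C_2&0&A_2&0\\
D_1+D_2&B_1&B_2&1
\end{pmatrix}.
$$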