
Commit

Add message about switching to ITensorMPS.jl (#73)
mtfishman authored May 9, 2024
1 parent f3ab3e7 commit 21db181
Showing 12 changed files with 45 additions and 47 deletions.
38 changes: 19 additions & 19 deletions .github/workflows/CI.yml
@@ -39,22 +39,22 @@ jobs:
- uses: codecov/codecov-action@v2
with:
files: lcov.info
docs:
name: Documentation
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: julia-actions/setup-julia@v1
with:
version: '1'
- uses: julia-actions/julia-buildpkg@v1
- uses: julia-actions/julia-docdeploy@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }}
- run: |
julia --project=docs -e '
using Documenter: DocMeta, doctest
using ITensorTDVP
DocMeta.setdocmeta!(ITensorTDVP, :DocTestSetup, :(using ITensorTDVP); recursive=true)
doctest(ITensorTDVP)'
## docs:
## name: Documentation
## runs-on: ubuntu-latest
## steps:
## - uses: actions/checkout@v2
## - uses: julia-actions/setup-julia@v1
## with:
## version: '1'
## - uses: julia-actions/julia-buildpkg@v1
## - uses: julia-actions/julia-docdeploy@v1
## env:
## GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
## DOCUMENTER_KEY: ${{ secrets.DOCUMENTER_KEY }}
## - run: |
## julia --project=docs -e '
## using Documenter: DocMeta, doctest
## using ITensorTDVP
## DocMeta.setdocmeta!(ITensorTDVP, :DocTestSetup, :(using ITensorTDVP); recursive=true)
## doctest(ITensorTDVP)'
4 changes: 2 additions & 2 deletions Project.toml
@@ -1,7 +1,7 @@
name = "ITensorTDVP"
uuid = "25707e16-a4db-4a07-99d9-4d67b7af0342"
authors = ["Matthew Fishman <mfishman@flatironinstitute.org> and contributors"]
version = "0.3.0"
version = "0.3.1"

[deps]
ITensors = "9136182c-28ba-11e9-034c-db9fb085ebd5"
@@ -18,7 +18,7 @@ Observers = "338f10d5-c7f1-4033-a7d1-f9dec39bcaa0"
ITensorTDVPObserversExt = "Observers"

[compat]
ITensors = "0.3.58, 0.4, 0.5"
ITensors = "0.3.58, 0.4, 0.5, 0.6"
KrylovKit = "0.6, 0.7"
Observers = "0.2"
PackageExtensionCompat = "1"
13 changes: 5 additions & 8 deletions README.md
@@ -1,7 +1,9 @@
| :warning: WARNING |
|:---------------------------|
| The [ITensorTDVP.jl](https://github.com/ITensor/ITensorTDVP.jl) package will be deprecated in favor of the [ITensorMPS.jl](https://github.com/ITensor/ITensorMPS.jl) package. We plan to move all of the code from this package into ITensorMPS.jl. For now, to help with backwards compatibility, ITensorMPS.jl simply re-exports the functionality of ITensorTDVP.jl. To prepare for the change, please change `using ITensorTDVP` to `using ITensorMPS` in your code. |
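
For most users the migration described above is a one-line change. A minimal sketch, assuming ITensorMPS.jl is installed and relying on its re-export of the ITensorTDVP.jl functionality (names such as `tdvp` and `dmrg_x` are imported from `ITensorMPS` in the updated examples below):

```julia
# Before (soon to be deprecated):
# using ITensorTDVP

# After: the same functionality is available from ITensorMPS
using ITensorMPS
```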

# ITensorTDVP

[![Stable](https://img.shields.io/badge/docs-stable-blue.svg)](https://mtfishman.github.io/ITensorTDVP.jl/stable)
[![Dev](https://img.shields.io/badge/docs-dev-blue.svg)](https://mtfishman.github.io/ITensorTDVP.jl/dev)
[![Build Status](https://github.com/mtfishman/ITensorTDVP.jl/actions/workflows/CI.yml/badge.svg?branch=main)](https://github.com/mtfishman/ITensorTDVP.jl/actions/workflows/CI.yml?query=branch%3Amain)
[![Coverage](https://codecov.io/gh/mtfishman/ITensorTDVP.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/mtfishman/ITensorTDVP.jl)
[![Code Style: Blue](https://img.shields.io/badge/code%20style-blue-4495d1.svg)](https://github.com/invenia/BlueStyle)
@@ -16,6 +18,7 @@ julia> ]
pkg> add ITensorTDVP
```
However, as noted above, we now recommend installing and loading `ITensorMPS` instead of `ITensorTDVP`.
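
A sketch of that recommended install, mirroring the block above but pointing at ITensorMPS (assuming it is available from the General registry):

```julia
julia> ]

pkg> add ITensorMPS
```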

## News

@@ -44,9 +47,3 @@ energy, psi = ITensorTDVP.dmrg(H, psi0; nsweeps=10, maxdim=100, cutoff=1e-6)

- `svd_alg` no longer has a package-level default, so the default is now determined by the `svd` function in ITensors/NDTensors. This fixes an issue when using ITensorTDVP.jl with GPU backends, where the default set by ITensorTDVP.jl wasn't compatible with the options available in some GPU backends like CUDA.
- More generally, keyword arguments are handled more systematically throughout the package: default values are set in one place, and keyword arguments are listed or forwarded explicitly, so mistakes like passing an incorrect keyword argument name are more likely to be caught.
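
For illustration, a hedged sketch of passing `svd_alg` explicitly now that the package no longer sets its own default. Here `"divide_and_conquer"` stands in for whichever SVD algorithm your backend supports, and `H`/`psi0` are the MPO and MPS from the example above:

```julia
using ITensorTDVP: ITensorTDVP

# Explicitly choose the SVD algorithm instead of relying on a package default;
# omitting svd_alg lets the `svd` in ITensors/NDTensors pick one.
energy, psi = ITensorTDVP.dmrg(
  H, psi0; nsweeps=10, maxdim=100, cutoff=1e-6, svd_alg="divide_and_conquer"
)
```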

## About

This package is effectively a generalization of the DMRG code in [ITensors.jl](https://github.com/ITensor/ITensors.jl), using the MPS/MPO types from that package. It provides a general MPS "solver" interface which allows us to implement a variety of MPS/MPO optimization/solver functionality like DMRG (`ITensorTDVP.dmrg`), TDVP (`ITensorTDVP.tdvp`), linear solving (`ITensorTDVP.linsolve`/`KrylovKit.linsolve`), DMRG-X (`ITensorTDVP.dmrg_x`), etc., while sharing most of the code across those different functions. Therefore, it effectively supersedes the DMRG functionality in ITensors.jl (`dmrg`), and provides its own `ITensorTDVP.dmrg` function that is essentially the same as the `dmrg` function from ITensors.jl (though for now it only outputs the state, while `ITensors.dmrg` outputs the energy and the state; we will likely make the interface more similar to `ITensors.dmrg` in future versions of the code). This package is fairly stable and appropriate for general use. The primary missing feature is a lack of modern subspace expansion tools for methods like TDVP and 1-site DMRG. However, 2-site TDVP or TEBD is often sufficient for performing subspace expansion (except when [it's not](https://arxiv.org/abs/2005.06104)).
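
As a small illustration of the shared interface described above, the TDVP solver acts on the same MPS/MPO types as the DMRG example earlier in this README. A sketch only: `H` and `psi0` are the MPO and MPS from that example, and the call pattern follows how `tdvp` is used in the package's examples:

```julia
using ITensorTDVP: tdvp

# Approximately apply exp(-1.0 * H) to psi0 using the time dependent
# variational principle, producing a new MPS with the requested truncation.
phi = tdvp(H, -1.0, psi0; nsweeps=10, maxdim=100, cutoff=1e-8)
```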

However, note that future developments, including modern subspace expansion tools, are happening in our next-generation tensor network library [ITensorNetworks.jl](https://github.com/mtfishman/ITensorNetworks.jl). The goal of that package is to provide contraction, optimization, and evolution tools for general tensor networks, as well as methods like DMRG, TDVP, and linear solving for tree tensor networks, with the eventual goal of replacing this package, which is limited to solvers for MPS/MPO (linear/path graph) tensor networks. However, ITensorNetworks.jl is under heavy development and is _not_ meant for general usage at the moment, except for those who are brave enough to handle missing features and breaking interfaces. In short, if you want stable and reliable code and need MPS-based TDVP or linear solving, you should use this package for the time being.
10 changes: 2 additions & 8 deletions examples/01_tdvp.jl
@@ -1,5 +1,4 @@
using ITensors: ITensors, MPO, OpSum, inner, randomMPS, siteinds
using ITensorTDVP: ITensorTDVP, tdvp
using ITensorMPS: MPO, OpSum, dmrg, inner, randomMPS, siteinds, tdvp

function main()
n = 10
@@ -31,16 +30,11 @@ function main()
cutoff=1e-10,
outputlevel=1,
)

@show inner(ϕ', H, ϕ) / inner(ϕ, ϕ)

e2, ϕ2 = ITensors.dmrg(H, ψ; nsweeps=10, maxdim=20, cutoff=1e-10)

e2, ϕ2 = dmrg(H, ψ; nsweeps=10, maxdim=20, cutoff=1e-10)
@show inner(ϕ2', H, ϕ2) / inner(ϕ2, ϕ2), e2

e3, ϕ3 = ITensorTDVP.dmrg(H, ψ; nsweeps=10, maxdim=20, cutoff=1e-10, outputlevel=1)

@show inner(ϕ3', H, ϕ3) / inner(ϕ3, ϕ3), e3
return nothing
end

3 changes: 1 addition & 2 deletions examples/02_dmrg-x.jl
@@ -1,5 +1,4 @@
using ITensors: MPO, MPS, OpSum, inner, siteinds
using ITensorTDVP: dmrg_x
using ITensorMPS: MPO, MPS, OpSum, dmrg_x, inner, siteinds
using Random: Random

function main()
2 changes: 1 addition & 1 deletion examples/03_models.jl
@@ -1,4 +1,4 @@
using ITensors: OpSum
using ITensorMPS: OpSum

function heisenberg(n; J=1.0, J2=0.0)
ℋ = OpSum()
11 changes: 9 additions & 2 deletions examples/03_solvers.jl
@@ -1,8 +1,15 @@
using ITensors: ITensor
using ITensorTDVP: TimeDependentSum, to_vec
using ITensors: ITensor, array, inds, itensor
using ITensorMPS: TimeDependentSum
using KrylovKit: exponentiate
using OrdinaryDiffEq: ODEProblem, Tsit5, solve

function to_vec(x::ITensor)
function to_itensor(x_vec)
return itensor(x_vec, inds(x))
end
return vec(array(x)), to_itensor
end

function ode_solver(
H::TimeDependentSum,
time_step,
4 changes: 2 additions & 2 deletions examples/03_tdvp_time_dependent.jl
@@ -1,5 +1,5 @@
using ITensors: MPO, MPS, @disable_warn_order, inner, randomMPS, siteinds
using ITensorTDVP: tdvp
using ITensors: @disable_warn_order
using ITensorMPS: MPO, MPS, inner, randomMPS, siteinds, tdvp
using LinearAlgebra: norm
using Random: Random

3 changes: 1 addition & 2 deletions examples/04_tdvp_observers.jl
@@ -1,5 +1,4 @@
using ITensors: MPO, MPS, OpSum, expect, siteinds
using ITensorTDVP: tdvp
using ITensorMPS: MPO, MPS, OpSum, expect, inner, siteinds, tdvp
using Observers: observer

function main()
1 change: 1 addition & 0 deletions examples/Project.toml
@@ -2,5 +2,6 @@
ITensorTDVP = "25707e16-a4db-4a07-99d9-4d67b7af0342"
ITensors = "9136182c-28ba-11e9-034c-db9fb085ebd5"
KrylovKit = "0b1a1467-8014-51b9-945f-bf0ae24f4b77"
Observers = "338f10d5-c7f1-4033-a7d1-f9dec39bcaa0"
OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
1 change: 1 addition & 0 deletions test/Project.toml
@@ -1,4 +1,5 @@
[deps]
ITensorMPS = "0d1a4710-d33b-49a5-8f18-73bdf49b47e2"
ITensorTDVP = "25707e16-a4db-4a07-99d9-4d67b7af0342"
ITensors = "9136182c-28ba-11e9-034c-db9fb085ebd5"
KrylovKit = "0b1a1467-8014-51b9-945f-bf0ae24f4b77"
2 changes: 1 addition & 1 deletion test/test_contract_mpo.jl
@@ -52,7 +52,7 @@ using Test: @test, @testset
truncate!(psi_guess; maxdim=2)
Hpsi = apply(H, psi; alg="fit", nsweeps=4, init_mps=psi_guess)
@test ITensors.scalartype(Hpsi) == elt
@test inner(psit, Hpsi) ≈ inner(psit, H, psi) rtol = 3 * eps(real(elt))
@test inner(psit, Hpsi) ≈ inner(psit, H, psi) rtol = 20 * eps(real(elt))
# Test with nsite=1
Hpsi_guess = apply(H, psi; alg="naive", cutoff=1e-4)
Hpsi = apply(H, psi; alg="fit", init_mps=Hpsi_guess, nsite=1, nsweeps=2)

2 comments on commit 21db181

@mtfishman
Member Author


@JuliaRegistrator


Registration pull request created: JuliaRegistries/General/106496

Tip: Release Notes

Did you know you can add release notes too? Just add Markdown-formatted text underneath the comment after the text "Release notes:" and it will be added to the registry PR, and if TagBot is installed it will also be added to the release that TagBot creates. For example:

@JuliaRegistrator register

Release notes:

## Breaking changes

- blah

To add them here just re-invoke and the PR will be updated.

Tagging

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the GitHub interface, or via:

git tag -a v0.3.1 -m "<description of version>" 21db181f232d79b7fc5a881c7f4c24c8f05c0f2a
git push origin v0.3.1
