torch.einsum docs don't mention that opt_einsum must be installed separately
#127109
Labels
module: docs — Related to our documentation, both in docs/ and docblocks
triaged — This issue has been looked at by a team member, triaged, and prioritized into an appropriate module
📚 The doc issue
Neither the torch.einsum doc page nor the torch.backends.opt_einsum page mentions the necessary / sufficient conditions to make the backend available.
Before I looked into it, I imagined that any of these things could be true (now I think they are all false):

- opt_einsum is a hard dependency of pytorch,
- opt_einsum is just included into pytorch, so it's always available and it doesn't matter what package you have installed, but it might be out of date if opt_einsum does something new,
- you need to import opt_einsum in your program,
- you need to import opt_einsum before you load torch, or before any of your other imports load it?

Suggest a potential alternative/fix
Mention that:

- you need to install the opt_einsum package yourself separately (perhaps link to https://optimized-einsum.readthedocs.io/en/stable/install.html ),
- you don't need to import opt_einsum in your code; pytorch will import it itself if it exists.

Minor bonus: the note in the einsum docs mentions torch.backends.opt_einsum; it would be convenient if that note were a link.

cc @svekars @brycebortree
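A minimal sketch of the check a user can run themselves, assuming only that backend availability comes down to whether the opt_einsum package is importable (with torch installed, torch.backends.opt_einsum.is_available() should report the same thing):

```python
import importlib.util

def opt_einsum_installed() -> bool:
    """Return True if the opt_einsum package is importable.

    PyTorch imports opt_einsum itself when the package is present, so
    the user never needs to `import opt_einsum` in their own code;
    installing the package separately is the only requirement.
    """
    return importlib.util.find_spec("opt_einsum") is not None

print(opt_einsum_installed())
```

If this returns False, installing the package (e.g. `pip install opt-einsum`) is what enables the backend; no import-order tricks are needed.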