Status: Closed
Labels: module: docs, triaged
Description
📚 The doc issue
Neither the torch.einsum doc page nor the torch.backends.opt_einsum page mentions the necessary / sufficient conditions to make the backend available.
Before I looked into it, I imagined that any of these things could be true (now I think they are all false):
- `opt_einsum` is a hard dependency of pytorch,
- the code in `opt_einsum` is just included into pytorch, so it's always available and it doesn't matter what package you have installed, but it might be out of date if `opt_einsum` does something new,
- the optimizations will only apply if you `import opt_einsum` in your program,
- maybe you even have to `import opt_einsum` before you load `torch`, or before any of your other imports load it?
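A quick way to test these hypotheses empirically (a sketch, assuming `torch.backends.opt_einsum.is_available()` behaves as its name suggests):

```python
import torch  # note: no "import opt_einsum" anywhere in this file

# If this prints True in an environment where opt_einsum is installed,
# and False in one where it isn't, that rules out both the "hard
# dependency" and the "bundled copy" hypotheses, and shows that no
# user-side import is required.
print(torch.backends.opt_einsum.is_available())
```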
Suggest a potential alternative/fix
Mention that:
- it's necessary to install the `opt_einsum` package yourself separately (perhaps link to https://optimized-einsum.readthedocs.io/en/stable/install.html),
- it doesn't matter whether you `import opt_einsum` in your code; pytorch will import it itself if it exists.
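As a sketch of what the docs could show (assuming the `torch.backends.opt_einsum` attributes `is_available()`, `enabled`, and `strategy`; treat the specific strategy values here as assumptions):

```python
import torch

a = torch.randn(8, 16, 32)
b = torch.randn(8, 32, 64)
c = torch.randn(8, 64, 8)

if torch.backends.opt_einsum.is_available():
    # opt_einsum was found, so torch.einsum computes an optimized
    # contraction order for expressions with 3+ operands.
    torch.backends.opt_einsum.strategy = "optimal"  # assumed values: "auto" (default), "greedy", "optimal"
else:
    # Without opt_einsum installed, torch.einsum falls back to
    # contracting operands strictly left to right.
    pass

out = torch.einsum("bij,bjk,bkl->bil", a, b, c)
print(out.shape)  # torch.Size([8, 16, 8])
```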
Minor bonus: the note in the einsum docs mentions `torch.backends.opt_einsum`; it would be convenient if that note were a link.