
torch.einsum docs don't mention that opt_einsum must be installed separately #127109

@bmillwood

Description

📚 The doc issue

Neither the torch.einsum doc page nor the torch.backends.opt_einsum page mentions the necessary/sufficient conditions for the backend to be available.

Before I looked into it, I imagined that any of these things could be true (now I think they are all false; see the check sketched after this list):

  • opt_einsum is a hard dependency of pytorch,
  • the opt_einsum code is bundled into pytorch, so it's always available and it doesn't matter what package you have installed, though it might be out of date if opt_einsum adds something new,
  • the optimizations only apply if you import opt_einsum in your program
    • maybe you even have to import opt_einsum before you import torch, or before any of your other imports load it?
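For what it's worth, here's how I'd now check which of these holds in a given environment (a minimal sketch; torch.backends.opt_einsum.is_available() reports whether torch could import opt_einsum):

```python
import torch

# True iff torch could import the separately installed opt_einsum package;
# no `import opt_einsum` in user code is needed.
print(torch.backends.opt_einsum.is_available())

# The backend is enabled by default, but it only has an effect when
# opt_einsum is actually installed.
print(torch.backends.opt_einsum.enabled)
print(torch.backends.opt_einsum.strategy)
```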

Suggest a potential alternative/fix

Mention that:

  • opt_einsum is an optional dependency: pytorch uses it only if it's installed in the same environment (e.g. pip install opt_einsum), and otherwise falls back to contracting the operands left to right
  • no explicit import of opt_einsum is needed in user code; torch detects and uses it automatically (see the sketch below)
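For example (a sketch; per the docs the path optimization only kicks in with at least three operands, and the call site is identical either way):

```python
import torch

a = torch.randn(16, 32)
b = torch.randn(32, 64)
c = torch.randn(64, 8)

# Identical call whether or not opt_einsum is installed; with it installed,
# torch picks an optimized contraction order for the three operands,
# otherwise it contracts left to right.
out = torch.einsum("ij,jk,kl->il", a, b, c)
print(out.shape)  # torch.Size([16, 8])
```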

Minor bonus: the note in the einsum docs mentions torch.backends.opt_einsum; it would be convenient if that note were a link.

cc @svekars @brycebortree

Metadata

Labels

module: docs — Related to our documentation, both in docs/ and docblocks
triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
