
Deprecations tracking issue #55953

PyTorch has numerous operators in its public API that need to be removed or changed in a backwards-compatibility-breaking (BC-breaking) way. This tracking issue provides guidance on implementing these changes and tracks deprecation tasks.

Staging deprecations

Deprecations are disruptive. When a behavior is deprecated, a PyTorch program will no longer function as it did, and users must update their programs to the new behavior, which is often frustrating and time consuming. Deprecations are also necessary, however, to keep PyTorch modern and flexible. Staging a deprecation across multiple releases is how PyTorch minimizes the disruption of these changes while remaining flexible.

A typical deprecation process occurs over three releases:

  • in the first release, a warning is thrown the first time the deprecated behavior occurs (use TORCH_WARN_ONCE; see the sketch after this list)
    • the warning should tell users how to avoid the deprecated behavior, how to adopt the new behavior (if any), and how to keep the current behavior in a non-deprecated way (if possible)
    • PyTorch's own code should be updated to use the alternative behavior(s) where required; PyTorch should not throw warnings when performing non-deprecated operations
  • in the second release, the deprecated behavior is disabled
    • the previous warning should be updated to an error
    • disabling the behavior for a full release is important to prevent "silent correctness" issues, where the behavior of a program changes unexpectedly (PyTorch does not assume that warnings are sufficient to prevent this)
  • if there is a new behavior, it is enabled in the third release
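
As a minimal sketch of the first two steps, consider a hypothetical operator foo (not a real PyTorch operator) whose integer-input behavior is being deprecated; TORCH_WARN_ONCE and TORCH_CHECK are the usual c10 macros:

```cpp
#include <ATen/ATen.h>
#include <c10/util/Exception.h> // TORCH_WARN_ONCE, TORCH_CHECK

// Release N: warn once when the deprecated behavior is exercised.
at::Tensor foo(const at::Tensor& self) {
  if (c10::isIntegralType(self.scalar_type(), /*includeBool=*/true)) {
    TORCH_WARN_ONCE(
        "foo on integer tensors is deprecated and will raise an error in a "
        "future release. Cast the input to a floating point dtype, e.g. "
        "tensor.to(torch.float32), to keep the current behavior.");
  }
  // ... existing implementation ...
  return self;
}

// Release N+1: the same condition becomes a hard error before any new
// behavior ships, preventing silent changes in program meaning.
at::Tensor foo_next_release(const at::Tensor& self) {
  TORCH_CHECK(
      !c10::isIntegralType(self.scalar_type(), /*includeBool=*/true),
      "foo on integer tensors is no longer supported. Cast the input to a "
      "floating point dtype, e.g. tensor.to(torch.float32).");
  // ... implementation of the new behavior ...
  return self;
}
```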

For example, torch.div() had its behavior changed. In PyTorch 1.4 torch.div() acted like division in C++, where the result of integer division is rounded towards zero. In PyTorch 1.5 a warning was added to the documentation and the code; it explained the future behavior, how to adopt the future behavior, and how to use the current behavior in a non-deprecated way. In PyTorch 1.6 the warning became an error, and in PyTorch 1.7 the behavior was changed.
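
To make the change concrete, here is a hedged sketch of the observable difference using the C++ frontend; the rounding_mode argument shown for keeping truncating division is the one available in current PyTorch, not necessarily the alternative the 1.5-era warning recommended:

```cpp
#include <torch/torch.h>
#include <iostream>

int main() {
  auto a = torch::tensor(5, torch::kLong);
  auto b = torch::tensor(2, torch::kLong);

  // PyTorch 1.4: integer division rounded towards zero -> 2
  // PyTorch 1.7+: torch.div always performs true division -> 2.5
  std::cout << torch::div(a, b) << "\n";

  // Non-deprecated way to keep truncating integer division today:
  std::cout << torch::div(a, b, /*rounding_mode=*/"trunc") << "\n";
}
```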

Preserving the behavior of serialized TorchScript

Some PyTorch models are scripted (using TorchScript) and then serialized. The serialization format contains version information, so for these models it's possible to preserve the behavior they had when they were serialized by writing a "versioned symbol" or "adapter." See the note "Versioned Symbols" in the source for details:

// Note [Versioned Symbols]

Returning to the deprecation of torch.div(), multiple adapters were written to preserve the behavior of PyTorch models serialized prior to PyTorch 1.6. These adapters live in the TorchScript frontend; the Tensor–Tensor variant begins:

auto div_tensor = R"SCRIPT(
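
The line above is only the opening of the adapter. As a hedged reconstruction (the helper name div_0_3 and the body are illustrative, not the verbatim upstream source), such an adapter defines a TorchScript function with the old semantics, and calls to aten::div in sufficiently old serialized models are remapped to it:

```cpp
// Sketch of a versioned-symbol adapter for aten::div. When a serialized
// model's file-format version predates the change, the runtime substitutes
// this TorchScript helper for aten::div.
auto div_tensor = R"SCRIPT(
def div_0_3(self: Tensor, other: Tensor) -> Tensor:
  if self.is_floating_point() or other.is_floating_point():
    return self.true_divide(other)
  # stand-in for the pre-1.6 integer behavior; the real body may differ
  return self.floor_divide(other)
)SCRIPT";
```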

Writing an adapter isn't required for every deprecation, but deprecations that impact many serialized models may warrant one to minimize their disruption.

Current deprecations

cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser @anjali411
