with torch.enable_grad also works outside a no_grad context #19189

@HaleTom

Description

📚 Documentation

The torch.autograd.enable_grad documentation says:

Enables gradient calculation inside a no_grad context. This has no effect outside of no_grad.

This implies:

    torch.set_grad_enabled(False)
    with torch.enable_grad():
        ...  # Gradient tracking will NOT be enabled here (per the docs).
    torch.set_grad_enabled(True)

vs:

    with torch.no_grad():
        with torch.enable_grad():
            ...  # Gradient tracking IS enabled here.

However, the observed behaviour (on 1.0.1.post2) is:

import torch

x = torch.tensor([.1], requires_grad=True)

with torch.no_grad():
    with torch.enable_grad():
        y = x * 2
print(y.requires_grad)  # True (as expected)

with torch.set_grad_enabled(False):
    y = x * 2
print(y.requires_grad)  # False (as expected)

with torch.set_grad_enabled(False):
    with torch.enable_grad():
        y = x * 2
        print(y.requires_grad)  # True, but False expected from doc quote

Note that the last example is not run "inside a no_grad context", yet enable_grad still takes effect.
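
In other words, enable_grad behaves like an unconditional "force grad mode on, restore the previous mode on exit" guard. Here is a minimal sketch of that observed behaviour, written as a hypothetical re-implementation on top of the public torch.is_grad_enabled / torch.set_grad_enabled API (this is not PyTorch's actual implementation, just an illustration):

from contextlib import contextmanager

import torch

@contextmanager
def enable_grad_sketch():
    # Save whatever grad mode is currently in effect, force it on,
    # and restore the saved mode on exit. Nothing here inspects *how*
    # gradients were disabled, which matches the observed behaviour.
    prev = torch.is_grad_enabled()
    torch.set_grad_enabled(True)
    try:
        yield
    finally:
        torch.set_grad_enabled(prev)

Under this model it makes no difference whether the enclosing scope used no_grad(), set_grad_enabled(False), or nothing at all.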

Other prior art: PyTorch set_grad_enabled(False) vs with no_grad()

I'm assuming the documentation is incorrect, and it should say simply:

Enables gradient calculation.
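
For what it's worth, that reading also matches what happens when grad mode is switched off globally with the plain (non-context-manager) call; a quick sanity check in the same vein as the examples above, reusing x from the earlier snippet:

torch.set_grad_enabled(False)   # grad off globally, no context manager in sight
with torch.enable_grad():
    z = x * 2
print(z.requires_grad)  # True again: enable_grad does not care how grad was disabled
torch.set_grad_enabled(True)    # restore the default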

Would you accept a PR?

Labels: high priority, module: autograd, module: docs, small, triaged
