
Conversation

@tunz (Contributor) commented Mar 13, 2019

Currently, we cannot run a checkpointed function with a None argument.

out = torch.utils.checkpoint.checkpoint(run_fn, input_var, None)
  File "/home/tunz/anaconda3/envs/torchdev/lib/python3.7/site-packages/torch/utils/checkpoint.py", line 14, in detach_variable
    x = inp.detach()
AttributeError: 'NoneType' object has no attribute 'detach'

This PR makes the checkpoint function safely handle None arguments.

@ezyang (Contributor) left a comment:


okey dokey

@facebook-github-bot (Contributor) left a comment:


@ezyang is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.


def check_backward_validity(inputs):
-    if not any(inp.requires_grad for inp in inputs):
+    if not any(inp.requires_grad for inp in inputs if isinstance(inp, torch.Tensor)):
@oliland left a comment:


I'm receiving the warning when I pass in a list of tensors that have gradients. Propose:

if not any(inp.requires_grad for inp in inputs if inp is not None):

Contributor


@oliland send us a PR?

