Perform autograd directly on Tensor, not Variable #1384

@adamlerer

Description

What does Variable do? Historically, it meant "wrapper for a tensor that can hold a graph and grad". But in practice, a user mostly interacts with Variables for which requires_grad=False, so the Variable holds neither a graph nor a grad. It's just a confusing wrapper to keep track of, with the additional confusion that you can't mix Tensor and Variable in torch ops (even though they're conceptually the same thing).

I think Tensor can directly have a requires_grad attribute and can optionally hold pointers to the grad and the graph. The advantages to the user are that they don't need to understand what a Variable is, and they don't need to worry about calling Variable and .data at all the right places.
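A minimal sketch of the idea, in plain Python (this is a hypothetical illustration, not the real torch internals; the class and attribute layout here are assumptions):

```python
# Hypothetical sketch: a Tensor that carries its own autograd state,
# so no separate wrapper type is needed.
class Tensor:
    def __init__(self, values, requires_grad=False):
        self.values = values          # underlying storage (simplified)
        self.requires_grad = requires_grad
        self.grad = None              # populated only if backward() runs
        self.grad_fn = None           # pointer into the autograd graph, if any

# A plain tensor: no graph, no grad, nothing extra to unwrap.
t = Tensor([1.0, 2.0])
assert t.requires_grad is False and t.grad is None and t.grad_fn is None

# A leaf that wants gradients: same type, one flag flipped.
leaf = Tensor([1.0, 2.0], requires_grad=True)
assert leaf.requires_grad is True
```

The point is that both objects are the same type, so ops never need to distinguish wrapped from unwrapped values.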

We can maybe do this while maintaining backwards compatibility with Variables (Variable can just be a thin wrapper around Tensor that does nothing, and Tensor.data can just point at self).
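The back-compat story could look something like this sketch (again hypothetical, pure Python; `Variable` and `.data` here are stand-ins for the real API, not its implementation):

```python
# Hypothetical back-compat sketch: Variable becomes a no-op constructor,
# and .data on a Tensor just returns the tensor itself.
class Tensor:
    def __init__(self, values, requires_grad=False):
        self.values = values
        self.requires_grad = requires_grad

    @property
    def data(self):
        # Old code called .data to unwrap a Variable; now it's the identity.
        return self

def Variable(tensor, requires_grad=False):
    # Thin compatibility shim: returns the tensor it was given,
    # just setting the flag old call sites used to pass.
    tensor.requires_grad = requires_grad
    return tensor

t = Tensor([1.0])
v = Variable(t)
assert v is t          # wrapping is a no-op
assert v.data is t     # unwrapping is a no-op too
```

So legacy code that wraps with `Variable(...)` and unwraps with `.data` keeps working, while both calls become harmless identities.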
