Call backward of function in backpropagation #1776

@magzHL

Description

Unfortunately, the following code snippet leads to an endless loop.
Is there a way to run backward computations on functions inside a backpropagation step?
This would be useful for implementing quantized convolutions (e.g. as in https://arxiv.org/pdf/1612.01064.pdf); see the sketches after the repro below.

import torch

class exampleFct(torch.autograd.Function):
    def forward(self, dataIn):
        self.save_for_backward(dataIn)
        print("forward...")
        return dataIn**2

    def backward(self, grad_output):
        dataIn, = self.saved_tensors
        x_in_back = torch.autograd.Variable(torch.Tensor([2]), requires_grad=True)
        y = x_in_back**2
        print("backward...")
        y.backward(torch.Tensor([1]))
        print("unreachable")
        print(x_in_back.grad.data)
        return grad_output*2*dataIn


x = torch.autograd.Variable(torch.Tensor([[3, 4]]), requires_grad=True)
m = exampleFct()
m(x).backward(torch.Tensor([1, 1]))
print(x.grad.data)
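
For reference, here is a minimal sketch of the same Function written against the newer (static-method) autograd.Function API, where gradient mode is disabled while backward() runs and can be re-enabled locally with torch.enable_grad(). This assumes a recent PyTorch version; the names ExampleFct and x_inner are just illustrative:

import torch

class ExampleFct(torch.autograd.Function):
    @staticmethod
    def forward(ctx, data_in):
        ctx.save_for_backward(data_in)
        return data_in ** 2

    @staticmethod
    def backward(ctx, grad_output):
        data_in, = ctx.saved_tensors
        # Grad mode is off while backward() runs; re-enable it to build
        # an inner graph and differentiate through it right here.
        with torch.enable_grad():
            x_inner = torch.tensor(2.0, requires_grad=True)
            y = x_inner ** 2
            inner_grad, = torch.autograd.grad(y, x_inner)
        print("inner gradient:", inner_grad)  # tensor(4.)
        return grad_output * 2 * data_in

x = torch.tensor([[3.0, 4.0]], requires_grad=True)
ExampleFct.apply(x).backward(torch.tensor([[1.0, 1.0]]))
print(x.grad)  # tensor([[6., 8.]])

Using torch.autograd.grad for the inner computation keeps the nested gradient out of any leaf .grad fields, so it does not interfere with the outer backward pass.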
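
For the quantized-convolution use case specifically, the usual trick from that line of work is a straight-through estimator, which needs no nested backward at all: quantize in forward and pass the gradient through unchanged in backward. A hypothetical, stripped-down sketch (TernaryQuant is an illustrative name; the paper's learned per-layer scaling factors are omitted):

import torch

class TernaryQuant(torch.autograd.Function):
    @staticmethod
    def forward(ctx, w, threshold):
        # Quantize weights to {-1, 0, +1}; nothing to save for backward.
        q = torch.zeros_like(w)
        q[w > threshold] = 1.0
        q[w < -threshold] = -1.0
        return q

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: treat the quantizer as the identity
        # and pass the incoming gradient through unchanged.
        # One return value per forward() input; None for threshold.
        return grad_output, None

w = torch.randn(3, requires_grad=True)
TernaryQuant.apply(w, 0.5).sum().backward()
print(w.grad)  # tensor([1., 1., 1.])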

Labels: todo (not as important as medium or high priority tasks, but we will work on these)
