Description
Unfortunately, the following code snippet leads to an endless loop.
Is there a way to perform backward computations on functions inside a backpropagation step? This would be useful for implementing quantized convolutions (e.g. as in https://arxiv.org/pdf/1612.01064.pdf).
import torch

class exampleFct(torch.autograd.Function):
    def forward(self, dataIn):
        self.save_for_backward(dataIn)
        print("forward...")
        return dataIn**2

    def backward(self, grad_output):
        dataIn, = self.saved_tensors
        x_in_back = torch.autograd.Variable(torch.Tensor([2]), requires_grad=True)
        y = x_in_back**2
        print("backward...")
        y.backward(torch.Tensor([1]))  # this nested backward call never returns
        print("unreachable")
        print(x_in_back.grad.data)
        return grad_output * 2 * dataIn

x = torch.autograd.Variable(torch.Tensor([[3, 4]]), requires_grad=True)
m = exampleFct()
m(x).backward(torch.Tensor([1, 1]))
print(x.grad.data)
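
For comparison, here is a minimal sketch of how a nested backward can run to completion in current PyTorch, assuming the new-style static Function API: gradient tracking is disabled inside backward() by default, and torch.enable_grad() re-enables it so the inner graph is recorded. The class name Square and the inner variable names are hypothetical stand-ins, not part of the original snippet.

import torch

class Square(torch.autograd.Function):  # hypothetical stand-in for exampleFct
    @staticmethod
    def forward(ctx, data_in):
        ctx.save_for_backward(data_in)
        return data_in ** 2

    @staticmethod
    def backward(ctx, grad_output):
        data_in, = ctx.saved_tensors
        # Grad mode is off inside backward(); re-enable it so the inner
        # graph is built and the nested backward pass can run.
        with torch.enable_grad():
            x_inner = torch.tensor([2.0], requires_grad=True)
            y = x_inner ** 2
            y.backward(torch.tensor([1.0]))  # completes instead of hanging
            print(x_inner.grad)              # tensor([4.])
        return grad_output * 2 * data_in

x = torch.tensor([[3.0, 4.0]], requires_grad=True)
Square.apply(x).backward(torch.tensor([[1.0, 1.0]]))
print(x.grad)  # tensor([[6., 8.]])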