
Conversation

@colesbury
Member

The operations used by PixelShuffle are differentiable (via autograd), so we only have to implement the forward pass in the Module and get the backward automatically.

cc @alykhantejani
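
For reference, a minimal sketch of how such an autograd-friendly forward pass can be written using only view/permute operations (the function name and shapes here are illustrative, not necessarily the exact code in this PR):

    import torch

    def pixel_shuffle(input, upscale_factor):
        # Rearranges (B, C*r*r, H, W) -> (B, C, H*r, W*r) using only
        # differentiable tensor ops, so autograd derives the backward pass.
        batch, channels, height, width = input.size()
        out_channels = channels // (upscale_factor ** 2)
        # Split the channel dimension into (C, r, r).
        x = input.view(batch, out_channels, upscale_factor, upscale_factor, height, width)
        # Interleave the two r dims with H and W: (B, C, H, r, W, r).
        x = x.permute(0, 1, 4, 2, 5, 3).contiguous()
        # Collapse to the upscaled spatial resolution.
        return x.view(batch, out_channels, height * upscale_factor, width * upscale_factor)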

@soumith merged commit 235d540 into pytorch:master on Dec 30, 2016
@apaszke
Contributor

apaszke commented Dec 30, 2016

We should put the implementation in functional, not in the module 😕
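
As a sketch of the split being suggested here (assuming the permutation logic is exposed as a `pixel_shuffle` function in `torch.nn.functional`), the module would reduce to a thin wrapper:

    import torch.nn as nn
    import torch.nn.functional as F

    class PixelShuffle(nn.Module):
        # The module only stores configuration and delegates the actual
        # computation to the functional interface.
        def __init__(self, upscale_factor):
            super(PixelShuffle, self).__init__()
            self.upscale_factor = upscale_factor

        def forward(self, input):
            return F.pixel_shuffle(input, self.upscale_factor)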

@alykhantejani
Contributor

Ah cool, I did not know that autograd would handle the permutations. As long as the following test passes:

        input = Variable(torch.Tensor(batch_size, channels, height, width).uniform_(), requires_grad=True)
        ps = nn.PixelShuffle(upscale_factor)
        output = ps(input)
        output.backward(output.data)
        self.assertEqual(input.data, input.grad)

i.e. that input == backward(forward(input)) holds, then it LGTM.

This test is already in test_nn, so it should work fine with autograd. Nice spot @colesbury
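
The check works because PixelShuffle is a pure permutation of elements: its Jacobian is a permutation matrix, so backpropagating the forward output as the incoming gradient applies the inverse permutation and recovers the input. A standalone version of the same check (written against the current tensor API rather than Variable, with illustrative sizes) might look like this:

    import torch
    import torch.nn as nn

    batch_size, channels, height, width, upscale_factor = 2, 8, 4, 4, 2
    input = torch.rand(batch_size, channels, height, width, requires_grad=True)
    ps = nn.PixelShuffle(upscale_factor)
    output = ps(input)
    # Feed the forward output back in as the gradient; the backward pass
    # applies the inverse permutation, so input.grad should equal input.
    output.backward(output.detach())
    assert torch.equal(input.grad, input.detach())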
