
Conversation

@daquexian daquexian commented Oct 3, 2018

Obviously, the gradients of the conv weight and the conv input do not depend on the bias, yet the original convXd_input and convXd_weight methods accept a bias parameter. What's more, while the doc says bias should have the shape (out_channels,), one gets a RuntimeError whenever bias != None and in_channels != out_channels, because the weight of a transposed conv has the shape (in_channels, out_channels, kH, kW) while the weight of a vanilla conv has the shape (out_channels, in_channels, kH, kW):

RuntimeError: Given transposed=1, weight of size [channel1, channel2, kH, kW], expected bias to be 1-dimensional with channel2 elements, but got bias of size [channel1] instead
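A minimal sketch of why that error fires. The function names below are hypothetical illustrations, not PyTorch internals: the point is only that the out_channels axis sits at a different index depending on whether the conv is transposed, so a bias sized from the wrong axis fails the check.

```python
def expected_bias_len(weight_shape, transposed):
    """Expected 1-D bias length for a conv, given its weight shape.

    Vanilla conv weight:    (out_channels, in_channels, kH, kW) -> axis 0
    Transposed conv weight: (in_channels, out_channels, kH, kW) -> axis 1
    """
    return weight_shape[1] if transposed else weight_shape[0]

def check_bias(weight_shape, bias_len, transposed):
    """Hypothetical shape check mirroring the RuntimeError quoted above."""
    expected = expected_bias_len(weight_shape, transposed)
    if bias_len != expected:
        raise RuntimeError(
            f"Given transposed={int(transposed)}, weight of size "
            f"{list(weight_shape)}, expected bias to be 1-dimensional with "
            f"{expected} elements, but got bias of size [{bias_len}] instead"
        )
```

For example, with a transposed-conv weight of shape (4, 8, 3, 3), a bias of length 4 (sized from in_channels, as the buggy code did) trips the check, while a bias of length 8 (out_channels) passes.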

@daquexian daquexian changed the title Fix conv grad bug when bias != None Fix bug in grad.py when conv bias != None Oct 3, 2018
@ezyang ezyang added the module: bc-breaking Related to a BC-breaking change label Oct 5, 2018

ezyang commented Oct 5, 2018

Thank you, nice catch. This is technically BC-breaking but I doubt anyone will notice.

@facebook-github-bot facebook-github-bot left a comment


ezyang is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@daquexian

@ezyang Thanks! I agree that maybe no one will notice :D

@bbartoldson
I noticed but wasn't sure enough about its being incorrect to say something. I appreciate the fix! Thank you!


Labels

module: bc-breaking Related to a BC-breaking change open source


4 participants