
Conversation

@Kaixhin (Contributor) commented Oct 4, 2018

Fixes #12259
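
For context, a minimal sketch of the behavior this PR targets: before the change, BatchNorm's affine weight was drawn from a uniform distribution, and the fix initializes it to 1 instead (bias stays at 0). The snippet below is illustrative only; the function name and exact calls are assumptions based on the linked issue, not a verbatim copy of the diff.

```python
import torch
import torch.nn as nn

# Sketch of the initialization this PR changes (names are illustrative,
# not the exact diff): the affine weight (gamma) is set to 1 instead of
# being drawn from a uniform distribution; the bias (beta) stays at 0.
def reset_batchnorm_parameters(bn: nn.modules.batchnorm._BatchNorm) -> None:
    bn.reset_running_stats()
    if bn.affine:
        nn.init.ones_(bn.weight)   # previously: nn.init.uniform_(bn.weight)
        nn.init.zeros_(bn.bias)

bn = nn.BatchNorm2d(64)
reset_batchnorm_parameters(bn)
assert torch.all(bn.weight == 1) and torch.all(bn.bias == 0)
```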


@colesbury (Member) commented:
I think this is a good change. We have been manually initializing the weight to 1 in models like ResNet. It's silly to require users to remember to do that.

cc @soumith
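
For reference, the manual workaround mentioned above looks roughly like the loop below in model definitions such as torchvision's ResNet; with this PR it becomes unnecessary. This is a sketch of that pattern, not code from this repository.

```python
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 64, kernel_size=3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(64)

        # The boilerplate this PR makes redundant: walk the modules and
        # force every BatchNorm weight to 1 (and bias to 0) by hand.
        for m in self.modules():
            if isinstance(m, nn.BatchNorm2d):
                nn.init.constant_(m.weight, 1)
                nn.init.constant_(m.bias, 0)
```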

@colesbury (Member) commented:
@pytorchbot retest this please

@facebook-github-bot (Contributor) left a comment

colesbury has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@soumith (Contributor) commented Oct 4, 2018

sounds good to me. let's land it

@facebook-github-bot (Contributor) left a comment

@ssnl is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

bddppq added a commit to bddppq/pytorch that referenced this pull request Nov 9, 2018
This fixes master breakage introduced in pytorch#12325


Development

Successfully merging this pull request may close these issues.

Set Batchnorm weight scalar initialization to unit (not random uniform)
