
Conversation

ngimel (Collaborator) commented on May 15, 2019

Fix for #20499

pytorchbot added labels on May 15, 2019: module: cuda (Related to torch.cuda, and CUDA support in general), module: nn (Related to torch.nn), module: operators.
ngimel requested a review from ezyang on May 15, 2019 at 17:17.

ezyang (Contributor) commented on May 16, 2019

I fixed lint for you. If you want to run lint locally in the future, https://github.com/pytorch/pytorch/wiki/Lint-as-you-type has some instructions.
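For anyone landing here later, a minimal sketch of what running the Python lint locally might look like is below; this assumes flake8 is installed and the checkout is based on master, and the wiki page above remains the actual guide (it also covers editor integration).

```python
# Minimal sketch (not the official tooling): lint only the Python files that
# changed relative to master, using the repository's own flake8 configuration.
import subprocess

# Collect the .py files that differ from master on the current branch,
# skipping files that were deleted on the branch.
changed = subprocess.run(
    ["git", "diff", "--name-only", "--diff-filter=d", "master", "--", "*.py"],
    capture_output=True, text=True, check=True,
).stdout.split()

if changed:
    # flake8 picks up the repo's config (setup.cfg / .flake8) automatically.
    subprocess.run(["flake8", *changed], check=False)
else:
    print("No Python files changed relative to master.")
```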

facebook-github-bot (Contributor) left a comment

@ezyang has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

ngimel (Collaborator, Author) commented on May 16, 2019

> I fixed lint for you. If you want to run lint locally in the future, https://github.com/pytorch/pytorch/wiki/Lint-as-you-type has some instructions.

Oops, sorry :-(

ezyang (Contributor) commented on May 16, 2019

Nothing to be sorry about ;) It just means it'll take longer to land.

zdevito pushed a commit to zdevito/ATen that referenced this pull request on May 16, 2019:
Summary:
Fix for #20499
Pull Request resolved: pytorch/pytorch#20541

Differential Revision: D15372461

Pulled By: ezyang

fbshipit-source-id: cdc237a98244515a573216a6dac4826261c973f9
facebook-github-bot (Contributor) commented

@ezyang merged this pull request in 66c6133.

Labels

Merged, module: cuda (Related to torch.cuda, and CUDA support in general), module: nn (Related to torch.nn), open source
