
Conversation


@raghuramank100 raghuramank100 commented Oct 21, 2019

Fixes #28375

Stack from ghstack:

Differential Revision: D18047293

    if len(module._modules) == 0 and not isinstance(module, torch.nn.Sequential):
        # observer and hook will be gone after we swap the module
        module.add_module('activation_post_process', module.qconfig.activation())
        module.register_forward_hook(_observer_forward_hook)
The indentation might be off?

Contributor Author

Thanks, fixed now

raghuramank100 pushed a commit that referenced this pull request Oct 21, 2019
raghuramank100 pushed a commit that referenced this pull request Oct 22, 2019
@facebook-github-bot

This pull request has been merged in 94757e0.

@facebook-github-bot facebook-github-bot deleted the gh/raghuraman_k/9/head branch October 28, 2019 22:18
thiagocrepaldi pushed a commit to thiagocrepaldi/pytorch that referenced this pull request Feb 4, 2020
Summary:
Pull Request resolved: pytorch#28384

ghstack-source-id: 92340259

Test Plan:
buck test caffe2/test:quantization -- 'test_fusion_sequential_model_train \(test_quantization\.FusionTest\)' --print-passing-details

buck test caffe2/test:quantization -- 'test_fusion_sequential_model_eval \(test_quantization\.FusionTest\)' --print-passing-details

Differential Revision: D18047293

fbshipit-source-id: 7e18b1aa76cc0fd26e8ee48a70c3a45688e73549
