
Conversation

@dreiss (Contributor) commented Apr 16, 2020

Stack from ghstack:

Summary:
This allows bundling inputs that are large uniform buffers in
channels-last memory format.

Test Plan:
Unit test.

Differential Revision: [D21142660](https://our.internmc.facebook.com/intern/diff/D21142660)
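For context on the feature: "channels-last" stores an NCHW tensor with the channel dimension innermost in memory (NHWC order), which changes the strides but not the logical shape. A minimal pure-Python sketch of the two stride layouts (illustrative only, not the PyTorch implementation):

```python
def contiguous_strides(shape):
    # Row-major ("contiguous" NCHW) strides: the last dimension varies fastest.
    strides = [1] * len(shape)
    for i in range(len(shape) - 2, -1, -1):
        strides[i] = strides[i + 1] * shape[i + 1]
    return strides

def channels_last_strides(shape):
    # Channels-last strides for an (N, C, H, W) shape: C varies fastest,
    # then W, then H, then N -- i.e. the data is laid out as NHWC.
    n, c, h, w = shape
    return [h * w * c, 1, w * c, c]

print(contiguous_strides((1, 3, 4, 4)))     # [48, 16, 4, 1]
print(channels_last_strides((1, 3, 4, 4)))  # [48, 1, 12, 3]
```

These match the strides PyTorch reports for a contiguous tensor versus one converted with `memory_format=torch.channels_last`.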

@dr-ci (bot) commented Apr 16, 2020

💊 Build failures summary and remediations

As of commit 05af798 (more details on the Dr. CI page):


  • 2/2 failures introduced in this PR

🕵️ 2 new failures recognized by patterns

The following build failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_linux_xenial_py3_6_gcc5_4_build (1/2)

Step: "Build" (full log | pattern match details | 🔁 rerun)

Automatic merge failed; fix conflicts and then commit the result.
CONFLICT (add/add): Merge conflict in .jenkins/pytorch/test.sh 
Auto-merging .jenkins/pytorch/test.sh 
CONFLICT (add/add): Merge conflict in .github/workflows/lint.yml 
Auto-merging .github/workflows/lint.yml 
CONFLICT (add/add): Merge conflict in .circleci/verbatim-sources/workflows-docker-builder.yml 
Auto-merging .circleci/verbatim-sources/workflows-docker-builder.yml 
CONFLICT (add/add): Merge conflict in .circleci/docker/build.sh 
Auto-merging .circleci/docker/build.sh 
CONFLICT (add/add): Merge conflict in .circleci/config.yml 
Auto-merging .circleci/config.yml 
Automatic merge failed; fix conflicts and then commit the result. 

See CircleCI build pytorch_xla_linux_bionic_py3_6_clang9_build (2/2)

Step: "Build" (full log | pattern match details | 🔁 rerun)

Apr 22 05:42:52 torch_xla/csrc/ops/ops.cpp:168:37: error: no member named 'hardsigmoid' in namespace 'c10::aten'
Apr 22 05:42:52   return GenericOp(OpKind(at::aten::hardsigmoid), {input}, input.shape(),
Apr 22 05:42:52                           ~~~~~~~~~~^
Apr 22 05:42:52 torch_xla/csrc/ops/ops.cpp:179:37: error: no member named 'hardsigmoid_backward' in namespace 'c10::aten'
Apr 22 05:42:52   return GenericOp(OpKind(at::aten::hardsigmoid_backward), {grad_output, input},
Apr 22 05:42:52                           ~~~~~~~~~~^
Apr 22 05:42:58 2 errors generated.
Apr 22 05:42:58 /opt/conda/lib/python3.6/site-packages/torch/utils/cpp_extension.py:306: UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja. Falling back to using the slow distutils backend.
Apr 22 05:42:58 error: command 'clang-9' failed with exit status 1


self.assertEqual(inflated[4][0].shape, (1 << 16,))
self.assertAlmostEqual(inflated[4][0].mean().item(), 0, delta=0.025)
self.assertAlmostEqual(inflated[4][0].std().item(), 1, delta=0.02)
self.assertEqual(inflated[5][0].shape, (1 << 16,))
Super nit: use -1 or len(samples)
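The nit is about indexing the last bundled sample by position rather than with a hard-coded `5`. A minimal sketch of the equivalence, using a placeholder list standing in for the inflated samples:

```python
# Placeholder list standing in for the inflated bundled inputs;
# the real objects are tensors, but only the indexing matters here.
samples = ["s0", "s1", "s2", "s3", "s4", "s5"]

# Hard-coded index into the last element, as in the test above:
last_hardcoded = samples[5]

# Equivalent, but stays correct if more samples are bundled later:
assert samples[-1] == last_hardcoded
assert samples[len(samples) - 1] == last_hardcoded
```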

@facebook-github-bot

@dreiss merged this pull request in 41ea7f2.

@facebook-github-bot facebook-github-bot deleted the gh/dreiss/48/head branch June 30, 2020 14:18
