
Conversation

@pbelevich
Contributor

@pbelevich pbelevich commented Nov 19, 2019

Stack from ghstack:

Differential Revision: D18594723

[ghstack-poisoned]
@pbelevich pbelevich requested review from zou3519 and removed request for ebetica and goldsborough November 19, 2019 16:23
zdevito pushed a commit to zdevito/ATen that referenced this pull request Nov 20, 2019
Summary: Pull Request resolved: pytorch/pytorch#30083

Test Plan: Imported from OSS

Differential Revision: D18594723

Pulled By: pbelevich

fbshipit-source-id: 5970e0aa6ef8994e9c4a741784fd053383aaceb7
@facebook-github-bot
Contributor

@pbelevich merged this pull request in cc81769.


@jjlilley

Wondering, I see the following when I run tests on caffe2/tests/...

File "/data/users/jeremyl/fbsource/fbcode/buck-out/dev/gen/caffe2/test/rpc_fork#binary,link-tree/torch/_torch_docs.py", line 2415, in
add_docstr(torch.isfinite,
AttributeError: module 'torch' has no attribute 'isfinite'

Is this related?

@zou3519
Contributor

zou3519 commented Nov 20, 2019

cc @ailzhang the xla CI was red on this and it seems to be failing on master in the same way, do you know what could be up here?

@zou3519
Contributor

zou3519 commented Nov 20, 2019

> Wondering, I see the following when I run tests on caffe2/tests/...
>
> File "/data/users/jeremyl/fbsource/fbcode/buck-out/dev/gen/caffe2/test/rpc_fork#binary,link-tree/torch/_torch_docs.py", line 2415, in
> add_docstr(torch.isfinite,
> AttributeError: module 'torch' has no attribute 'isfinite'
>
> Is this related?

@jjlilley A clean rebuild might fix the problem; the Python bindings for isfinite are now autogenerated.
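For reference, `torch.isfinite` applies the same elementwise predicate that the standard library's `math.isfinite` applies to scalars: a value is finite iff it is neither infinite nor NaN. A minimal scalar sketch (using `math.isfinite` so it runs without a PyTorch build):

```python
import math

# torch.isfinite reports, elementwise, whether each value is neither
# infinite nor NaN -- the same predicate math.isfinite applies to scalars.
values = [1.0, float("inf"), float("nan"), -2.5]
finite_mask = [math.isfinite(v) for v in values]
print(finite_mask)  # [True, False, False, True]
```

With a fresh build, `torch.isfinite(torch.tensor(values))` should produce the corresponding boolean tensor.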

@jjlilley

Ah, thanks, I should have tried that earlier.

@ailzhang
Contributor

Thanks @zou3519! @mruberry has temporarily disabled this test on the XLA side, and we're investigating the root cause.
@pbelevich Ideally a PR that breaks an XLA test should notify us before it's merged ;). The easiest way is to open an issue in pytorch/xla and link the broken PR; that way we'll have more time to prepare a fix. Thanks!
