
Conversation

@XiaobingSuper
Collaborator

@XiaobingSuper XiaobingSuper commented Mar 6, 2023

Stack from ghstack (oldest at bottom):

For `torch.baddbmm(input, mat1, mat2, beta=0)`, when `beta` is zero, eager mode skips the `input * beta` term entirely (it always contributes zero; see https://pytorch.org/docs/stable/generated/torch.baddbmm.html?highlight=torch+baddbmm#torch.baddbmm), but Inductor does not, so it can produce a different result when `input` contains a `nan` or `inf` value:

```python
import torch

def fn_test(input, mat1, mat2):
    return torch.baddbmm(input, mat1, mat2, beta=0.0)

opt_fn = torch._dynamo.optimize("inductor")(fn_test)
a, b, c = [torch.rand((3, 2, 2)) for _ in range(3)]

real_out = fn_test(a, b, c)
a[:] = torch.nan
compiled_out = opt_fn(a, b, c)

print(compiled_out)
print(real_out)
```

Before this PR, the compiled output (first tensor) propagated `nan` while the eager output (second tensor) did not:

```
tensor([[[0.4272, 0.6037],
         [0.4279, 0.4219]],

        [[0.0838, 0.4873],
         [0.1210, 0.5516]],

        [[   nan,    nan],
         [   nan,    nan]]])
tensor([[[0.4272, 0.6037],
         [0.4279, 0.4219]],

        [[0.0838, 0.4873],
         [0.1210, 0.5516]],

        [[0.4985, 0.1072],
         [0.0857, 0.0186]]])
```
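The root of the mismatch is that `0 * nan` is `nan` in IEEE arithmetic, so lowering `beta=0` to a literal multiply leaks `nan` from `input`. A minimal pure-Python sketch of the intended semantics (a hypothetical reference helper, not the actual Inductor lowering):

```python
import math

def baddbmm_ref(inp, mat1, mat2, beta=1.0, alpha=1.0):
    """Reference semantics of out = beta * inp + alpha * (mat1 @ mat2)
    for plain nested lists. When beta == 0, the inp term is skipped
    entirely, so nan/inf in inp cannot leak into the result (matching
    eager mode); a naive `beta * inp` would yield nan, since 0 * nan == nan.
    """
    n, k, m = len(mat1), len(mat2), len(mat2[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = sum(mat1[i][p] * mat2[p][j] for p in range(k))
            if beta == 0:
                out[i][j] = alpha * acc  # inp ignored, even if it is nan
            else:
                out[i][j] = beta * inp[i][j] + alpha * acc
    return out

inp = [[math.nan, math.nan], [math.nan, math.nan]]
mat1 = [[1.0, 2.0], [3.0, 4.0]]
mat2 = [[5.0, 6.0], [7.0, 8.0]]
print(baddbmm_ref(inp, mat1, mat2, beta=0.0))  # [[19.0, 22.0], [43.0, 50.0]]
```

With the `beta == 0` branch removed, every entry would come out `nan`, which is exactly the divergence the repro above shows.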

cc @soumith @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @peterbell10 @desertfire

@pytorch-bot

pytorch-bot bot commented Mar 6, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/96087

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 307aca0:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

XiaobingSuper added a commit that referenced this pull request Mar 6, 2023
… has nan value

ghstack-source-id: 5e8a805
Pull Request resolved: #96087
XiaobingSuper added a commit that referenced this pull request Mar 7, 2023
… has nan value

ghstack-source-id: 218831d
Pull Request resolved: #96087
@XiaobingSuper XiaobingSuper added the ciflow/trunk Trigger trunk jobs on your pull request label Mar 7, 2023
@XiaobingSuper
Collaborator Author

@pytorchbot merge

@pytorchmergebot
Collaborator

Merge started

Your change will be merged once all checks pass (ETA 0-4 Hours).

Learn more about merging in the wiki.

Questions? Feedback? Please reach out to the PyTorch DevX Team

Advanced Debugging: check the merge workflow status here

cyyever pushed a commit to cyyever/pytorch_private that referenced this pull request Mar 12, 2023
… has nan value (#96087)


Pull Request resolved: pytorch/pytorch#96087
Approved by: https://github.com/jansel, https://github.com/ngimel, https://github.com/jgong5
cyyever pushed a commit to cyyever/pytorch_private that referenced this pull request Mar 12, 2023
… has nan value (#96087)

ydwu4 added a commit to ydwu4/pytorch that referenced this pull request Mar 13, 2023
… has nan value (pytorch#96087)

@facebook-github-bot facebook-github-bot deleted the gh/XiaobingSuper/74/head branch June 8, 2023 15:04

Projects

Status: Done


7 participants