
Conversation

@ssnl (Collaborator) commented on Nov 13, 2019

This PR fixes the inconsistent behavior of randint's generator= kwarg: unlike the other random functions, it does not accept None:

In [12]: torch.randint(0, 4, size=(2,3), generator=torch.Generator())
Out[12]:
tensor([[2, 0, 1],
        [0, 1, 3]])

In [13]: torch.randint(0, 4, size=(2,3), generator=None)
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-13-a6bc6525a1e1> in <module>
----> 1 torch.randint(0, 4, size=(2,3), generator=None)

TypeError: randint() received an invalid combination of arguments - got (int, int, generator=NoneType, size=tuple), but expected one of:
 * (int high, tuple of ints size, torch.Generator generator, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool requires_grad)
 * (int high, tuple of ints size, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool requires_grad)
 * (int low, int high, tuple of ints size, torch.Generator generator, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool requires_grad)
 * (int low, int high, tuple of ints size, Tensor out, torch.dtype dtype, torch.layout layout, torch.device device, bool requires_grad)

Other random functions work fine:

In [9]: torch.bernoulli(torch.ones(3))
Out[9]: tensor([1., 1., 1.])

In [10]: torch.bernoulli(torch.ones(3), generator=None)
Out[10]: tensor([1., 1., 1.])

This PR also documents the generator= kwarg and fixes #29683, since it's a related, easy fix.
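
With this change, generator=None simply falls back to the default generator, matching the other random functions. A minimal sketch of the intended post-fix usage (the calls are the point; the returned values are random):

import torch

# After the fix, both calls succeed: generator=None means "use the default
# global generator", exactly as torch.bernoulli and friends already treat it.
torch.randint(0, 4, size=(2, 3), generator=None)
torch.randint(0, 4, size=(2, 3), generator=torch.Generator())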

@vishwakftw (Contributor) left a comment
I added some minor comments with regard to the tests.


We don't need to convert the type anymore, right?


Likewise


self.assertEqual perhaps.


Likewise
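
For reference, a purely hypothetical sketch of the kind of check the self.assertEqual suggestion points at; the test name and body below are illustrative only, not the code from this PR, and assume PyTorch's TestCase (whose assertEqual can compare tensors):

import torch

def test_randint_generator_none(self):
    # Seed the default RNG, then verify that generator=None follows the same
    # path as omitting the generator argument entirely (illustrative check).
    torch.manual_seed(0)
    expected = torch.randint(0, 4, (2, 3))
    torch.manual_seed(0)
    actual = torch.randint(0, 4, (2, 3), generator=None)
    self.assertEqual(expected, actual)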

@vishwakftw (Contributor) left a comment

Thank you for the fixes! The changes look good to me.

@ssnl (Collaborator, Author) commented on Nov 15, 2019

@vishwakftw Thanks for reviewing :)

@pytorchbot merge this please

@pytorchbot added the merge-this-please label on Nov 15, 2019
@facebook-github-bot (Contributor) left a comment

@ezyang is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor): @ezyang merged this pull request in 38340f5.

@ssnl deleted the randint_gennone branch on November 18, 2019 at 18:16.


Successfully merging this pull request may close these issues:

[doc] torch.(multinomial|normal) generator=None kwarg is not documented