
Conversation

@umanwizard (Contributor)

For compatibility with numpy.

@pytorchbot added the labels module: autograd (Related to torch.autograd, and the autograd engine in general) and module: pybind (Related to our Python bindings / interactions with other Python libraries) on May 16, 2019
@umanwizard umanwizard requested review from colesbury and gchanan May 16, 2019 00:11
@ssnl (Collaborator) commented May 16, 2019

Is .T coming soon? :)

@umanwizard added the label module: numpy (Related to numpy support, and also numpy compatibility of our operators) and removed the label module: autograd (Related to torch.autograd, and the autograd engine in general) on May 16, 2019
@umanwizard (Contributor, Author)

@ssnl Yes
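
The `.T` property is not part of this PR yet; as a sketch, the NumPy semantics it would presumably mirror (an axis-reversed view, alongside the `ndim` attribute this PR adds) look like this. The example uses NumPy itself, not PyTorch:

```python
import numpy as np

# NumPy conventions the compatibility work targets:
# .T is a view with the axes reversed; .ndim is a plain int attribute.
a = np.arange(6).reshape(2, 3)

assert a.T.shape == (3, 2)       # axes reversed, no copy
assert a.ndim == 2               # attribute, not a method
assert (a.T.T == a).all()        # transposing twice is a no-op
```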

@colesbury (Member) left a comment

Needs some minimal testing, otherwise looks good.

We should also figure out a documentation strategy for the new NumPy properties, functions, and kwargs. I think the Tensor properties (ndim, .T) are important enough to be documented directly in the Tensor documentation (tensors.rst). The kwarg aliases can probably go in a separate "NumPy compatibility" section.

@pytorchbot added the labels module: autograd (Related to torch.autograd, and the autograd engine in general), module: docs (Related to our documentation, both in docs/ and docblocks), and module: operators on May 16, 2019
@facebook-github-bot (Contributor) left a comment

@umanwizard has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@umanwizard umanwizard requested a review from colesbury May 16, 2019 16:33
@fmassa (Member) commented May 16, 2019


Do we have a plan for .size? It exists in both numpy and PyTorch, but with different meanings.
Fortunately the return types differ (in PyTorch it's a method, while in numpy it's an attribute), but I'm not sure there is a way of keeping both without some magic.

@umanwizard (Contributor, Author)

@fmassa I don't have any better answer than "we need to think more about what to do in those cases". So in the immediate future I'm not planning on changing anything related to size.
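
For illustration only, a hypothetical sketch of the clash fmassa describes (none of the classes below are real PyTorch code): numpy's `ndarray.size` is an int attribute, while PyTorch's `Tensor.size()` is a method returning the shape. Serving both behaviors under one name would need exactly the kind of "magic" mentioned, e.g. an int subclass that is also callable:

```python
# Hypothetical demonstration of the .size name clash. _CallableSize and
# FakeTensor are invented for this sketch; they are not PyTorch APIs.

class _CallableSize(int):
    """An int (numpy-style element count) that can also be called."""
    def __call__(self, dim=None):
        # PyTorch-style: size() -> full shape, size(d) -> one dimension.
        return self._shape if dim is None else self._shape[dim]

class FakeTensor:
    def __init__(self, shape):
        self._shape = tuple(shape)

    @property
    def size(self):
        # NumPy-style value: total number of elements.
        n = 1
        for d in self._shape:
            n *= d
        s = _CallableSize(n)
        s._shape = self._shape  # stash the shape for the callable path
        return s

t = FakeTensor((2, 3))
assert t.size == 6          # numpy semantics: element count
assert t.size() == (2, 3)   # pytorch semantics: shape
assert t.size(1) == 3       # pytorch semantics: one dimension
```

Whether such a dual-natured object is worth the confusion is precisely the open question; as the author says, nothing related to size changes in this PR.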


add_docstr_all('ndim',
r"""
Is the number of dimensions in this Tensor.
Review comment (Contributor) on this line:

can we just document it the same as ndimension?
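
For context on the property under review, a short sketch of the NumPy attribute it mirrors; `ndim` is a plain attribute equal to the length of the shape, i.e. the same value PyTorch's existing `ndimension()` method returns (the example uses NumPy, not PyTorch):

```python
import numpy as np

# NumPy's ndim: the number of dimensions, as a plain int attribute.
a = np.zeros((2, 3, 4))

assert a.ndim == 3
assert a.ndim == len(a.shape)   # ndim is just the rank of the array
```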


@facebook-github-bot (Contributor)

@umanwizard merged this pull request in 987f1cc.


Labels

- Merged
- module: autograd (Related to torch.autograd, and the autograd engine in general)
- module: docs (Related to our documentation, both in docs/ and docblocks)
- module: numpy (Related to numpy support, and also numpy compatibility of our operators)
- module: pybind (Related to our Python bindings / interactions with other Python libraries)


8 participants