Add itertools.{prod, combinations, combinations_with_replacement} like op to pytorch
#9393
Conversation
Changed the title from "itertools.prod, itertools.combinations and itertools.combinations_with_replacement like op to pytorch" to "itertools.{prod, combinations, combinations_with_replacement} like op to pytorch"
CC @fmassa
torch/_torch_docs.py
Outdated
>>> a = torch.tensor([1, 2, 3])
>>> b = torch.tensor([4, 5])
>>> torch.cartesian_prod([a, b])
(tensor([1, 1, 2, 2, 3, 3]), tensor([4, 5, 4, 5, 4, 5]))
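For comparison, the tuple output shown in this early revision is the "unzipped" form of what itertools.product yields; a small illustrative sketch (not code from the PR):

import itertools
import torch

a = [1, 2, 3]
b = [4, 5]
pairs = list(itertools.product(a, b))   # [(1, 4), (1, 5), (2, 4), (2, 5), (3, 4), (3, 5)]
first, second = zip(*pairs)             # unzip into per-coordinate sequences
print(torch.tensor(first))              # tensor([1, 1, 2, 2, 3, 3])
print(torch.tensor(second))             # tensor([4, 5, 4, 5, 4, 5])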
torch/_torch_docs.py
Outdated
>>> a = torch.tensor([1, 2, 3])
>>> torch.combinations(a)
(tensor([1, 1, 2]), tensor([2, 3, 3]))
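Likewise, this early combinations output is the "unzipped" form of itertools.combinations; an illustrative sketch (not code from the PR):

import itertools
import torch

a = [1, 2, 3]
pairs = list(itertools.combinations(a, 2))   # [(1, 2), (1, 3), (2, 3)]
first, second = zip(*pairs)
print(torch.tensor(first))                   # tensor([1, 1, 2])
print(torch.tensor(second))                  # tensor([2, 3, 3])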
@zasdfgbnm we'd love to have this and I am working on reviewing your code. I had a question about the behavior differences between this implementation and itertools -- please let me know if the semantics of this implementation are intended.
@zou3519 I've changed the behavior of those functions; it looks much more natural now. Please take a look.
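For reference, the behavior as it eventually shipped returns a single 2-D tensor whose rows are the tuples, mirroring itertools directly. A sketch against the released API (intermediate revisions inside this thread may differ):

import itertools
import torch

a = torch.tensor([1, 2, 3])

print(torch.combinations(a))
# tensor([[1, 2],
#         [1, 3],
#         [2, 3]])
print(list(itertools.combinations(a.tolist(), 2)))
# [(1, 2), (1, 3), (2, 3)]

# combinations_with_replacement corresponds to with_replacement=True
print(torch.combinations(a, with_replacement=True))
# tensor([[1, 1],
#         [1, 2],
#         [1, 3],
#         [2, 2],
#         [2, 3],
#         [3, 3]])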
Thank you @zasdfgbnm, I'll try to take a look soon.
torch/_torch_docs.py
Outdated
[(1, 4), (1, 5), (2, 4), (2, 5), (3, 4), (3, 5)]
>>> tensor_a = torch.tensor(a)
>>> tensor_b = torch.tensor(b)
>>> torch.cartesian_prod([tensor_a, tensor_b])
self.assertEqual(c_, expected_c)
prod = torch.cartesian_prod([a, b, c])
expected = torch.tensor(list(product([a], b, c)))
self.assertEqual(expected, prod)
zou3519 left a comment
Only looked at the code for cartesian_prod. That looks good. I'm not sure how well meshgrid stands up to edge cases, though, so some testing of edge cases would be great.
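A couple of hypothetical edge-case checks of the kind being asked for here (these are not tests from the PR, and they use the variadic call signature of the released op, whereas the snippets in this thread pass a list):

import itertools
import torch

# Single-element inputs
a = torch.tensor([7])
b = torch.tensor([1, 2, 3])
out = torch.cartesian_prod(a, b)
ref = torch.tensor(list(itertools.product(a.tolist(), b.tolist())))
assert torch.equal(out, ref)        # rows (7, 1), (7, 2), (7, 3)

# Inputs of unequal length
x = torch.arange(4)
y = torch.arange(2)
assert torch.cartesian_prod(x, y).shape == (4 * 2, 2)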
I have a general question: for cartesian_prod, itertools.product already does something very close to what cartesian_prod does. Is it a significant usability help and/or speedup to have this as a native op rather than going through itertools.product?
@zou3519 Answer to your question:

import torch
import itertools
from tqdm import tqdm

a = torch.rand(100)
b = torch.rand(100)

for _ in tqdm(range(10000)):
    torch.cartesian_prod([a, b])

for _ in tqdm(range(10000)):
    torch.tensor(list(itertools.product(a, b)))

On my computer, the first would be 7806.71 it/s, but the second is only 65.69 it/s.
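The gap is expected: the itertools path builds every pair one at a time in Python and then copies the result into a tensor, while the native op does the expansion with a few vectorized tensor operations. The two produce the same values, as a quick check shows (a sketch, using the variadic signature of the released op rather than the list form above):

import itertools
import torch

a = torch.tensor([1., 2., 3.])
b = torch.tensor([4., 5.])
native = torch.cartesian_prod(a, b)
via_itertools = torch.tensor(list(itertools.product(a.tolist(), b.tolist())))
assert torch.equal(native, via_itertools)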
@zou3519 I've made changes according to your advice.
Fix the lint please:

./test/test_torch.py:9264:5: E303 too many blank lines (2)

Otherwise, this PR looks good (see my one comment). Sorry that it took us so long to get back to you, @zasdfgbnm.
@zou3519 Another question: should scalar (zero-dimensional) tensor inputs be supported for these ops at all? Also, if you think they should not be, I can turn that support off. I also think that if the input has shape (N,), i.e. only a single 1-D tensor is passed to cartesian_prod, the output should also have shape (N,).
[3, 4],
[3, 5]])
"""
return torch._C._VariableFunctions.cartesian_prod(tensors)
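As mentioned in the review above, the op is built on meshgrid. A rough Python-level equivalent, shown only as an illustration and not the actual implementation added in this PR, could look like:

import torch

def cartesian_prod_reference(tensors):
    # Illustrative meshgrid-based equivalent, not the code from this PR.
    grids = torch.meshgrid(*tensors)   # one grid per input, in "ij" order
    return torch.stack([g.reshape(-1) for g in grids], dim=-1)

a = torch.tensor([1, 2, 3])
b = torch.tensor([4, 5])
print(cartesian_prod_reference([a, b]))
# tensor([[1, 4],
#         [1, 5],
#         [2, 4],
#         [2, 5],
#         [3, 4],
#         [3, 5]])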
@zasdfgbnm, that is a good question. I'm not sure what the right answer is here, especially because numpy doesn't have an equivalent to these functions. Let me do some digging around and get back to you.

Edit: I thought about it a little more. I agree that the scalar input is confusing; it would be good to turn off support for it. In my mind, cartesian product should be an operation between two "lists" or 1-D tensors. Same with combinations; if someone is trying to take combinations of a scalar tensor, I think they may be doing something wrong. Which operator are you talking about?

For cartesian product I agree: if we have an input of shape (N,), the output should have shape (N,) because there was only one input in the list.
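A sketch of the shape semantics being agreed on here, using the released API:

import torch

a = torch.tensor([1, 2, 3])
b = torch.tensor([4, 5])

# A single 1-D input stays 1-D: shape (N,), not (N, 1).
print(torch.cartesian_prod(a).shape)     # torch.Size([3])

# Two 1-D inputs give one row per pair: shape (N * M, 2).
print(torch.cartesian_prod(a, b).shape)  # torch.Size([6, 2])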
Here I mean nothing other than simply agreeing with your comment at #9393 (comment). So I think we are in complete agreement about this now.
Awesome, @zasdfgbnm. Let me take one last pass through this PR :)
@zasdfgbnm sorry, I forgot about this PR. I'll take another look through it today. (I don't remember if there were any unresolved points that we brought up on this PR before?)
@zasdfgbnm you can ignore the last build failure; it is unrelated to your code.
@zou3519 I've just made changes according to your new comments. I don't think we had any unresolved issues last time, but I forgot to answer your comment at #9393 (comment), which I can answer now: JIT support is already working.
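A hypothetical check along those lines, tracing a small function that calls the new op (illustrative only, and using the released variadic signature rather than the list form used elsewhere in this thread):

import torch

def make_pairs(a, b):
    return torch.cartesian_prod(a, b)

a = torch.arange(3)
b = torch.arange(2)
traced = torch.jit.trace(make_pairs, (a, b))
print(traced(a, b))
# tensor([[0, 0],
#         [0, 1],
#         [1, 0],
#         [1, 1],
#         [2, 0],
#         [2, 1]])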
zou3519 left a comment
lgtm!
facebook-github-bot left a comment
@zou3519 has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
…ike op to pytorch (#9393)
Summary: closes pytorch/pytorch#7580
Pull Request resolved: pytorch/pytorch#9393
Differential Revision: D13659628
Pulled By: zou3519
fbshipit-source-id: 3a233befa785709395a793ba8833413be394a6fd
closes #7580