Conversation

@xiaomengy (Contributor) commented:
Summary:
Pull Request resolved: pytorch#17290

Optimize TileOp

Reviewed By: wesolwsk

Differential Revision: D14145844

fbshipit-source-id: efeec07dbbf6d6d7d81cd772d241c6b34a9b8db5
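For context (not stated in this PR): TileOp in Caffe2 constructs its output by repeating the input tensor a given number of times along one axis. Below is a minimal NumPy sketch of that tiling behavior, assuming block-repeat semantics along a single axis; the helper name `tile_like_tileop` and its signature are hypothetical illustrations, not part of this PR or of the Caffe2 API.

```python
import numpy as np

def tile_like_tileop(x: np.ndarray, tiles: int, axis: int) -> np.ndarray:
    """Illustrative sketch (assumed semantics, not taken from this PR):
    repeat `x` block-wise `tiles` times along `axis`, so the output shape
    matches the input except that the size along `axis` is multiplied by
    `tiles`.
    """
    reps = [1] * x.ndim      # repeat every axis once ...
    reps[axis] = tiles       # ... except the tiled axis
    return np.tile(x, reps)

# Example: tiling a 2x2 matrix twice along axis 1.
x = np.array([[1, 2], [3, 4]])
print(tile_like_tileop(x, tiles=2, axis=1))
# [[1 2 1 2]
#  [3 4 3 4]]
```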
@houseroad (Member) left a comment:

Looks good. Thanks!

@xiaomengy deleted the export-D14145844 branch on March 12, 2019 at 19:29.
@petrex pushed a commit to petrex/pytorch that referenced this pull request on Mar 14, 2019:
* upstream/master: (87 commits)
  Make Variable::set_data non-const; cosmetic fixes.
  remove warning for upsample code (pytorch#17921)
  Optimize TileOp (pytorch#17290)
  Optimize channel_stats_op (pytorch#16243)
  enable shape inference for elementwise operators (pytorch#17885)
  Remove remaining test jit expects redux (pytorch#17924)
  Handle Scalars Better (pytorch#17875)
  Fixed a formatting issue in doc comments (pytorch#17505)
  Add nbytes, itemsize, element_size to at::Tensor. (pytorch#17810)
  Fix lint in test_distributions.py
  Fix lint in test_jit.py
  Fix lint errors in test_autograd
  Added a few extra python bindings to help with walking the IR graph from Python (pytorch#17822)
  kthvalue consistency with sort in the presence of NaN (pytorch#17824)
  Fix minor grammatical mistakes in torch/nn/modules/loss.py (pytorch#17892)
  Remove (almost all) TensorOptions from native_functions.yaml (pytorch#17385)
  Restore full Windows tests (pytorch#17102)
  Prevent VS2017 from emitting ambiguous symbol errors (second time)
  Fix windows test hang (pytorch#17778)
  torch.btrifact for tensors with greater than 3 dimensions (pytorch#14964)
  ...