
Conversation

@ezyang ezyang (Contributor) commented May 9, 2017

Signed-off-by: Edward Z. Yang [email protected]

@soumith soumith merged commit bfc8a3e into pytorch:master May 10, 2017
caogang added a commit to caogang/pytorch that referenced this pull request May 11, 2017
* master: (26 commits)
  Fix Linear function
  Fix comparison functions
  Expose variable attribute of AccumulateGrad
  Don't modify non-volatile grads in zero_grad
  Minor fix in Prod backward
  Add new flags to Variable.backward
  Replace retain_variables with retain_graph
  Improve output wrapping logic in autograd
  Remove spurious memo argument in Module.parameters() (pytorch#1527)
  Make torch.cat not synchronize the host and device
  Reference counting documentation. (pytorch#1520)
  Restore examples with keepdim=True default.
  Explicitly pass keepdim=False for tests that require it.
  Change keepdim default to False.
  Fix test_normalize NN test.
  Add a keepdim test to torch_test.
  Make (non-legacy) nn backwards compatible.
  Add autograd tests for keepdim
  Add documentation for keepdim.
  Change all legacy/nn modules to use keepdim=True (even if tests don't fail).
  ...

# Conflicts:
#	torch/autograd/_functions/reduce.py
#	torch/autograd/variable.py
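
Two of the API changes named in the merged commit list above are easy to illustrate. The following is a minimal sketch, written against the present-day torch API (the commits themselves targeted the pre-merge Variable wrapper, which is omitted here):

```python
import torch

x = torch.ones(3, 4, requires_grad=True)

# "Change keepdim default to False.": reductions now drop the reduced
# dimension unless keepdim=True is passed explicitly.
s = x.sum(dim=1)                      # shape: (3,)
s_kept = x.sum(dim=1, keepdim=True)   # shape: (3, 1)

# "Replace retain_variables with retain_graph": keep the graph alive so
# backward() can run more than once over the same graph.
loss = (x * 2).sum()
loss.backward(retain_graph=True)
loss.backward()  # second call would fail without retain_graph=True above
```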
Jiaming-Liu pushed a commit to Jiaming-Liu/pytorch that referenced this pull request May 18, 2017
@ezyang ezyang deleted the pr/refcount-docs branch September 7, 2017 20:23
jjsjann123 pushed a commit to jjsjann123/pytorch that referenced this pull request Mar 21, 2022