
Conversation

@szagoruyko (Contributor)

With gradcheck and doc.

@soumith (Contributor) commented May 4, 2017

rename lp_normalize to normalize, and remove l2_normalize

@vadimkantorov (Contributor)

Should default dim = -1?

@szagoruyko (Contributor, Author)

@soumith pushed
@vadimkantorov we could then also have p=2, idk

@soumith (Contributor) commented May 4, 2017

yes, please put p=2 as the default (you will have to either have a default dim argument, or move p to 3rd argument)
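
A quick sketch of the constraint described above (hypothetical signatures, not the exact code in this PR): once p gets a default value, Python does not allow a non-default parameter after it, so either dim also gets a default or p moves behind dim.

# Hypothetical signatures only; the merged defaults may differ.

# Option 1: give p a default of 2, which means dim needs a default as well.
def normalize(input, p=2, dim=-1, eps=1e-12):
    ...

# Option 2: keep dim required by moving p into the 3rd position.
def normalize(input, dim, p=2, eps=1e-12):
    ...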

@apaszke changed the title from "Add F.lp_normalize and F.l2_normalize" to "Add F.normalize" on May 5, 2017
.. math::
    v = \frac{v}{\max(\lVert v \rVert_p, \epsilon)}

for each subtensor v over dimension dim of input.
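
A minimal sketch of this formula (with assumed defaults p=2, dim=-1), using the standard tensor API rather than the code merged in this PR: the p-norm is taken along dim, kept as a broadcastable dimension, and clamped below by eps so zero subtensors do not divide by zero.

import torch

def normalize_sketch(input, p=2, dim=-1, eps=1e-12):
    # ||v||_p along dim, kept so it broadcasts against input,
    # then clamped below by eps as in max(||v||_p, eps).
    norm = input.norm(p, dim, keepdim=True).clamp(min=eps)
    return input / norm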

def normalize(input, p, dim, eps=1e-12):
    r"""Performs l_p normalization of inputs over specified dimension.

    return loss


def normalize(input, p, dim, eps=1e-12):

for each subtensor v over dimension dim of input. Each subtensor is flattened into a vector,
i.e. :math:`\lVert v \rVert_p` is not a matrix norm.
With default arguments, it normalizes over the last dimension with the Euclidean norm.
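
A small usage example of the resulting F.normalize; dim is passed explicitly here since the defaults were still being discussed in this thread.

import torch
import torch.nn.functional as F

x = torch.randn(4, 8)
y = F.normalize(x, p=2, dim=1)   # each row divided by max(||row||_2, eps)
print(y.norm(p=2, dim=1))        # close to a vector of ones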

@apaszke merged commit 6d693fe into pytorch:master on May 7, 2017
caogang added a commit to caogang/pytorch that referenced this pull request May 8, 2017
* master:
  Add F.normalize (pytorch#1467)
  Expose custom attributes from C++ functions (pytorch#1430)
  Add high order gradient support for Sigmoid (pytorch#1496)
Jiaming-Liu pushed a commit to Jiaming-Liu/pytorch that referenced this pull request May 18, 2017