Add F.normalize #1467
Conversation
rename lp_normalize to normalize, and remove l2_normalize

Should the default be dim = -1?

@soumith pushed

yes, please put p=2 as the default (you will have to either have a default dim argument, or move p to the 3rd argument)
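For reference, a minimal sketch of the semantics under discussion. The defaults shown (p=2, dim=1) match what `torch.nn.functional.normalize` eventually shipped with, but treat the body as an illustration rather than the PR's actual code:

```python
import torch

def normalize_sketch(input, p=2, dim=1, eps=1e-12):
    # Divide each subtensor along `dim` by max(||v||_p, eps); the eps
    # floor keeps all-zero vectors from producing NaNs.
    denom = input.norm(p, dim, keepdim=True).clamp(min=eps)
    return input / denom
```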
torch/nn/functional.py (outdated)

    .. math:: v = \frac{v}{\max(\lVert v \rVert_p, \epsilon)}

    for each subtensor v over dimension dim of input.
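As a quick check of the formula: for v = (3, 4) with p = 2, the norm is 5 and max(5, eps) = 5, so the normalized vector is (0.6, 0.8):

```python
import torch

v = torch.tensor([3.0, 4.0])
# ||v||_2 = 5, and max(5, eps) = 5, so the result is [0.6, 0.8].
print(v / v.norm(2).clamp(min=1e-12))  # tensor([0.6000, 0.8000])
```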
torch/nn/functional.py (outdated)

    def normalize(input, p, dim, eps=1e-12):
        r"""Performs l_p normalization of inputs over specified dimension.
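A usage sketch against the function as it ships in torch.nn.functional today (signature `normalize(input, p=2, dim=1, eps=1e-12)`):

```python
import torch
import torch.nn.functional as F

x = torch.randn(8, 128)              # batch of 8 feature vectors
x_unit = F.normalize(x, p=2, dim=1)  # unit 2-norm along each row
print(x_unit.norm(p=2, dim=1))       # all entries ~1.0
```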
torch/nn/functional.py (outdated)

    for each subtensor v over dimension dim of input. Each subtensor is flattened into a vector,
    i.e. :math:`\lVert v \rVert_p` is not a matrix norm.
    With default arguments, normalizes over the last dimension with the Euclidean norm.
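To illustrate the "not a matrix norm" point: each slice along dim is treated as an independent vector, so normalizing a matrix over dim=0 rescales each column by its own vector 2-norm; no matrix norm (e.g. spectral) is involved. A sketch using the shipped function:

```python
import torch
import torch.nn.functional as F

m = torch.tensor([[1.0, 2.0],
                  [3.0, 4.0]])
out = F.normalize(m, p=2, dim=0)  # each column divided by its own 2-norm
print(out.norm(p=2, dim=0))       # tensor([1., 1.])
```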
* master:
  Add F.normalize (pytorch#1467), with gradcheck and doc
  Expose custom attributes from C++ functions (pytorch#1430)
  Add high order gradient support for Sigmoid (pytorch#1496)
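Since the commit message mentions gradcheck, here is a sketch of the kind of check involved (the PR's actual test may differ). `torch.autograd.gradcheck` compares analytical gradients against finite differences and expects double-precision inputs:

```python
import torch
from torch.autograd import gradcheck
import torch.nn.functional as F

# Raises if the analytical and numerical Jacobians disagree.
x = torch.randn(4, 5, dtype=torch.double, requires_grad=True)
assert gradcheck(lambda t: F.normalize(t, p=2, dim=1), (x,))
```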