torch.sum(tensor, dim=()) is different from np.sum(arr, axis=()) #29137

@zou3519

Description

🐛 Bug

torch.sum(tensor, dim=()) performs a full reduction, while np.sum(arr, axis=()) performs no reduction and returns the array unchanged.

To Reproduce

Steps to reproduce the behavior:

import torch
import numpy as np
arr = np.array([1, 2, 3])
tensor = torch.from_numpy(arr)

tensor.sum(())  # gives tensor(6)
arr.sum(())  # gives array([1, 2, 3])
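
For context, NumPy distinguishes axis=None (reduce over all axes, the default) from axis=() (reduce over no axes, i.e. the identity case of "reduce over this set of axes"). A quick check of that convention:

import numpy as np

arr = np.array([1, 2, 3])
np.sum(arr, axis=None)  # 6 -- reduce over all axes (the default)
np.sum(arr, axis=())    # array([1, 2, 3]) -- reduce over no axes; identity
np.sum(arr, axis=(0,))  # 6 -- reduce over axis 0 explicitly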

Expected behavior

These should probably behave the same. Also, I don't think it makes sense for tensor.sum(dim=()) to do a full reduce. This special-casing caused the bug in #28993, and I suspect the gradients for things like torch.var may be incorrect right now.
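
To illustrate how the special case bites downstream code (a minimal sketch, not taken from #28993; sum_over_other_dims is a hypothetical helper): code that computes the reduction dims dynamically can end up passing an empty tuple, and under the current behavior it silently gets a full reduction instead of a no-op.

import torch

# Hypothetical helper: reduce over every dim *not* listed in `keep`.
def sum_over_other_dims(t, keep):
    reduce_dims = tuple(d for d in range(t.dim()) if d not in keep)
    return t.sum(dim=reduce_dims)

x = torch.arange(6.).reshape(2, 3)
sum_over_other_dims(x, keep=(0, 1))
# reduce_dims is () here. With NumPy semantics this would return x unchanged
# (shape (2, 3)); on current master it returns tensor(15.), a full reduction.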

Environment

PyTorch master

Additional context

Some people who have been involved in multi-dim reduction support and may be interested: @t-vi, @ssnl, @umanwizard, @gchanan. Please let me know your opinions on this subject.

cc @ezyang @gchanan @zou3519 @jerryzh168 @ssnl

Labels

    module: TensorIterator
    module: bc-breaking (Related to a BC-breaking change)
    module: deprecation
    module: numpy (Related to numpy support, and also numpy compatibility of our operators)
    module: reductions
    triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
