Closed
Labels
low priority (We're unlikely to get around to doing this in the near future), module: empty tensor, module: error checking (Bugs related to incorrect/lacking error checking), module: mps (Related to Apple Metal Performance Shaders framework), triaged (This issue has been looked at by a team member and prioritized into an appropriate module)
Description
Detected with FACTO
Describe the bug
As a general rule, torch operations allow accessing dimension 0 of a zero-dimensional tensor (returning the same tensor). In this case, the CPU torch.var operation handles zero-dimensional input with dim=0, but MPS throws an error:
```python
import torch

def test_var(device):
    x = torch.tensor(3.0, device=device)
    try:
        output = torch.var(x, dim=0)
        print(f"var test succeeds for device: {device}. output: {output}")
    except Exception as e:
        print(f"var test fails for device: {device}: {e}")

test_var(device="cpu")
test_var(device="mps")
```

Terminal output:

```
var test succeeds for device: cpu. output: nan
var test fails for device: mps: var_mps: reduction dim must be in the range of input shape
```
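For context, a minimal CPU-only sketch (not part of the original report) of the convention the issue relies on: reductions such as sum and mean accept dim=0 on a 0-d tensor, and torch.var's nan on CPU follows from the default correction=1 dividing by N - 1 = 0 for a single element:

```python
import torch

x = torch.tensor(3.0)       # zero-dimensional (scalar) tensor
print(x.dim())              # 0

# dim=0 is accepted on a 0-d tensor; the reduction returns a 0-d result.
print(torch.sum(x, dim=0))  # tensor(3.)
print(torch.mean(x, dim=0)) # tensor(3.)

# var with the default correction=1 divides by (N - 1) = 0 for a single
# element, which is why the CPU path above prints nan rather than erroring.
print(torch.var(x, dim=0))  # tensor(nan)
```

Under this convention, the MPS backend's range check on the reduction dim is stricter than the CPU behavior for 0-d inputs.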
Versions
nightly