Issue Description
It seems that stacking is currently (as of PyTorch 0.4) not supported for tensors of type HalfTensor; see the code example below.
I don't know whether this is intentional (hence I did not label this issue as a bug), but I couldn't find any issue or documentation saying so. Since some utilities, such as torch.utils.data.DataLoader, depend on torch.stack() (in this case implicitly through the default_collate() method), it would be great if either stacking for HalfTensors could be implemented, or the documentation could be updated to mention this limitation.
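Since default_collate() is where DataLoader hits this, one stopgap until stacking is supported natively would be a custom collate_fn that round-trips through float32 before stacking. This is only a sketch under that assumption (the half_safe_collate name is hypothetical, not a PyTorch API):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def half_safe_collate(batch):
    # batch is a list of (input, target) pairs from a TensorDataset.
    # Up-cast each half tensor to float32, stack, then down-cast back to half,
    # sidestepping the unimplemented _cat for torch.HalfTensor.
    inputs, targets = zip(*batch)
    stacked_inputs = torch.stack([t.float() for t in inputs]).half()
    stacked_targets = torch.stack([t.float() for t in targets]).half()
    return stacked_inputs, stacked_targets

data = torch.randn(8, 3).half()
labels = torch.randn(8, 1).half()
loader = DataLoader(TensorDataset(data, labels), batch_size=4,
                    collate_fn=half_safe_collate)
```

Passing collate_fn this way keeps the rest of the DataLoader pipeline unchanged; only the batching step pays the float32 round-trip cost.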
Code Example
Python 3.6.5 (default, Mar 30 2018, 06:41:49)
Type 'copyright', 'credits' or 'license' for more information
IPython 6.3.1 -- An enhanced Interactive Python. Type '?' for help.
In [1]: import torch
In [2]: torch.__version__
Out[2]: '0.4.0'
In [3]: tensor_1_float = torch.tensor([1, 2, 3], dtype=torch.float)
In [4]: tensor_2_float = torch.tensor([4, 5, 6], dtype=torch.float)
In [5]: torch.stack([tensor_1_float, tensor_2_float])
Out[5]:
tensor([[ 1., 2., 3.],
[ 4., 5., 6.]])
In [6]: tensor_1_half = torch.tensor([1, 2, 3], dtype=torch.half)
In [7]: tensor_2_half = torch.tensor([4, 5, 6], dtype=torch.half)
In [8]: torch.stack([tensor_1_half, tensor_2_half])
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-8-10f49aa85ea5> in <module>()
----> 1 torch.stack([tensor_1_half, tensor_2_half])
RuntimeError: _cat is not implemented for type torch.HalfTensor
On PyTorch 0.3.1, the same operation fails with an analogous error:
TypeError: Type torch.HalfTensor doesn't implement stateless method cat
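For anyone hitting this in the meantime, the stack itself can be worked around by up-casting to float32, stacking, and down-casting back. A minimal sketch, assuming the float32 detour is acceptable for your use case (note the round trip costs extra memory and a copy):

```python
import torch

tensor_1_half = torch.tensor([1, 2, 3], dtype=torch.half)
tensor_2_half = torch.tensor([4, 5, 6], dtype=torch.half)

# Up-cast to float32 (where _cat is implemented), stack, then
# convert the stacked result back to half precision.
stacked = torch.stack([tensor_1_half.float(), tensor_2_half.float()]).half()
```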