Closed
Labels: todo (Not as important as medium or high priority tasks, but we will work on these.)
Issue description
The eigenvectors produced by torch.symeig() are not always orthonormal.
Code example
import torch
# Create a random symmetric matrix
p, q = 10, 3
torch.manual_seed(0)
in_tensor = torch.rand(p, q, dtype=torch.float64, requires_grad=True).cuda()
cov_in = torch.mm(in_tensor.t(), in_tensor)
_, eig_vecs = torch.symeig(cov_in)
print(eig_vecs)
print(torch.mm(eig_vecs, eig_vecs.t()))
print(torch.mm(eig_vecs.t(), eig_vecs))
print(eig_vecs.norm(dim=0))
print(eig_vecs.norm(dim=1))

Here is the result:
tensor([[ 0.3573, 0.0288, 0.4334],
[ 3.1050, 6.5767, -3.2518],
[ 2.3730, 2.2232, 2.1961]],
device='cuda:0', dtype=torch.float64, grad_fn=<SymeigBackward>)
tensor([[ 0.3164, -0.1103, 1.8639],
[-0.1103, 63.4675, 14.8483],
[ 1.8639, 14.8483, 15.3969]],
device='cuda:0', dtype=torch.float64, grad_fn=<MmBackward>)
tensor([[ 15.3995, 25.7064, -4.7303],
[ 25.7064, 48.1965, -16.4908],
[ -4.7303, -16.4908, 15.5848]],
device='cuda:0', dtype=torch.float64, grad_fn=<MmBackward>)
tensor([3.9242, 6.9424, 3.9478],
device='cuda:0', dtype=torch.float64, grad_fn=<NormBackward1>)
tensor([0.5625, 7.9667, 3.9239],
device='cuda:0', dtype=torch.float64, grad_fn=<NormBackward1>)
System Info
PyTorch version: 0.4.1
Is debug build: No
CUDA used to build PyTorch: 9.0.176
OS: Ubuntu 14.04.5 LTS
GCC version: (Ubuntu 4.8.4-2ubuntu1~14.04.4) 4.8.4
CMake version: version 3.11.1
Python version: 2.7
Is CUDA available: Yes
CUDA runtime version: 8.0.61
GPU models and configuration:
GPU 0: Tesla K80
Nvidia driver version: 384.111
cuDNN version: Probably one of the following:
/usr/local/cuda-7.5/lib64/libcudnn.so.5.1.3
/usr/local/cuda-7.5/lib64/libcudnn_static.a
Versions of relevant libraries:
[pip] numpy (1.13.3)
[conda] Could not collect