Singular Value Decomposition (SVD) is a fundamental matrix factorization technique in
linear algebra that has many applications in signal processing, statistics, machine learning,
and more. It decomposes any given matrix into three components: two orthogonal matrices
and a diagonal matrix.
Mathematical Definition:
For a matrix A of size m × n, SVD decomposes it into the product of three matrices:
A = U · Σ · V^T
Where:
• A is the original matrix of size m × n.
• U is an m × m orthogonal matrix (its columns are called the left singular vectors).
• Σ (Sigma) is an m × n diagonal matrix containing the singular values of A. These singular values are non-negative real numbers arranged in decreasing order.
• V^T (the transpose of V) is an n × n orthogonal matrix (its rows are called the right singular vectors).
SVD Breakdown:
1. Matrix U:
o The columns of U are the left singular vectors of A. Those associated with non-zero singular values span the column space of A.
2. Matrix Σ:
o The diagonal elements of Σ are the singular values of A. These values are always non-negative and are ordered from largest to smallest.
3. Matrix V^T:
o The rows of V^T are the right singular vectors of A. Those associated with non-zero singular values span the row space of A (a quick numerical check of this structure follows below).
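A minimal sanity-check sketch of this structure using NumPy; the 3 × 2 matrix here is an arbitrary example chosen for illustration:

import numpy as np

# Arbitrary 3x2 example matrix
A = np.array([[3.0, 1.0], [2.0, 2.0], [1.0, 3.0]])

# Full SVD: U is 3x3, S holds 2 singular values, Vt is 2x2
U, S, Vt = np.linalg.svd(A)

# U and V are orthogonal: U^T U = I and V V^T = I (up to round-off)
print(np.allclose(U.T @ U, np.eye(3)))    # True
print(np.allclose(Vt @ Vt.T, np.eye(2)))  # True

# Rebuild the 3x2 Sigma and confirm A = U Sigma V^T
Sigma = np.zeros(A.shape)
Sigma[:len(S), :len(S)] = np.diag(S)
print(np.allclose(A, U @ Sigma @ Vt))     # True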
Properties of SVD:
• Rank: The number of non-zero singular values in Σ is equal to the rank of the matrix A.
• Norm preservation: The Frobenius norm of the matrix A is preserved by the decomposition:
‖A‖_F = ‖U · Σ · V^T‖_F = √(σ₁² + σ₂² + ⋯)
where the σᵢ are the singular values. (Both of these properties are checked numerically in the sketch after this list.)
• Approximation: SVD can be used to approximate a matrix by truncating the smaller
singular values. This is a key technique in Principal Component Analysis (PCA) and
low-rank approximations.
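The rank and norm properties can be verified directly with NumPy; a minimal sketch, using an illustrative rank-2 matrix:

import numpy as np

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]])  # rank 2
S = np.linalg.svd(A, compute_uv=False)  # singular values only

# Rank = number of non-zero singular values (tolerance absorbs round-off)
print(np.sum(S > 1e-10), np.linalg.matrix_rank(A))  # 2 2

# Frobenius norm = square root of the sum of squared singular values
print(np.linalg.norm(A, 'fro'), np.sqrt(np.sum(S**2)))  # equal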
Applications of SVD:
1. Dimensionality Reduction:
o SVD is widely used in PCA for reducing the dimensionality of large datasets
by projecting the data onto the directions of maximum variance.
2. Data Compression:
o By approximating the original matrix using a lower-rank version, SVD can
compress data, making it useful for applications like image compression.
3. Noise Reduction:
o In signal processing, SVD helps to filter out noise by keeping only the largest
singular values.
4. Latent Semantic Analysis (LSA):
o In Natural Language Processing (NLP), SVD is used to reduce the
dimensionality of term-document matrices, which helps in information
retrieval and document clustering.
5. Matrix Inversion:
o SVD can be used to compute the Moore-Penrose pseudo-inverse of a matrix, which is helpful when the matrix is non-square or singular (see the sketch after this list).
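As a concrete illustration of the last point, here is a minimal sketch of building the Moore-Penrose pseudo-inverse from the SVD; the example matrix is arbitrary, and in practice you would simply call np.linalg.pinv, which performs essentially this computation:

import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])  # non-square (3x2)

# Thin SVD: U is 3x2, S has 2 entries, Vt is 2x2
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Pseudo-inverse: A+ = V Sigma+ U^T, inverting only non-zero singular values
S_inv = np.where(S > 1e-10, 1.0 / S, 0.0)
A_pinv = Vt.T @ np.diag(S_inv) @ U.T

print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True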
Example Code to Compute SVD in Python:
Here's an example of how you can compute and analyze the components of SVD using NumPy in Python:
import numpy as np
# Create a matrix A (this particular matrix has rank 2)
A = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
# Perform Singular Value Decomposition
U, S, Vt = np.linalg.svd(A)
# Display the components
print("Matrix A:")
print(A)
print("\nU matrix (left singular vectors):")
print(U)
print("\nSingular values (S):")
print(S)
# Rebuild Sigma as a diagonal matrix (np.diag(S) has the right shape here because A is square)
S_matrix = np.diag(S)
print("\nSigma (diagonal matrix with singular values):")
print(S_matrix)
print("\nVt matrix (right singular vectors, transposed):")
print(Vt)
# Reconstruct A from its full SVD (no truncation, so this recovers A up to floating-point error)
A_approx = U @ S_matrix @ Vt
print("\nReconstructed matrix (A_approx):")
print(A_approx)
Interpreting the Results:
• Reconstructed Matrix: The approximation of the original matrix is calculated as U · Σ · V^T. By truncating small singular values in Σ, you can control how closely the reconstructed matrix approximates the original one (see the sketch below).
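Continuing from the variables defined in the example above, a minimal sketch of such a truncation, keeping only the largest singular value (a rank-1 approximation):

# Rank-1 approximation: keep only the largest singular value
k = 1
A_rank1 = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]
print("Rank-1 approximation of A:")
print(A_rank1)

# The Frobenius-norm error equals the norm of the discarded singular values
print("Approximation error:", np.linalg.norm(A - A_rank1))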
Conclusion:
SVD is a powerful tool in linear algebra with many applications in machine learning, image
processing, data analysis, and more. It enables dimensionality reduction, data compression,
noise filtering, and can be used to solve systems of linear equations or compute pseudo-
inverses of matrices. Understanding its components and how to use them efficiently is crucial
for leveraging SVD in practical problems.