MATH-810
Mathematical Methods for
Artificial Intelligence
Lecture 19: SVD, Matrix Approximation and PCA
Instructor: Dr Hashir Kiani
Covid-19 SOPs
• Wear a mask at all times in class
• Practice social distancing
• Follow good hygiene practices
• Use hand sanitizer
• Avoid handshakes
General Conduct
• Be respectful of others
• Speak only when it is your turn, and preferably raise your hand if you want to say something
• Do not cut others off while they are talking
• Join the class on time
Today's Objectives
• Recap of previous lecture
• Singular Value Decomposition (SVD) of any matrix
• Matrix Approximation
• Principal Component Analysis
Recap
• What kind of matrices have an SVD? Are there any matrices that do not have an SVD?
• For a rectangular matrix $A$, what are the properties of $A^T A$ and $AA^T$?
• What is the mathematical form of the SVD?
• What kind of matrices are $U$ and $V$? What kind of matrix is $\Sigma$?
• What do the eigenvalues and eigenvectors of $A^T A$ and $AA^T$ represent?
• What is the relationship between the vectors $v_i$ and $u_i$? What are these vectors called?
• How can we construct the SVD of a matrix?
• What is the geometric interpretation of the SVD?
• What happens when we have repeated (double) eigenvalues?
SVD Example

$$A = \begin{pmatrix} 1 & 0 & 1 \\ -2 & 1 & 0 \end{pmatrix}, \qquad
AA^T = \begin{pmatrix} 1 & 0 & 1 \\ -2 & 1 & 0 \end{pmatrix}
\begin{pmatrix} 1 & -2 \\ 0 & 1 \\ 1 & 0 \end{pmatrix}
= \begin{pmatrix} 2 & -2 \\ -2 & 5 \end{pmatrix}$$

We seek $A = U \Sigma V^T$, with dimensions $(2 \times 3) = (2 \times 2)(2 \times 3)(3 \times 3)$.

$$AA^T - \lambda I = \begin{pmatrix} 2-\lambda & -2 \\ -2 & 5-\lambda \end{pmatrix}$$

$$\det(AA^T - \lambda I) = (2-\lambda)(5-\lambda) - 4 = \lambda^2 - 7\lambda + 6 = 0
\quad\Rightarrow\quad \lambda_1 = 6,\ \lambda_2 = 1$$

The singular values are the square roots of these eigenvalues, so

$$\Sigma = \begin{pmatrix} \sqrt{6} & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}$$
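As a quick numerical check of the Gram matrix and its eigenvalues, a short NumPy sketch (variable names are mine):

```python
import numpy as np

# The example matrix A (2x3) from the slide.
A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])

AAT = A @ A.T                       # Gram matrix; should be [[2, -2], [-2, 5]]
eigvals = np.linalg.eigvalsh(AAT)   # eigenvalues of a symmetric matrix, ascending
print(eigvals)                      # [1. 6.]
```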
For $\lambda_1 = 6$, solve $(AA^T - \lambda I)\,x = 0$:

$$\left(\begin{array}{cc|c} -4 & -2 & 0 \\ -2 & -1 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{cc|c} -4 & -2 & 0 \\ 0 & 0 & 0 \end{array}\right)
\quad\Rightarrow\quad 4x + 2y = 0$$

Let $x = r$, so $y = -2r$:

$$\begin{pmatrix} x \\ y \end{pmatrix} = r \begin{pmatrix} 1 \\ -2 \end{pmatrix}
\quad\Rightarrow\quad u_1 = \frac{1}{\sqrt{5}} \begin{pmatrix} 1 \\ -2 \end{pmatrix}$$

For $\lambda_2 = 1$, $(AA^T - \lambda I)\,x = 0$ gives:

$$\left(\begin{array}{cc|c} 1 & -2 & 0 \\ -2 & 4 & 0 \end{array}\right)
\rightarrow
\left(\begin{array}{cc|c} 1 & -2 & 0 \\ 0 & 0 & 0 \end{array}\right)
\quad\Rightarrow\quad x - 2y = 0$$

Let $y = r$, so $x = 2r$:

$$\begin{pmatrix} x \\ y \end{pmatrix} = r \begin{pmatrix} 2 \\ 1 \end{pmatrix}
\quad\Rightarrow\quad u_2 = \frac{1}{\sqrt{5}} \begin{pmatrix} 2 \\ 1 \end{pmatrix}$$

$$U = \frac{1}{\sqrt{5}} \begin{pmatrix} 1 & 2 \\ -2 & 1 \end{pmatrix}$$
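The columns of $U$ can likewise be obtained from the eigenvectors of $AA^T$; note that numerically computed eigenvectors may differ from the hand-derived ones by an overall sign (a NumPy sketch):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])

# eigh returns orthonormal eigenvectors (as columns) in ascending eigenvalue order.
eigvals, eigvecs = np.linalg.eigh(A @ A.T)

# Reorder so the column for the largest eigenvalue (6) comes first.
order = np.argsort(eigvals)[::-1]
U = eigvecs[:, order]

u1_hand = np.array([1.0, -2.0]) / np.sqrt(5)
u2_hand = np.array([2.0, 1.0]) / np.sqrt(5)
# Each numerical column matches the hand-derived vector up to sign.
print(np.abs(U[:, 0]), np.abs(u1_hand))
```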
From $A = \sum_i \sigma_i u_i v_i^T$ and $A v_i = \sigma_i u_i$, left-multiplying by $u_i^T$ gives

$$u_i^T A = \sigma_i\, u_i^T u_i\, v_i^T = \sigma_i v_i^T
\quad\Rightarrow\quad v_i^T = \frac{1}{\sigma_i}\, u_i^T A$$

$$v_1^T = \frac{1}{\sigma_1} u_1^T A
= \frac{1}{\sqrt{6}} \cdot \frac{1}{\sqrt{5}} \begin{pmatrix} 1 & -2 \end{pmatrix}
\begin{pmatrix} 1 & 0 & 1 \\ -2 & 1 & 0 \end{pmatrix}
= \frac{1}{\sqrt{30}} \begin{pmatrix} 5 & -2 & 1 \end{pmatrix}$$

$$v_2^T = \frac{1}{\sigma_2} u_2^T A
= \frac{1}{\sqrt{5}} \begin{pmatrix} 2 & 1 \end{pmatrix}
\begin{pmatrix} 1 & 0 & 1 \\ -2 & 1 & 0 \end{pmatrix}
= \frac{1}{\sqrt{5}} \begin{pmatrix} 0 & 1 & 2 \end{pmatrix}$$
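The formula $v_i^T = \frac{1}{\sigma_i} u_i^T A$ translates directly into code (a sketch with vectors as 1-D NumPy arrays):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])

u1 = np.array([1.0, -2.0]) / np.sqrt(5)
u2 = np.array([2.0, 1.0]) / np.sqrt(5)
s1, s2 = np.sqrt(6.0), 1.0        # singular values sqrt(lambda_i)

v1 = (u1 @ A) / s1                # (1/sqrt(30)) * [5, -2, 1]
v2 = (u2 @ A) / s2                # (1/sqrt(5))  * [0, 1, 2]
print(v1, v2)
```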
Since $A$ has rank 2 but $V$ must be $3 \times 3$, the third vector $v_3$ is found from orthogonality. Let $v_3^T = \begin{pmatrix} a & b & c \end{pmatrix}$ with $v_1^T \cdot v_3^T = 0$ and $v_2^T \cdot v_3^T = 0$:

$$\begin{pmatrix} 5 & -2 & 1 \end{pmatrix} \cdot \begin{pmatrix} a & b & c \end{pmatrix} = 0
\quad\Rightarrow\quad 5a - 2b + c = 0$$

$$\begin{pmatrix} 0 & 1 & 2 \end{pmatrix} \cdot \begin{pmatrix} a & b & c \end{pmatrix} = 0
\quad\Rightarrow\quad b + 2c = 0$$

Let $c = r$; then $b = -2r$, and $5a - 2(-2r) + r = 0$ gives $a = -r$:

$$\begin{pmatrix} a & b & c \end{pmatrix} = r \begin{pmatrix} -1 & -2 & 1 \end{pmatrix}
\quad\Rightarrow\quad v_3^T = \frac{1}{\sqrt{6}} \begin{pmatrix} -1 & -2 & 1 \end{pmatrix}$$
Putting everything together, $A = U \Sigma V^T$:

$$A = \frac{1}{\sqrt{5}} \begin{pmatrix} 1 & 2 \\ -2 & 1 \end{pmatrix}
\begin{pmatrix} \sqrt{6} & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}
\begin{pmatrix}
\frac{5}{\sqrt{30}} & \frac{-2}{\sqrt{30}} & \frac{1}{\sqrt{30}} \\
0 & \frac{1}{\sqrt{5}} & \frac{2}{\sqrt{5}} \\
\frac{-1}{\sqrt{6}} & \frac{-2}{\sqrt{6}} & \frac{1}{\sqrt{6}}
\end{pmatrix}$$

Multiplying out recovers the original matrix:

$$A = \begin{pmatrix} 1 & 0 & 1 \\ -2 & 1 & 0 \end{pmatrix}$$
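The whole decomposition can be cross-checked against NumPy's built-in SVD (a sketch; note that `np.linalg.svd` returns $V^T$ directly, and $\Sigma$ must be rebuilt as a $2 \times 3$ matrix from the returned singular values):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [-2.0, 1.0, 0.0]])

U, s, Vt = np.linalg.svd(A)      # U is 2x2, Vt is 3x3, s holds [sqrt(6), 1]

# Rebuild the rectangular Sigma from the singular values.
Sigma = np.zeros((2, 3))
Sigma[0, 0], Sigma[1, 1] = s

print(s)                          # [2.449... 1.0]
print(U @ Sigma @ Vt)             # reproduces A
```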
Matrix Approximation

• SVD can be used to represent a matrix $A$ as a sum of simpler, low-rank matrices.
• A truncated sum of these is computationally cheaper to store and apply than the full SVD.
• A rank-1 matrix $A_i \in \mathbb{R}^{m \times n}$ can be formed as the outer product of the $i$-th orthonormal columns of $U$ and $V$:

$$A_i = u_i v_i^T$$
Matrix Approximation

• The matrix $A$ can be written as a sum of rank-1 matrices:

$$A = \sum_{i=1}^{r} \sigma_i u_i v_i^T = \sum_{i=1}^{r} \sigma_i A_i$$

• $\sigma_i$ are the singular values.
• If the sum runs only up to a value $k < r$, we obtain a rank-$k$ approximation of $A$:

$$\hat{A}(k) = \sum_{i=1}^{k} \sigma_i u_i v_i^T = \sum_{i=1}^{k} \sigma_i A_i$$
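The rank-$k$ truncation can be written as a small helper (a sketch; the function name is mine). By the Eckart–Young theorem, this truncation is the best rank-$k$ approximation of $A$ in both the Frobenius and spectral norms.

```python
import numpy as np

def rank_k_approx(A, k):
    """Rank-k approximation of A built from its k largest singular triplets."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # Scale the first k columns of U by the singular values, then multiply by V^T.
    return (U[:, :k] * s[:k]) @ Vt[:k, :]
```

With $k = r$ (the full rank) the helper reproduces $A$ exactly; smaller $k$ trades accuracy for a cheaper representation.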
Matrix Approximation Example

$$A = \begin{pmatrix} 1 & -2 \\ 0 & 1 \\ 1 & 0 \end{pmatrix}$$
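For this $3 \times 2$ matrix (the transpose of the earlier worked example, so its singular values are again $\sqrt{6}$ and $1$), a rank-1 approximation keeps only the largest singular triplet (a NumPy sketch):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [0.0, 1.0],
              [1.0, 0.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keep only the largest singular value, sqrt(6) ~ 2.449.
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])

# The Frobenius-norm error equals the dropped singular value, 1.
print(np.linalg.norm(A - A1))
```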
QUIZ
Principal Component Analysis

• A technique used for dimensionality reduction
• We can use the SVD to perform PCA:
• Centre the data around its mean
• Then compute the SVD of the centred matrix
• The vectors $u_i$ are the basis vectors for the principal components
• The projection of the data onto the first $k$ principal components equals the rank-$k$ approximation of the centred data matrix
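The recipe above can be sketched end to end (toy data of my own; here samples are stored as rows, so the principal directions appear as the rows of $V^T$ — with samples stored as columns, the $u_i$ play that role instead):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # toy data: 100 samples, 3 features

Xc = X - X.mean(axis=0)                  # centre each feature around its mean
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2
components = Vt[:k]                      # first k principal directions
scores = Xc @ components.T               # data projected onto those directions
Xc_approx = scores @ components          # rank-k reconstruction of centred data

# The reconstruction equals the rank-k SVD truncation of Xc.
print(np.allclose(Xc_approx, (U[:, :k] * s[:k]) @ Vt[:k]))
```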
Summary
• Recap of previous lecture
• Singular Value Decomposition (SVD) of any matrix
• Matrix Approximation
• Principal Component Analysis
Questions?