Adv Eng Math Lecture Notes 2
Advanced Engineering Mathematics (Spring 2025) - SNU CEE Hyoseob Noh (hyoddubi1@[Link])
Chapter 8: Matrix Eigenvalue Problems
Ax = λx,
• A scalar λ for which the above equation has a nontrivial solution (x ≠ 0) is called an eigenvalue of A.
• The corresponding nonzero solution x is called an eigenvector of A associated with the eigenvalue λ.
• Rewriting Ax = λx as a homogeneous linear system gives
(A − λI)x = 0.
• By Cramer's theorem, this system has a nontrivial solution if and only if
det(A − λI) = 0.
This equation is called the characteristic equation of A, and the determinant det(A − λI) is called the characteristic polynomial.
Example 1
Find the eigenvalues and eigenvectors of
A = [ −5  2 ;  2  −2 ].
Step 1: Eigenvalues
det(A − λI) = det [ −5−λ  2 ;  2  −2−λ ] = (−5 − λ)(−2 − λ) − 4 = λ² + 7λ + 6 = 0.
Solving, we get:
λ1 = −1,  λ2 = −6.
Step 2: Eigenvectors
For λ1 = −1:
(A + I)x = [ −4  2 ;  2  −1 ] x = 0  ⇒  x1 = [1, 2]ᵀ.
For λ2 = −6:
(A + 6I)x = [ 1  2 ;  2  4 ] x = 0  ⇒  x2 = [2, −1]ᵀ.
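These results are easy to check numerically. The following is a minimal NumPy sketch (assuming NumPy is available; it is only a verification, not part of the hand computation above):

```python
import numpy as np

A = np.array([[-5.0,  2.0],
              [ 2.0, -2.0]])

# numpy returns the eigenvalues and unit-norm eigenvectors (columns of V)
eigvals, V = np.linalg.eig(A)
print(eigvals)            # approximately [-1., -6.] (order may differ)

# Eigenvectors are determined only up to a nonzero scale, so [1, 2] and
# [2, -1] above are proportional to the columns returned by numpy.
for lam, v in zip(eigvals, V.T):
    print(np.allclose(A @ v, lam * v))   # True for each eigenpair
```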
Theorem 1: Eigenvalues
• The eigenvalues of a square matrix are the roots of its characteristic polynomial.
• If w and x are eigenvectors corresponding to the same eigenvalue λ, then w + x (provided w + x ≠ 0) and kx (for any k ∈ R, k ≠ 0) are also eigenvectors corresponding to λ.
• The eigenspace of λ is the set of all eigenvectors corresponding to λ, together with the zero vector.
Example 2:
A = [ −2  2  −3 ;  2  1  −6 ;  −1  −2  0 ].
Characteristic equation:
det(A − λI) = −λ³ − λ² + 21λ + 45 = −(λ − 5)(λ + 3)² = 0,
so the eigenvalues are λ1 = 5 and λ2 = λ3 = −3 (a double eigenvalue).
For λ = 5: Gauss elimination of (A − 5I)x = 0 gives
[ −7  2  −3 ;  0  −24/7  −48/7 ;  0  0  0 ] x = 0  ⇒  x = [1, 2, −1]ᵀ.
For λ = −3: Gauss elimination of (A + 3I)x = 0 gives
[ 1  2  −3 ;  0  0  0 ;  0  0  0 ] x = 0,
so x1 = −2x2 + 3x3 with x2 and x3 free. Two linearly independent eigenvectors are, for example, [−2, 1, 0]ᵀ and [3, 0, 1]ᵀ. Hence λ = −3 is a double eigenvalue whose eigenspace is two-dimensional: the geometric multiplicity equals the algebraic multiplicity for this value.
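As a quick numeric cross-check, a short NumPy sketch (with the matrix of this example; NumPy is assumed) confirms both the eigenvalues and the dimension of the eigenspace of λ = −3:

```python
import numpy as np

A = np.array([[-2.0,  2.0, -3.0],
              [ 2.0,  1.0, -6.0],
              [-1.0, -2.0,  0.0]])

eigvals, V = np.linalg.eig(A)
print(np.round(eigvals, 6))                      # 5 once and -3 twice (up to ordering)

# Geometric multiplicity of lambda = -3: rank(A + 3I) = 1,
# so the eigenspace has dimension 3 - 1 = 2.
print(np.linalg.matrix_rank(A + 3 * np.eye(3)))  # 1
```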
Consider:
A = [ 5  3 ;  3  5 ].
Characteristic equation:
(5 − λ)² − 9 = 0  ⇒  λ = 8, 2.
Eigenvectors:
λ = 8: x = [1, 1]ᵀ,   λ = 2: x = [1, −1]ᵀ.
The principal directions are given by the eigenvectors, and the deformation leads to an ellipse with equation:
z1²/8² + z2²/2² = 1.
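A short NumPy sketch (for illustration; z denotes coordinates along the principal directions, and NumPy is assumed) confirms that points of the unit circle are mapped onto this ellipse:

```python
import numpy as np

A = np.array([[5.0, 3.0],
              [3.0, 5.0]])
lams, X = np.linalg.eigh(A)                  # symmetric matrix: orthonormal eigenvectors
t = np.linspace(0.0, 2.0 * np.pi, 200)
circle = np.vstack([np.cos(t), np.sin(t)])   # points on the unit circle
z = X.T @ (A @ circle)                       # image points in principal coordinates

# eigh lists the eigenvalues in ascending order (2, then 8); each image point
# satisfies (z1/lam1)^2 + (z2/lam2)^2 = 1, i.e. the ellipse above up to axis labeling.
print(np.allclose((z[0] / lams[0])**2 + (z[1] / lams[1])**2, 1.0))   # True
```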
Mass–spring systems involving several masses and springs can be treated as eigenvalue problems. For instance,
consider the mechanical system in Fig. 161, which is governed by the system of ODEs
y′′ = A y,  i.e.  [ y1′′ ; y2′′ ] = [ −5  2 ;  2  −2 ] [ y1 ; y2 ],   (7)
where y1 and y2 are the displacements of the masses from rest, as shown in the figure, and primes denote derivatives with respect to time t.
We consider a mechanical system (e.g. two masses on springs) whose motion can be described by exponential solutions of the form
y = x e^{ωt}.   (8)
Substituting this into ÿ = A y gives
ω² x e^{ωt} = A x e^{ωt}.
Dividing by the nonzero factor e^{ωt}, we obtain the eigenvalue problem
A x = λ x,  where λ = ω².   (9)
Example:
From Example 1 in Sec. 8.1, A has the eigenvalues λ1 = −1 and λ2 = −6. Hence
ω = ±√(−λ1) = ±i  and  ω = ±√(−λ2) = ±i√6.
The corresponding complex solutions are
x1 e^{±it} = x1 (cos t ± i sin t),   x2 e^{±i√6 t} = x2 (cos(√6 t) ± i sin(√6 t)).
Taking real and imaginary parts (i.e., suitable linear combinations), we obtain four real solutions:
x1 cos t,   x1 sin t,   x2 cos(√6 t),   x2 sin(√6 t).
The general solution is
y = a1 x1 cos t + b1 x1 sin t + a2 x2 cos(√6 t) + b2 x2 sin(√6 t),
where a1 , b1 , a2 , b2 are constants determined by initial conditions. These describe harmonic oscillations (in the
absence of damping).
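As a sanity check, the short NumPy sketch below (using the eigenvectors x1 = [1, 2]ᵀ and x2 = [2, −1]ᵀ from Example 1; NumPy is assumed) verifies that one such combination, y(t) = x1 cos t + x2 cos(√6 t), satisfies y′′ = A y:

```python
import numpy as np

A = np.array([[-5.0,  2.0],
              [ 2.0, -2.0]])
x1 = np.array([1.0, 2.0])     # eigenvector for lambda1 = -1
x2 = np.array([2.0, -1.0])    # eigenvector for lambda2 = -6
w2 = np.sqrt(6.0)

t = np.linspace(0.0, 10.0, 50)
y = np.outer(x1, np.cos(t)) + np.outer(x2, np.cos(w2 * t))
# Exact second derivative of y(t):
ydd = -np.outer(x1, np.cos(t)) - 6.0 * np.outer(x2, np.cos(w2 * t))
print(np.allclose(ydd, A @ y))   # True: y'' = A y at every sampled time
```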
Symmetric, Skew-Symmetric, and Orthogonal Matrices
• Symmetric: Aᵀ = A
• Skew-symmetric: Aᵀ = −A
• Orthogonal: Aᵀ = A⁻¹
• Decomposition: Any real n × n matrix A can be written uniquely as the sum of a symmetric matrix R and a skew-symmetric matrix S:
A = R + S,   R = (1/2)(A + Aᵀ),   S = (1/2)(A − Aᵀ).
Example: Illustration
A = [ 9  5  2 ;  2  3  −8 ;  5  4  3 ] = [ 9.0  3.5  3.5 ;  3.5  3.0  −2.0 ;  3.5  −2.0  3.0 ] + [ 0  1.5  −1.5 ;  −1.5  0  −6.0 ;  1.5  6.0  0 ],
where the first summand is R (symmetric) and the second is S (skew-symmetric).
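The decomposition is easy to reproduce numerically; a minimal NumPy sketch (assuming NumPy is available) is:

```python
import numpy as np

A = np.array([[9.0, 5.0,  2.0],
              [2.0, 3.0, -8.0],
              [5.0, 4.0,  3.0]])
R = (A + A.T) / 2      # symmetric part
S = (A - A.T) / 2      # skew-symmetric part

print(np.allclose(A, R + S))    # True
print(np.allclose(R, R.T))      # True: R is symmetric
print(np.allclose(S, -S.T))     # True: S is skew-symmetric
```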
An orthogonal transformation is a transformation
y = Ax,
with A an orthogonal matrix. A plane rotation is a typical example:
[ y1 ; y2 ] = [ cos θ  −sin θ ;  sin θ  cos θ ] [ x1 ; x2 ],
where the rotation matrix is orthogonal.
For every orthogonal n × n matrix A, the inner product is invariant:
a · b = (Aa) · (Ab),
and hence so is the norm:
∥a∥ = √(a · a) = √((Aa) · (Aa)).
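A quick NumPy check (with an arbitrary rotation angle chosen only for the illustration) shows this invariance for a plane rotation:

```python
import numpy as np

theta = 0.7                       # arbitrary angle for the demonstration
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

a = np.array([1.0, 2.0])
b = np.array([-3.0, 0.5])

print(np.allclose(A.T @ A, np.eye(2)))                        # A^T = A^-1 (orthogonal)
print(np.isclose(a @ b, (A @ a) @ (A @ b)))                   # inner product preserved
print(np.isclose(np.linalg.norm(a), np.linalg.norm(A @ a)))   # norm preserved
```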
A real square matrix A is orthogonal if and only if its column vectors a1, . . . , an (and also its row vectors) form an orthonormal system, that is,
aj · ak = ajᵀ ak = 0 if j ≠ k,  and  = 1 if j = k.
Indeed,
I = A⁻¹ A = Aᵀ A = [ a1ᵀ ; … ; anᵀ ] [ a1 ··· an ] = [ a1ᵀa1  a1ᵀa2  ···  a1ᵀan ;  …  ;  anᵀa1  anᵀa2  ···  anᵀan ],
so the (j, k) entry of Aᵀ A is ajᵀ ak, which must equal the corresponding entry of the identity matrix.
The eigenvalues of an orthogonal matrix A are either real or occur in complex-conjugate pairs, and all of them have absolute value 1. As an example, consider an orthogonal matrix whose characteristic equation is
−λ³ + (2/3)λ² + (2/3)λ − 1 = 0.
Since at least one eigenvalue must be real (the characteristic polynomial is a cubic with real coefficients) and every eigenvalue has absolute value 1, we test λ = +1 and λ = −1. Substituting λ = −1 satisfies the polynomial, so λ = −1 is indeed a root.
Dividing out the factor (λ + 1), the characteristic polynomial factors as
−(λ + 1)(λ² − (5/3)λ + 1) = 0.
Hence the other two eigenvalues are the roots of λ² − (5/3)λ + 1 = 0, which come out to
λ = (5 ± i√11)/6,
a complex-conjugate pair with |λ| = 1.
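These roots can also be obtained numerically from the cubic quoted above; a minimal NumPy sketch (verification only) is:

```python
import numpy as np

# Coefficients of -lambda^3 + (2/3)lambda^2 + (2/3)lambda - 1, highest power first
roots = np.roots([-1.0, 2.0 / 3.0, 2.0 / 3.0, -1.0])
print(roots)            # -1 and (5 +/- i*sqrt(11)) / 6
print(np.abs(roots))    # all three have absolute value 1
```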
Theorem (Basis of Eigenvectors). If an n × n matrix A has n distinct eigenvalues, then A has a basis of eigenvectors
x1, x2, . . . , xn
for Rn.
Example:
A = [ 5  3 ;  3  5 ]
has a basis of eigenvectors [1, 1]ᵀ, [1, −1]ᵀ, corresponding to the eigenvalues λ1 = 8 and λ2 = 2. (See Example 1 in Sec. 8.2.)
Even if not all n eigenvalues are different, a matrix A may still provide an eigenbasis for Rn; see Example 2 above.
On the other hand, a matrix A may not have enough linearly independent eigenvectors to form a basis. For example, a 2 × 2 matrix with a repeated eigenvalue may have all of its eigenvectors of the form
[k, 0]ᵀ (with k ≠ 0 arbitrary),
i.e. only one independent eigenvector direction. Such an A cannot be diagonalized by a basis of eigenvectors.
Actually, eigenbases exist under more general conditions than those in Theorem 1. An important case is that of symmetric matrices: every real symmetric matrix has an orthonormal basis of eigenvectors for Rn.
Similar matrices: An n × n matrix Â is called similar to A if
Â = P⁻¹ A P
for some nonsingular matrix P.
• If Â is similar to A, then Â has the same eigenvalues as A.
• If x is an eigenvector of A for an eigenvalue λ, then
y = P⁻¹ x
is an eigenvector of Â for the same eigenvalue.
Let
A = [ 6  −3 ;  4  −1 ],   P = [ 1  3 ;  1  4 ].
Then
Â = P⁻¹ A P = [ 4  −3 ;  −1  1 ] [ 6  −3 ;  4  −1 ] [ 1  3 ;  1  4 ] = [ 3  0 ;  0  2 ].
(Here P⁻¹ was obtained from an earlier calculation in Sec. 7.8, using det P = 1.)
The characteristic equation of A is
(6 − λ)(−1 − λ) + 12 = λ² − 5λ + 6 = 0,
which has the same roots λ1 = 3, λ2 = 2. This confirms that  and A share eigenvalues, as expected for similar
matrices.
We can also check the eigenvectors. From the first component of (A − λI)x = 0:
(6 − λ)x1 − 3x2 = 0.
For λ = 3, this gives 3x1 − 3x2 = 0 ⇒ x1 = x2. Hence an eigenvector is x1 = [1, 1]ᵀ.
For λ = 2, it gives 4x1 − 3x2 = 0, so an eigenvector is x2 = [3, 4]ᵀ. Then
y1 = P⁻¹ x1 = [ 4  −3 ;  −1  1 ] [ 1 ; 1 ] = [ 1 ; 0 ],   y2 = P⁻¹ x2 = [ 4  −3 ;  −1  1 ] [ 3 ; 4 ] = [ 0 ; 1 ].
Indeed, these y1 , y2 are eigenvectors of the diagonal matrix Â. Thus, we see clearly that the columns of P are
precisely the eigenvectors of A, which is the standard method for diagonalizing a matrix.
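A short NumPy check of this example (for illustration only) confirms both the diagonal form of Â and the shared eigenvalues:

```python
import numpy as np

A = np.array([[6.0, -3.0],
              [4.0, -1.0]])
P = np.array([[1.0, 3.0],
              [1.0, 4.0]])

A_hat = np.linalg.inv(P) @ A @ P
print(np.round(A_hat, 10))                   # diag(3, 2)
print(np.sort(np.linalg.eigvals(A)))         # [2. 3.]
print(np.sort(np.linalg.eigvals(A_hat)))     # [2. 3.]  (same eigenvalues)
```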
Diagonalization: If an n × n matrix A has a basis of eigenvectors, then
D = X⁻¹ A X
is diagonal, with the eigenvalues of A on its main diagonal. Here, X is the matrix whose columns are the
eigenvectors of A.
Example: Diagonalize
A = [ 7.3  0.2  −3.7 ;  −11.5  1.0  5.5 ;  17.7  1.8  −9.3 ].
The characteristic equation is
−λ³ − λ² + 12λ = 0,
with roots λ1 = 3, λ2 = −4, λ3 = 0.
By applying Gauss elimination to (A−λI)x = 0 for each λ = λ1 , λ2 , λ3 , we find the corresponding eigenvectors.
Then, using Gauss–Jordan elimination (Sec. 7.8, Example 1) on those eigenvectors (arranged as columns) gives
X = [ −1  1  2 ;  3  −1  1 ;  −1  3  4 ],   X⁻¹ = [ −0.7  0.2  0.3 ;  −1.3  −0.2  0.7 ;  0.8  0.2  −0.2 ].
D = X⁻¹ A X = [ −0.7  0.2  0.3 ;  −1.3  −0.2  0.7 ;  0.8  0.2  −0.2 ] [ −3  −4  0 ;  9  4  0 ;  −3  −12  0 ] = [ 3  0  0 ;  0  −4  0 ;  0  0  0 ],
where the middle factor is the product A X.
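The same diagonalization can be reproduced with a few lines of NumPy (a numeric check, not the hand computation above):

```python
import numpy as np

A = np.array([[  7.3, 0.2, -3.7],
              [-11.5, 1.0,  5.5],
              [ 17.7, 1.8, -9.3]])
X = np.array([[-1.0,  1.0, 2.0],
              [ 3.0, -1.0, 1.0],
              [-1.0,  3.0, 4.0]])

D = np.linalg.inv(X) @ A @ X
print(np.round(D, 6))          # approximately diag(3, -4, 0)
```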
Quadratic forms: A quadratic form Q in the components x1, . . . , xn of a vector x is
Q = xᵀ A x = ΣⱼΣₖ ajk xj xk   (j, k = 1, . . . , n)  = a11 x1² + a12 x1 x2 + · · · + ann xn²,
where A = [ajk] is the coefficient matrix of the form.
• We usually assume A is symmetric. In that case, A admits an orthonormal basis of eigenvectors. Let X be the orthogonal matrix whose columns are these eigenvectors; then
D = X⁻¹ A X = Xᵀ A X,
where D is the diagonal matrix of the eigenvalues of A.
Hence, substituting x = Xy,
Q = xᵀ A x = (Xy)ᵀ A (Xy) = yᵀ Xᵀ A X y = yᵀ D y.
Principal axes theorem: Under the substitution x = Xy, a quadratic form Q = xᵀ A x (with A real and symmetric) is transformed into the principal axes form
Q = λ1 y1² + λ2 y2² + · · · + λn yn²,
where λ1, . . . , λn are the eigenvalues of A, and X is an orthogonal matrix whose columns are the corresponding eigenvectors.
Consider
xᵀ A x = [x1  x2] [ 3  4 ;  6  2 ] [ x1 ; x2 ] = 3x1² + 10x1x2 + 2x2².
Note that A is not symmetric as given, but one can form the corresponding symmetric matrix
C = (1/2)(A + Aᵀ) = [ 3  (4+6)/2 ;  (6+4)/2  2 ] = [ 3  5 ;  5  2 ],
so that
xᵀ C x = 3x1² + 10x1x2 + 2x2².
Either way, the quadratic form is Q = 3x1² + 10x1x2 + 2x2². Diagonalizing C (or, directly, applying the principal axes theorem) expresses Q as λ1 y1² + λ2 y2², where λ1 and λ2 are the eigenvalues of C.
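A small NumPy check (with an arbitrary test vector, for illustration only) confirms that A and its symmetric part C produce the same quadratic form:

```python
import numpy as np

A = np.array([[3.0, 4.0],
              [6.0, 2.0]])
C = (A + A.T) / 2                  # [[3, 5], [5, 2]]
x = np.array([1.7, -0.4])          # arbitrary test vector

q_A = x @ A @ x
q_C = x @ C @ x
q_explicit = 3 * x[0]**2 + 10 * x[0] * x[1] + 2 * x[1]**2
print(np.isclose(q_A, q_C), np.isclose(q_C, q_explicit))   # True True
```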
• As a further example, suppose a quadratic form Q = xᵀ A x (A symmetric) has eigenvalues 2 and 32 and is constrained by Q = 128. Transforming to coordinates y aligned with the corresponding eigenvectors (the principal axes), the quadratic form becomes
Q = 2y1² + 32y2².
Thus,
2y1² + 32y2² = 128  ⟹  y1²/8² + y2²/2² = 1,
which is an ellipse.
8.5.1 Notations
• For a matrix A = [ajk] with complex entries ajk = α + iβ (α, β ∈ R), we write Ā for the matrix obtained by taking the complex conjugate of every entry, ājk = α − iβ.
• Āᵀ denotes the transpose of Ā; it is called the conjugate transpose of A.
Example. A = [ 3+4i  1−i ;  6  2−5i ]  ⟹  Ā = [ 3−4i  1+i ;  6  2+5i ],   Āᵀ = [ 3−4i  6 ;  1+i  2+5i ].
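In NumPy, the conjugate and the conjugate transpose of this example are obtained as follows (a short check, assuming NumPy):

```python
import numpy as np

A = np.array([[3 + 4j, 1 - 1j],
              [6 + 0j, 2 - 5j]])

print(np.conj(A))       # elementwise complex conjugate, i.e. A-bar
print(np.conj(A).T)     # conjugate transpose (equivalently A.conj().T)
```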
• A is called Hermitian if Āᵀ = A (equivalently ākj = ajk).
• A is skew-Hermitian if Āᵀ = −A.
• A is unitary if Āᵀ = A⁻¹.
Matrix   Type        Characteristic equation   Eigenvalues
A        Hermitian   λ² − 11λ + 18 = 0         9, 2
C        Unitary     λ² − iλ − 1 = 0           (√3)/2 + (1/2)i,  −(√3)/2 + (1/2)i

Table 8.1: Examples of Hermitian, Skew-Hermitian, and Unitary matrices with their characteristic equations and eigenvalues.
• The eigenvalues of a Hermitian matrix (and thus of a real symmetric matrix) are real.
• The eigenvalues of a skew-Hermitian matrix (and thus of a real skew-symmetric matrix) are purely
imaginary or zero.
• The eigenvalues of a unitary matrix (and thus of an orthogonal matrix) all have absolute value 1, as illustrated in the sketch below.
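These three properties are easy to observe numerically. The sketch below uses small example matrices of its own (not the matrices of Table 8.1) that are Hermitian, skew-Hermitian, and unitary, respectively:

```python
import numpy as np

H = np.array([[2, 1 - 1j], [1 + 1j, 3]])          # Hermitian: conj(H).T == H
S = np.array([[1j, 2 + 1j], [-2 + 1j, 0]])        # skew-Hermitian: conj(S).T == -S
U = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)     # unitary: conj(U).T @ U == I

print(np.linalg.eigvals(H))            # real (imaginary parts ~ 0)
print(np.linalg.eigvals(S))            # purely imaginary
print(np.abs(np.linalg.eigvals(U)))    # all equal to 1
```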
A unitary transformation y = Ax, with A unitary, preserves the value of the (complex) inner product
a · b = āᵀ b,
and hence also the norm
∥a∥ = √(āᵀ a) = √(|a1|² + · · · + |an|²).
A unitary system is a set of complex vectors satisfying
aj · ak = ājᵀ ak = 0 if j ≠ k,  and  = 1 if j = k.
A complex square matrix A is unitary if and only if its column vectors (and its row vectors) form a
unitary system.
The determinant of a unitary matrix has absolute value 1: |det(A)| = 1.
Any Hermitian, skew-Hermitian, or unitary matrix admits a basis of eigenvectors in Cn that forms a
unitary system.
For complex vectors, the corresponding form is
x̄ᵀ A x = ΣⱼΣₖ ajk x̄j xk   (j, k = 1, . . . , n).
Its value is real or purely imaginary (or zero) if A is a Hermitian or a skew-Hermitian matrix, respectively.