The Real-Symmetric Spectral Theorem

The document discusses the properties of real symmetric matrices, focusing on eigenvalues and eigenvectors, and establishes that all eigenvalues are real and eigenvectors corresponding to distinct eigenvalues are orthogonal. It provides proofs for the existence of real eigenvalues and an orthonormal basis of eigenvectors, culminating in the Spectral Theorem which states that every real symmetric matrix can be diagonalized by an orthogonal change of basis. The document includes illustrative examples and detailed mathematical derivations to support these concepts.


0. Preliminaries: Vector Spaces, Inner Products, Matrices


1. Field ℝ

o Contains elements 0 and 1 with the usual arithmetic; closed under + and ×; + and × are associative and commutative; × distributes over +; every nonzero element has a multiplicative inverse; ℝ is a complete ordered field.
2. Real Vector Space ℝⁿ

o Elements are n-tuples x=(x₁,…,xₙ).

o Vector addition and scalar multiplication satisfy the eight vector-space axioms (commutativity, associativity, zero vector, additive inverses, distributivity, etc.).
3. Standard Inner Product

o For x,y∈ℝⁿ, define

⟨x, y⟩ := ∑ᵢ₌₁ⁿ xᵢ yᵢ.

o Axioms:

1. Symmetry: ⟨x,y⟩=⟨y,x⟩.
2. Linearity in the first slot: ⟨αx+βz,y⟩=α⟨x,y⟩+β⟨z,y⟩.
3. Positive‐definiteness: ⟨x,x⟩>0 for x≠0; ⟨x,x⟩=0⇔x=0.
4. Matrices and Transpose

o A∈ℝ^{n×n} acts on x∈ℝⁿ by the usual rule.

o The transpose Aᵀ satisfies (Aᵀ)ᵢⱼ = Aⱼᵢ.

o Symmetric: A = Aᵀ means Aᵢⱼ = Aⱼᵢ for all i, j.

5. Orthogonality and Orthonormal Sets

o Vectors u,v are orthogonal iff ⟨u,v⟩=0.

o A set {q₁,…,qₖ} is orthonormal if ⟨qᵢ,qⱼ⟩ = δᵢⱼ (Kronecker delta).
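As a concrete instance, the columns of a 2×2 rotation matrix form an orthonormal set; a minimal numerical check of ⟨qᵢ,qⱼ⟩ = δᵢⱼ (the 45° rotation is an arbitrary choice for illustration):

```python
import numpy as np

# Columns of a 45-degree rotation matrix: an orthonormal set in R^2.
c = 1 / np.sqrt(2)
q1 = np.array([c, c])
q2 = np.array([-c, c])

print(np.isclose(q1 @ q1, 1.0))  # True: <q1,q1> = 1 (unit norm)
print(np.isclose(q2 @ q2, 1.0))  # True: <q2,q2> = 1
print(np.isclose(q1 @ q2, 0.0))  # True: <q1,q2> = 0 (orthogonal)
```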

6. Eigenvalues and Eigenvectors

o λ∈ℂ and v∈ℂⁿ∖{0} satisfy Av = λv. Such a pair (λ,v) is an eigenpair.

o If v can be chosen in ℝⁿ, we call it a real eigenvector.


1. All Eigenvalues Are Real
Let A∈ℝ^{n×n} be symmetric. Suppose (λ,v) is an eigenpair with v∈ℂⁿ,
v≠0. We will show λ∈ℝ.
1. Form the standard Hermitian inner product on ℂⁿ:

⟨x, y⟩ := ∑ᵢ₌₁ⁿ xᵢ ȳᵢ,

where the bar denotes complex conjugation.

2. Compute

⟨Av, v⟩ = ∑ᵢ (Av)ᵢ v̄ᵢ = ∑ᵢ ∑ⱼ Aᵢⱼ vⱼ v̄ᵢ = v* A v = v* (λv) = λ v* v = λ ⟨v, v⟩.

3. On the other hand, since A is real and symmetric, Aᵀ = A implies A is Hermitian (A* = A), so

⟨Av, v⟩ = ⟨v, A*v⟩ = ⟨v, Av⟩ = conj(⟨Av, v⟩).

Hence the scalar ⟨Av, v⟩ equals its own complex conjugate, so it is real.
4. Thus

λ ⟨v, v⟩ = ⟨Av, v⟩ ∈ ℝ.
But ⟨v,v⟩>0 (since v≠0). Therefore λ is real.
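The conclusion of §1 can be checked numerically; a minimal NumPy sketch (the matrix size and random seed are arbitrary choices for illustration):

```python
import numpy as np

# Any real square B yields a real symmetric matrix via (B + B^T)/2.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2

# The general eigensolver works over C; for symmetric A the
# imaginary parts must vanish (up to floating-point noise).
eigvals = np.linalg.eigvals(A)
print(np.max(np.abs(eigvals.imag)) < 1e-10)  # True: all eigenvalues are real
```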

2. Eigenvectors for Distinct Eigenvalues Are Orthogonal


Let λ₁≠λ₂ be two real eigenvalues with real eigenvectors v₁,v₂∈ℝⁿ. Then
1. ⟨Av₁,v₂⟩ = ⟨λ₁v₁, v₂⟩ = λ₁⟨v₁,v₂⟩.
2. ⟨v₁,Av₂⟩ = ⟨v₁,λ₂v₂⟩ = λ₂⟨v₁,v₂⟩.
3. But by symmetry of A, ⟨Av₁,v₂⟩ = ⟨v₁,Av₂⟩. Hence
λ₁ ⟨v₁, v₂⟩ = λ₂ ⟨v₁, v₂⟩ ⇒ (λ₁ − λ₂) ⟨v₁, v₂⟩ = 0.
Since λ₁≠λ₂, we get ⟨v₁,v₂⟩=0.
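A quick numerical illustration of §2 (the symmetric matrix and its hand-computed eigenpairs below are hypothetical choices for the sketch):

```python
import numpy as np

A = np.array([[1., 2.],
              [2., -2.]])   # symmetric; eigenvalues 2 and -3

# Hand-computed eigenvectors for the two distinct eigenvalues:
v1 = np.array([2., 1.])     # A v1 = 2 v1
v2 = np.array([1., -2.])    # A v2 = -3 v2

print(np.allclose(A @ v1, 2 * v1))   # True: (2, v1) is an eigenpair
print(np.allclose(A @ v2, -3 * v2))  # True: (-3, v2) is an eigenpair
print(v1 @ v2)                       # 0.0: orthogonal, as the theorem predicts
```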

3. Existence of n Real Eigenvalues and an Orthonormal Basis
We now prove there are n real eigenvalues (counting algebraic multiplicity)
and an orthonormal basis of eigenvectors. We give two proofs.
3.1 Algebraic Proof via the Characteristic Polynomial
1. The characteristic polynomial of A is
p_A(λ) := det(A − λI) ∈ ℝ[λ],

a real‐coefficient polynomial of degree n.


2. By the Fundamental Theorem of Algebra, p_A(λ) factors over ℂ as
p_A(λ) = (λ₁ − λ)(λ₂ − λ) ⋯ (λₙ − λ)

for some λ₁,…,λₙ∈ℂ (counted with multiplicity).


3. From §1, each λᵢ must be real. Hence all n roots are in ℝ.
4. For each (real) root λᵢ, there is a nonzero eigenvector vᵢ with (A−λᵢI)vᵢ=0,
so the geometric multiplicity of λᵢ is ≥1 and ≤ its algebraic multiplicity.
5. Within each eigenspace E_{λᵢ} = ker(A−λᵢI), choose a basis and then
apply the Gram–Schmidt process to convert it into an orthonormal basis. By
§2, vectors from distinct eigenspaces are already orthogonal. Altogether, we
obtain n orthonormal eigenvectors {q₁,…,qₙ}.
6. Form Q = [q₁ … qₙ] ∈ ℝ^{n×n}. Then QᵀQ = I and

AQ = [Aq₁ … Aqₙ] = [λ₁q₁ … λₙqₙ] = Q diag(λ₁, …, λₙ).

Hence

A = Q diag(λ₁, …, λₙ) Qᵀ = QΛQᵀ.
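Steps 4–6 can be mirrored numerically: compute each eigenspace as a null space of A − λI, orthonormalize within it, and assemble Q. A minimal sketch, assuming a hypothetical 3×3 matrix with a repeated eigenvalue (the `gram_schmidt` helper is defined here, not a library routine, and mirrors step 5):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors (step 5)."""
    out = []
    for v in vectors:
        for q in out:
            v = v - (q @ v) * q   # subtract the projection onto earlier q's
        out.append(v / np.linalg.norm(v))
    return out

A = np.array([[2., 1., 0.],
              [1., 2., 0.],
              [0., 0., 3.]])      # symmetric; eigenvalues 1, 3, 3

lams = np.unique(np.round(np.linalg.eigvalsh(A), 8))  # distinct real roots
qs, spectrum = [], []
for lam in lams:
    # Null-space basis of A - lam*I: rows of Vt with ~zero singular value.
    _, s, Vt = np.linalg.svd(A - lam * np.eye(A.shape[0]))
    basis = [Vt[i] for i in range(len(s)) if s[i] < 1e-8]
    for q in gram_schmidt(basis):
        qs.append(q)
        spectrum.append(lam)

Q = np.column_stack(qs)
Lam = np.diag(spectrum)
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: columns are orthonormal
print(np.allclose(Q @ Lam @ Q.T, A))     # True: A = Q Lam Q^T
```

Vectors from distinct eigenspaces come out orthogonal automatically, exactly as §2 guarantees; Gram–Schmidt is only needed within each eigenspace.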

3.2 Analytic Proof via the Rayleigh Quotient and Induction


This proof avoids the characteristic polynomial altogether.
1. Define the Rayleigh quotient of A by

R(x) := ⟨Ax, x⟩ / ⟨x, x⟩,  x ≠ 0.
Restricted to the unit sphere S^{n−1}={x:‖x‖=1}, R is a continuous
function on a compact set, so it attains a maximum μ and a minimum ν,
both real.
2. Let q be any unit vector achieving the maximum μ. We show Aq = μq:
 For any vector h orthogonal to q (⟨q,h⟩ = 0), consider f(t) = R(q + th). Because R has an extremum at q on the sphere, the directional derivative in any tangent direction must vanish. Differentiating the quotient at t = 0 (using the symmetry of A) gives

⟨Aq, h⟩ ⟨q, q⟩ − ⟨Aq, q⟩ ⟨q, h⟩ = 0,

and since ⟨q, q⟩ = 1, ⟨Aq, q⟩ = μ, and ⟨q, h⟩ = 0, this reduces to

⟨Aq, h⟩ = μ ⟨q, h⟩ = 0.

 Since this holds for all h ⊥ q, the vector Aq has no component orthogonal to q, hence Aq is a multiple of q; the coefficient is ⟨Aq, q⟩ = μ, so Aq = μq. Thus (μ, q) is an eigenpair.

3. Now set V₁ = span{q}. Its orthogonal complement V₁^⊥ has dimension


n−1 and is invariant under A: if x⊥q then
⟨Ax, q⟩ = ⟨x, Aq⟩ = ⟨x, μq⟩ = μ ⟨x, q⟩ = 0,
so A x∈V₁^⊥.
4. By induction on dimension, there is an orthonormal basis of V₁^⊥
consisting of eigenvectors of A|_{V₁^⊥}. Adjoining q gives an orthonormal
basis of ℝⁿ of eigenvectors.
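Step 2 of the analytic proof can be imitated numerically: ascend the Rayleigh quotient along the unit sphere and observe that the maximizer is an eigenvector for the largest eigenvalue. A sketch under assumed tuning choices (the step size 0.1, iteration count 200, matrix, and seed are all hypothetical):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])           # symmetric; eigenvalues 1 and 3

def rayleigh(x):
    return (x @ A @ x) / (x @ x)

# Projected gradient ascent on the unit sphere: move along the gradient
# of R, then renormalize. At a maximizer q, the tangential gradient
# 2(Aq - R(q) q) vanishes, i.e. Aq = mu*q.
rng = np.random.default_rng(1)
x = rng.standard_normal(2)
x /= np.linalg.norm(x)
for _ in range(200):
    g = 2 * (A @ x - rayleigh(x) * x)
    x = x + 0.1 * g
    x /= np.linalg.norm(x)

mu = rayleigh(x)
print(round(mu, 6))                            # 3.0: the maximum of R
print(np.allclose(A @ x, mu * x, atol=1e-6))   # True: (mu, x) is an eigenpair
```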

4. Putting It All Together: Spectral Decomposition


 Eigenvalues λ₁,…,λₙ are real (possibly with repetition).

 Eigenvectors q₁,…,qₙ can be chosen orthonormal.

 Assemble Q = [q₁ … qₙ], Λ = diag(λ₁, …, λₙ). Then

QᵀQ = I,  A = QΛQᵀ.
Equivalently,

A = ∑ᵢ₌₁ⁿ λᵢ qᵢ qᵢᵀ,

where each projector Pᵢ = qᵢ qᵢᵀ is rank-1, PᵢPⱼ = 0 for i ≠ j, and ∑ᵢ Pᵢ = I.
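The projector identities can be verified numerically on an arbitrary symmetric matrix (the size and random seed are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((3, 3))
A = (B + B.T) / 2                 # arbitrary real symmetric matrix
vals, Q = np.linalg.eigh(A)       # orthonormal eigenvectors in Q's columns

# Rank-1 spectral projectors P_i = q_i q_i^T.
P = [np.outer(Q[:, i], Q[:, i]) for i in range(3)]

print(np.allclose(sum(v * p for v, p in zip(vals, P)), A))  # True: A = sum λ_i P_i
print(np.allclose(sum(P), np.eye(3)))                       # True: sum P_i = I
print(np.allclose(P[0] @ P[1], 0))                          # True: P_i P_j = 0, i ≠ j
```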

5. Illustrative 2×2 Example (All Steps Written Out)


Let

A = ( 2 1
     1 2 ).
1. Verify symmetry: Aᵀ=A.

2. Characteristic polynomial:

det(A − λI) = det( 2−λ  1
                   1  2−λ ) = (2−λ)² − 1 = λ² − 4λ + 3.

3. Solve λ²−4λ+3=0 ⇒ λ=1 or λ=3.

4. Eigenvectors:

o For λ=3: solve (A−3I)v = 0:

( −1  1 ) ( v₁ )
(  1 −1 ) ( v₂ ) = 0 ⇒ v₁ = v₂.

Pick v⁽¹⁾ = (1, 1)ᵀ.

o For λ=1: solve (A−I)v = 0:

( 1 1 ) ( v₁ )
( 1 1 ) ( v₂ ) = 0 ⇒ v₁ = −v₂.

Pick v⁽²⁾ = (1, −1)ᵀ.

5. Normalize:

q₁ = (1/√2) (1, 1)ᵀ,  q₂ = (1/√2) (1, −1)ᵀ.
6. Assemble Q = [q₁ q₂], Λ = diag(3, 1). Check QᵀQ = I and A = QΛQᵀ:

QΛQᵀ = (1/2) ( 1  1 ) ( 3 0 ) ( 1  1 )
             ( 1 −1 ) ( 0 1 ) ( 1 −1 )

     = ( 2 1
        1 2 ) = A.
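The hand computation above can be cross-checked with NumPy's symmetric eigensolver (note that `eigh` orders eigenvalues ascending, so λ=1 comes before λ=3):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
vals, Q = np.linalg.eigh(A)                     # ascending order: [1., 3.]

print(np.allclose(vals, [1., 3.]))              # True: eigenvalues 1 and 3
print(np.allclose(Q.T @ Q, np.eye(2)))          # True: Q^T Q = I
print(np.allclose(Q @ np.diag(vals) @ Q.T, A))  # True: A = Q Λ Q^T
```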

Final Conclusion
Every real symmetric matrix A admits an orthonormal basis of ℝⁿ
consisting of its eigenvectors, and thus is diagonalizable by an orthogonal
change of basis. In symbols:
A = QΛQᵀ,  QᵀQ = I,  Λ = diag(λ₁, …, λₙ) ∈ ℝ^{n×n}.
This is the Spectral Theorem in the real symmetric case, proved in full
detail from field, vector space, and inner product axioms with no omitted
steps.
