
Unraveling Principal Component Analysis

Peter Bloem
Contents

1 A friendly introduction to PCA
    1.1 The basics
    1.2 One-dimensional PCA
    1.3 n-Dimensional PCA
    1.4 Applications of PCA

2 Eigenvectors and eigenvalues
    2.1 Min. reconstruction is max. variance
    2.2 Eigenvectors
        2.2.1 What are eigenvectors?
        2.2.2 The spectral theorem
        2.2.3 The eigenvectors of which matrix?
    2.3 Data normalization and basis transformations
    2.4 Quadratic forms
    2.5 Why is PCA optimal?
        2.5.1 Characterizing the PCA solution

3 Proving the spectral theorem
    3.1 Restating the spectral theorem
    3.2 Determinants
        3.2.1 Computing the 2 × 2 determinant
        3.2.2 Negative determinants
        3.2.3 Towards n × n determinants
        3.2.4 Determinants for n × n matrices
    3.3 The characteristic polynomial
    3.4 Complex numbers
    3.5 The fundamental theorem of algebra
        3.5.1 From one root to n roots
        3.5.2 Back to eigenvalues
    3.6 The spectral theorem
        3.6.1 The Schur decomposition
        3.6.2 Proof of the spectral theorem

4 The singular value decomposition
    4.1 Eigenvalues and singular values
        4.1.1 Singular vectors and principal components
        4.1.2 The singular value decomposition
        4.1.3 Principal Component Analysis by SVD
    4.2 Rank
        4.2.1 Computing rank
    4.3 The pseudo-inverse
    4.4 Making use of the SVD
        4.4.1 Compression and noise removal
        4.4.2 Compressing single images
        4.4.3 Computing rank decompositions
        4.4.4 Recommendation
        4.4.5 PCA as matrix decomposition

5 Computing the ED and SVD
    5.1 Computing eigenvectors
        5.1.1 Power iteration: computing 1 eigenvector
        5.1.2 Orthogonal iteration: adding another eigenvector
        5.1.3 Adding even more eigenvectors
        5.1.4 QR iteration
    5.2 Computing the SVD
        5.2.1 Power iteration for the SVD
        5.2.2 Orthogonal iteration for the SVD
        5.2.3 The QR algorithm for the SVD

A Some helpful linear algebra properties

B Proofs
    B.2 For Chapter 2
    B.3 For Chapter 3
    B.4 For Chapter 4
    B.5 For Chapter 5

Who is this book for? It wasn’t born from a desire to plug a gap
in the market, and I didn’t have a clear idea of the kind of book
I wanted to create when I started. For a long time, it wasn’t
even meant to be a book.
It started when I began teaching machine learning in 2018.
One of the topics was principal component analysis, and I wanted
to do a good job, since it was a topic that had entranced and mys-
tified me in equal measure as a student. I wanted to do it justice.
It turned out that I couldn’t do that in the time that was avail-
able to prepare a single lecture. On the morning before the lec-
ture, going over my slides I caught several mistakes that I couldn’t
quite fix, and many questions that I didn’t know how to answer.
I brushed it off, survived the lecture despite feeling like an im-
poster, and resolved to do better next year. The next year, the
same thing happened again. Different mistakes, different ques-
tions, but the same feeling of cheating myself and the students.
The thing is, as a student, I never delved deeply into the math-
ematics of anything. I never really got to grips with linear alge-
bra and at best, I could only ever claim to have a working grasp
of what I needed to apply it in simple settings, and to look up
what I didn’t know.
This, I expect, is how it is for many students and researchers
alike: we are taught the fundamentals from the ground up, but
we only start tuning in when things become concrete. The foun-
dations, we either never learn, or quickly forget.
After three years of this, I decided it was enough, and I be-
gan to write a blog post on PCA. This would finally force me to
properly come to grips with the subject, however deep the rabbit
hole went. Once I was finished, I could use the blog post as part
of the teaching material. Proof, if any were needed, that I really
did know what I was talking about.
The more I wrote, the more questions I generated for myself,
and the more the blog post spun out of control. I decided to split
the thing in two: the first part a simple self-contained story for
those, like my students, who just wanted to understand enough
of PCA to apply it effectively, and a second part that delved into
the fundamentals.
The pattern continued steadily: to really get down to the foun-
dations, I needed to prove the spectral theorem. This required
so many dependencies it became a third blog post, so it didn't
clutter up the second. Then, I felt, I needed to end on a clear
algorithm for PCA. You can’t say you really understand some-
thing if you don’t know how to implement it. This is always done
through the singular value decomposition (SVD), which I then
felt I should also understand properly. In the end, the expla-
nation of the SVD and the algorithm to compute it became too
big to put into a single blogpost and I split those up too, bring-
ing the tally to five. Five blog posts, with each a little too big to
function well as a blog post.
In short, I realized I had accidentally written a book. Like
I said, I don't really know who it's for. It's about PCA, but it
covers much more than that. Almost everything a linear algebra
textbook does. But it’s not a textbook by any stretch; I certainly
wouldn’t teach from it, and it kind of assumes you know linear
algebra already when you start reading.
It’s also not quite popular science. I occasionally go into
narrative mode, but really, the aim is to go deep. To dig up
the foundations.
I suppose it’s a book for people like me. People who have
learned linear algebra and then forgotten it. Who feel a measure
of regret that they didn’t pay full attention the first time around.
If you’ve ever marveled at the magical results that PCA produces,
and you’d like to really understand it, all the way down to the
fundament, then this book will provide you with a guide. But
perhaps it’s best to think of this as a guided tour of the forests of
linear algebra. I’ve been deep into the woods in search of treasure
and I’ve made it back out. Let me show you what I’ve found.

Version v1.1.0, see https://github.com/pbloem/pca-book/releases.

Acknowledgements I am indebted to Emile van Krieken and
Charlie Lu for corrections and suggestions. My thanks to Nathan
Young for the use of a figure from (Young et al., 2015) and to
John Novembre for useful comments on the interpretation of the
genomic PCA analysis in Chapter 1.

Licensing All figures in this book are released under a Creative
Commons CC-BY-SA license. Source files are available.
CHAPTER 1 · A FRIENDLY
INTRODUCTION TO PRINCIPAL
COMPONENT ANALYSIS

We will work from the outside in: we will view PCA first as a way
of finding a smaller representation of a dataset. This is a typical
machine learning problem: find a compressed representation of
the data such that the reconstructions are as close to the original
as possible. This is a simple view of PCA, and we'll be able to
compute it with nothing more than gradient descent with a few
extra tricks for satisfying constraints.
Most of the technical stuff only becomes necessary when we
want to understand why PCA works so well: this is where the
spectral theorem and the eigenvalues and -vectors come into
the story; they give us a deeper understanding of what we're
doing. We’ll look at these subjects in Chapter 2.
The spectral theorem is the heart of the method, so it pays to
discuss it in some detail. We’ll state it and explain what it means
in Chapter 2, and leave the proof to Chapter 3.
I’ll assume some basic linear algebra knowledge, but I’ll try to
explain the preliminaries where possible, even if they are funda-
mental to linear algebra. There is a small list of identities and
properties in the appendix, which you may want to consult to
refresh your memory.

1.1 The basics


Let’s begin by setting up some basic notation. We are faced with
some high-dimensional dataset of instances (examples of what-
ever we’re studying) described with real-valued features. That
is, we have n instances xi and each instance is described by a
vector of m real values. We describe the dataset as a whole as
an n × m matrix X; that is, we arrange the examples as rows,
and the features as columns.

[Figure: the data matrix X; instance i is a row, feature j is a column, with entry Xij.]

For a simple example, imagine that we have a dataset of 100
people with 2 features measured per person: their monthly salary
and their income over the course of a quarter (i.e. three months).
The second is just the first times 3, so this data is redundant.
One value can be computed from the other, so we really only
need to store one number per person. Here’s what that looks
like in a scatterplot.

Our intuition that we really only need one number to represent
the data is reflected in the fact that the data form a line. So long
as we know what that line is, we only need to know how far
along the line each instance is, so we can store the whole dataset
in one number per instance.

There are also more complex relations between two features that
have this property, like a parabola or an exponential curve. In
PCA we simplify things by only exploiting linear relations.

Of course, data in the wild is never this clean. Let's introduce
some small variations between the monthly salary and the in-
come after three months. Some people may have changed jobs,
some people may get bonuses or vacation allowances, some peo-
ple may have extra sources of income. Here’s a more realistic
version of the data.

The data is no longer perfectly linear, but it still seems pretty lin-
ear. If we imagine the same line we had in the last plot, and
represent each person as a dot along that line, we lose some in-
formation, but we still get a decent reconstruction of the data.
If you know how to do linear regression, you can probably
work out how to draw such a line through the data, predicting
the income from the salary or the other way around (and PCA is
very similar to linear regression in many ways). However we’ll
need something that translates to higher dimensions, where we
don’t have a single target feature to predict.

To do so, we’ll develop a method purely from these first princi-


ples:

1. We want to represent the data using a small set of numbers per instance.

2. We will limit ourselves to linear transformations of the data to derive this small set.

3. We want to minimize the error in our reconstruction of the data. That is, when we map the data back to the original representation, as best we can, we want it to be as close as possible to the original.

1.2 One-dimensional PCA


We’ll develop a one-dimensional version of PCA first. That is,
we will represent each instance xi (row i in our data matrix X),
by a single number zi as best we can. We will call zi the latent
representation of xi.

The phrase “latent” comes from the Latin for being hidden. This
will make more sense when we see some of the other perspec-
tives on PCA.

To start with, we will assume that the data are mean-centered.
That is, we have subtracted the mean of the data so that the
mean of the new dataset is 0 for all features.

For now, think of this as a bit of necessary data pre-processing.
We will see where this step comes from in the next chapter.
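In code, mean-centering is a one-liner. A minimal numpy sketch (the salary figures here are made up for illustration):

```python
import numpy as np

# toy version of the salary data: monthly salary and quarterly income
X = np.array([[2.0,  6.1],
              [3.5, 10.4],
              [2.8,  8.6]])

X = X - X.mean(axis=0)   # subtract the per-feature mean
print(X.mean(axis=0))    # now (approximately) zero for every feature
```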

Our task is to find a linear transformation from xi to zi, and
another linear transformation back again. A linear transforma-
tion from a vector xi to a single number is just the dot prod-
uct with a vector of weights. We’ll call this vector v. A linear
transformation from a single number zi back to a vector is just
the multiplication of zi by another vector of weights. We’ll call
this vector w. This gives us

zi = vT xi
xi′ = zi w

where xi′ is the reconstruction for instance i.


Look closely at that second line. It expresses exactly the in-
tuition we stated earlier: we will choose one line, represented
by the vector w, and then we just represent each data point by
how far along the line it falls, or in other words, we represent
xi′ as a multiple of w.

[Figure: a point xi and its reconstruction xi′ on the line defined by w.]

All allowed values of xi′ are on the dotted red line, which is de-
fined by our choice of w. Where each individual ends up is de-
fined by the multiplier zi, which is determined by the weights v.
Our objective is to choose v and w so that the reconstruction
error, the distance between xi and xi′, is minimized (over all i).

[Figure: the reconstruction error for one point: xi is encoded as zi = vT xi and reconstructed as a point on the line defined by w.]

We can simplify this picture in two ways.


First, note that many different vectors w define the same dot-
ted line in the image above. So long as the vector points in the
same direction, any length of vector defines the same line, and
if we rescale zi properly, the reconstructed points xi′ will also be
the same. To make our solution unique, we will constrain w to
be a unit vector. That is, a vector with length one: wT w = 1.

To be more precise, this doesn't leave a unique solution, but two
solutions, assuming that a single direction is optimal, since the
unit vector can point top right, or bottom left. In other words,
if w is a solution, then so is -w.

The second simplification is that we can get rid of the second
vector v. Imagine that we have some fixed w (that is, the dotted
line is fixed). Can we work out which choice of xi′ on the line
will minimize the reconstruction error? We will go into a bit of
detail here, since it helps to set up some intuitions that will be
important in the later chapters of this book.
If you remember your linear algebra, you’ll know that this
happens when the line of the reconstruction error is orthogonal
to the dotted line. In more fancy language, the optimal xi′ is
the orthogonal projection of xi onto the dotted line. If you look
at the image, it's not too difficult to convince yourself that this
is true. You can imagine the reconstruction error as a kind of
rubber band pulling on xi′, and the point where it's orthogonal
is where it comes to rest.


In higher dimensions, however, such physical intuitions will
not always save us. Since the relation between orthogonality and
least squares is key to understanding PCA, we will take some
time to prove this properly.

Best approximation theorem (1D) Let w, x ∈ Rn, let W be the
line of all multiples of w, and let ŵ be the orthogonal projection
of x onto W. Then, for any other w̄ in W, we have

dist(x, ŵ) < dist(x, w̄)


where dist(a, b) denotes the Euclidean distance ‖a - b‖.

Proof. (Adapted from Thm. 9, Ch. 7 in Lay (1994)) Note that
a - b is the vector that points from the tip of b to the tip of a.
We can draw three vectors w̄ - x, ŵ - x and w̄ - ŵ as follows:

[Figure: the vectors ŵ - x, w̄ - x and w̄ - ŵ, drawn between x and the points ŵ and w̄ on the line W, forming a triangle.]

By basic vector addition, we know that

w̄ - x = (ŵ - x) + (w̄ - ŵ),
so the three vectors form a triangle (when we arrange them as
shown in the picture).

We also know, by construction, that ŵ - x is orthogonal to
w̄ - ŵ, so the triangle is right-angled.
Since we have a right-angled triangle, the Pythagorean the-
orem tells us that the lengths of the sides of the triangle are
related by

dist(x, w̄)² = dist(ŵ, x)² + dist(w̄, ŵ)².


Since dist(w̄, ŵ) > 0 (because w̄ and ŵ are not the same point),
we know that dist(x, w̄) must be strictly larger than dist(ŵ, x).

This result generalizes easily to any linear subspace W spanned
by a given set of vectors. We will show a more general proof in
Chapter 4. This principle, the orthogonal projection as the best
approximation, is at the heart of a lot of optimization problems.

So, the best reconstruction xi′ of the point xi on the line defined
by z · w (however we choose w) is the orthogonal projection of
xi onto w. So how do we compute an orthogonal projection?
Let’s look at what we have:

We’ve projected xi down onto w and we’ve given the vector from
the projection to the original the name r. By vector addition we
know that zw + r = xi , so r = xi - zw.

Two vectors are orthogonal if their dot product is zero, so we're
looking for a z such that zwT r = 0, or equivalently, wT r = 0.
We rewrite

0 = wT r = wT (xi - zw) = wT xi - zwT w.


This gives us z = wT xi /wT w. And, since we’d already defined
w to be a unit vector (so wT w = 1), we get z = wT xi .
Let’s retrace our steps. We had two weight vectors: v to en-
code xi into the single number zi , and w to decode xi as zi w.
We’ve now seen that for any given w, the best choice of zi is
wT xi . In other words, we can set v equal to w and use it
to encode and to decode.

Note that an important requirement for this result (and its gen-
eralizations coming up) is that w is a unit vector.
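To make this concrete, here is a small numpy sketch of the result we just derived: for a unit vector w, encoding with z = wT x and decoding as zw leaves a residual that is orthogonal to w.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=5)          # one instance with 5 features
w = rng.normal(size=5)
w = w / np.linalg.norm(w)       # constrain w to be a unit vector

z = w @ x                       # encode: z = w^T x
x_rec = z * w                   # decode: x' = z w
r = x - x_rec                   # the residual of the reconstruction
print(np.isclose(w @ r, 0.0))   # True: the residual is orthogonal to w
```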

So, after all that, we can finally state precisely what we’re look-
ing for. Given w, our reconstruction is xi′ = zi · w = wT xi · w.
This means we can state our goal as the following constrained
optimization problem:

    argmin_w Σi ‖wT xi · w - xi‖
    such that wT w = 1.

How do we solve this? This is a simple problem and there are
fast ways to solve it exactly. But we've done a lot of math already,
and it’s time to show you some results, so we’ll just solve this by
gradient descent for now. Basic gradient descent doesn’t include
constraints, but in simple cases like these, we can use projected
gradient descent: after each gradient update, we project the pa-
rameters back to the subset of parameter space that satisfies the
constraint (in this case simply by dividing w by its length).

If you don’t know how gradient descent works, you can just
imagine a procedure that starts with a random choice for w
and takes small steps in the direction that the function above
decreases the most.
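For readers who want to see it concretely, here is a minimal numpy sketch of projected gradient descent for this problem. The gradient below is the analytic gradient of the sum of squared reconstruction errors; the learning rate and step count are arbitrary choices that may need tuning for your data.

```python
import numpy as np

def pca_1d(X, lr=0.1, steps=1000, seed=0):
    """First principal component of mean-centered X, by projected
    gradient descent on the squared reconstruction error."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)                 # start from a random unit vector
    for _ in range(steps):
        z = X @ w                          # latent value for every instance
        E = np.outer(z, w) - X             # reconstruction errors, one row per instance
        grad = 2 * (E.T @ z + X.T @ (E @ w)) / len(X)
        w -= lr * grad                     # gradient update
        w /= np.linalg.norm(w)             # project back onto the unit sphere
    return w
```

With the centered income data from before, pca_1d(X) should settle (up to sign) on the direction of the line in the plots below.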

We start by initializing w to some arbitrary direction. Here's what
the projections of the income data onto that w look like.

The (squared) sum of the lengths of the blue lines is what we
want to minimize. Clearly, there are better options than this
choice of w. After a few iterations of gradient descent, this is
what we end up with.

You can think of the blue lines of the reconstruction error as
pulling on the line of w and of w as pivoting on the origin.

For any dataset (of however many dimensions), there is a unique,
optimal line w. It's called the first principal component.
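In practice, as we'll see in Chapter 4, this line is usually computed with the singular value decomposition rather than by gradient descent. As a sanity check, a sketch using numpy's off-the-shelf SVD (the data here is synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[1.0, 3.0],
                                          [0.0, 0.3]])  # roughly linear 2D data
X = X - X.mean(axis=0)

_, _, Vt = np.linalg.svd(X, full_matrices=False)
w = Vt[0]   # the first principal component, a unit vector
print(w)    # the sign is arbitrary: w and -w define the same line
```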

If you’ve read other descriptions of PCA, you may be wondering


at this point why I’m not talking about maximizing the vari-
ance. This is an alternative way to define PCA. We’ll discuss
it in the next chapter.

What can we say about the meaning of the elements of w? Re-
member that it does two things: it encodes from x to z and it
decodes from z to x′. The encoding is a dot product: a weighted
sum over the elements of x.
In the first example of the income data, before we added the
noise, the second feature was always exactly three times the first
feature. In that case, we could just remember the first feature,
and forget the second. That would be equivalent to encoding
with the vector (1, 0). The compressed representation z would be
equivalent to the first feature and we could decode with z × (1, 3).
Or, we could encode with (0, 1) and decode with (1/3, 1).
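A quick numeric check of the first of these encode/decode pairs, for the noiseless version of the data (numbers made up):

```python
import numpy as np

x = np.array([2.0, 6.0])      # monthly salary 2, quarterly income 3 * 2
v = np.array([1.0, 0.0])      # encode: keep only the first feature
w = np.array([1.0, 3.0])      # decode: rebuild both features

z = v @ x                     # z = 2.0
print(z * w)                  # [2. 6.]: perfect reconstruction, yet v != w
print(np.linalg.norm(v), np.linalg.norm(w))  # 1.0 and sqrt(10)
```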
Why are the encoding and decoding vectors different in these
cases? Because when we proved that they were the same, we
assumed that they were unit vectors. Our encoding vector is a
unit vector, but the corresponding decoding vector isn’t. PCA
provides a solution for which the encoder and the decoder are the
same. It takes a mixture of both features, in different proportions
(in our case 1/√10 and 3/√10). There are a lot of perspectives
on exactly what this mixture means. We’ll illustrate the first by
looking at a higher dimensional dataset.
We’ll use a dataset of grayscale images of faces produced by
AT&T Laboratories Cambridge called the Olivetti dataset.¹
We will describe each pixel as a feature with a value between 0
(black) and 1 (white). The images are 64 × 64 pixels, so each
image can be described as a single vector of 4096 real values.

Note that by flattening the images into vectors we are entirely
ignoring the grid structure of the features: we are not telling our

¹ https://scikit-learn.org/stable/datasets/real_world.html#the-olivetti-faces-dataset

A small sample of images from the Olivetti data.

algorithm whether two pixels are right next to each other, or at
opposite ends of the image.

The Olivetti data contains 400 images, so we end up with a data
matrix X of 400 × 4096.

[Figure: the 400 × 4096 data matrix X; instance i is a row, feature j is a column.]
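If you want to follow along, the dataset can be loaded through scikit-learn (assuming it is installed); a minimal sketch:

```python
from sklearn.datasets import fetch_olivetti_faces

faces = fetch_olivetti_faces()     # downloads the data on first use
X = faces.data                     # shape (400, 4096): one flattened image per row
print(X.shape, X.min(), X.max())   # pixel values lie between 0 and 1
```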

This is a data scientist’s worst nightmare: data with many more


features than instances. With so many features, the space of pos-
sible instances is vast, and we only have a tiny number to learn
from. Our saving grace is that, like the income/salary exam-
ple, the features in this dataset are highly dependent: knowing
the value of one pixel allows us to say a lot about the value
of other pixels.
For instance, pixels that are close together more often than
not have similar values. The images are often roughly symmet-
ric. All faces will have mostly uniform patches of skin in roughly
the same place, and so on.


In short, while our dataset is expressed in 4096 dimensions,
we can probably express the same information in many fewer
numbers, especially if we are willing to trade off a little accu-
racy for better compression.
The procedure we will use to find the first principal compo-
nent for this data is exactly the same as before—search for a unit
vector that minimizes the reconstruction loss—except now the
instances have 4096 features, so w has 4096 dimensions. First,
let’s look at the reconstructions.

Reconstruction from a single principal component. Originals
on the left, reconstructions on the right.

These are not very impressive yet, but to be fair, we've com-
pressed each image into a single number, so we shouldn't be sur-
prised that there isn't much left after we reconstruct it. But that
doesn’t mean that 1D PCA doesn’t offer us anything useful.
What we can do is look at the first principal component in
data space: w is a vector with one element per pixel, so we can
re-arrange it into an image and see what each element in the
vector tells us about the original pixels of the data. We’ll color
the positive elements of w red and the negative values blue.
It looks like this:

If we think of this as the encoding vector, we can see a heatmap
of which parts of the image the encoding looks at: the darker the
red, the more the value of that pixel is added to z. The darker
the blue, the more it is subtracted.
If we think of this as the decoding vector, we can see that the
larger z is, the more of the red areas get added to the decoded
image, and the more of the blue areas get subtracted. That is,
two red pixels are positively correlated, and a red and a blue
pixel are negatively correlated. A bright red pixel and a light red
pixel have the same relation as our monthly salary and quarterly
income: one is (approximately) a multiple of the other.
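A sketch of how such a picture can be made with matplotlib. Here the component is computed with an off-the-shelf SVD (see Chapter 4) rather than the gradient descent above; both should give the same line.

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn.datasets import fetch_olivetti_faces

X = fetch_olivetti_faces().data
X = X - X.mean(axis=0)                           # mean-center, as before
w = np.linalg.svd(X, full_matrices=False)[2][0]  # first principal component

v = abs(w).max()                                 # symmetric range: zero maps to white
plt.imshow(w.reshape(64, 64), cmap='bwr', vmin=-v, vmax=v)  # red positive, blue negative
plt.axis('off')
plt.show()
```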
Another interpretation of the principal component is that it
places the images in the dataset on a single line, which means it
orders the images along a single direction.
To see if there’s any interpretable meaning to this ordering,
we can try moving along the line and decoding the points we
find. We can start at the origin (the vector 0). If we decode that,
we get the mean of our dataset (the so-called mean face). By
adding or subtracting a little bit of the principal component, we
can see what happens to the face.

A few things are happening at once: the skin is becoming less
clear, the lines in the face become more pronounced, the glasses
become more pronounced and the mouth curves upward. Most of
this is consistent with moving from a young subject to an old one.
We can test this on the faces in our dataset as well; our prin-
cipal component is a direction in the data space, so we can start
with an image from our data, and take small steps in the direction
of the principal component, or in the opposite direction.

You can think of this as manipulating the single-number latent
representation z by adding some small amount ε. If we decode
such a point, we get (z + ε)w = zw + εw = x′ + εw. We then
just replace the reconstruction x′ by the actual point x. Note that
we depend on the linearity of our transformation: for nonlinear
variants of PCA, this trick won't work anymore.
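As a sketch of the trick (the step sizes are picked by eye, and row 41 is assumed to correspond to the face shown in the book's figure):

```python
import matplotlib.pyplot as plt
import numpy as np
from sklearn.datasets import fetch_olivetti_faces

X = fetch_olivetti_faces().data
w = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)[2][0]

x = X[41]                                   # face 41, assuming zero-based row order
for i, eps in enumerate([-3.0, 0.0, 3.0]):  # step sizes picked by eye
    plt.subplot(1, 3, i + 1)
    plt.imshow((x + eps * w).reshape(64, 64), cmap='gray')
    plt.axis('off')
plt.show()
```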

Here’s what we get.

Note the manipulation of the mouth, in particular in face 41. As
the corners of the mouth go up, the bottom lip goes from curving
outward, with a shadow under the lip to curving inwards, folding
under the teeth, with the shadow turning into a highlight.
Here we see the real power of PCA. While the reconstructions
may not yet be much to write home about, the principal compo-
nent itself allows us, using nothing but a linear transformation
consisting of a single vector, to perform realistic manipulation of
facial data, based on a dataset of just 400 examples.

1.3 n-Dimensional PCA


Enough playing around in a one-dimensional latent space. What
if we want to improve our latent representations by giving them
a little more capacity? How do we do that in a way that gives
us better reconstructions, but keeps the meaningful directions
in the latent space?
Let’s start by updating our notation. xi is still the same vector,
row i in our data matrix X, containing m elements, as many as
we have features. zi is now also a vector (note the boldface).
zi has k elements, where k is the number of latent features, a
parameter we set. In the example above, k = 1. We’ll drop the
i subscript to clarify the notation.

Let’s say we set k = 2. If we stick to the rules we’ve followed


so far—linear transformations by unit vectors—we end up with
the following task: find two unit vectors w1 and w2 and define
z = (z1 , z2 ) by z1 = xT w1 and z2 = xT w2 . Each latent vector
gives us a reconstruction of x. We sum these together to get
our complete reconstruction.
We can combine the two vectors w1 and w2 in a single matrix
W (as its columns) and write

z = WT x
x′ = Wz

Or, in diagram form:

This would already work fine as a dimensionality reduction method.
You can think of this as an autoencoder, if you're familiar with
those. However, we can add one more rule to improve our re-
duced representation. We will require that w2 is orthogonal to
w1 .

This decision is important, and has many useful consequences.
We'll save those for later. For now, we'll just take it at face value.

In general, each component wr we add should be orthogonal to
all components before it: for k = 3 we add another unit vector
w3, which should be orthogonal to both w1 and w2.
We can summarize these constraints neatly in one matrix equa-
tion: the matrix W, whose columns are our w vectors, should
satisfy:

WT W = I

where I is the k × k identity matrix. This equation combines
both of our constraints: unit vectors, and mutually orthogonal
vectors. On the diagonal of WT W, we get the dot product of
every column of W with itself (which should be 1 so that it is a
unit vector) and off the diagonal we get the dot product of every
column of W with every other column (which should be 0, so
that they are orthogonal).
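A small numpy sketch of this constraint: a QR decomposition is one way to produce a W with orthonormal columns, and WT W = I then holds by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
m, k = 6, 2
W = np.linalg.qr(rng.normal(size=(m, k)))[0]  # random m-by-k W with orthonormal columns
print(np.allclose(W.T @ W, np.eye(k)))        # True: both constraints in one equation

x = rng.normal(size=m)
z = W.T @ x                                   # encode: k latent values
x_rec = W @ z                                 # decode: back to m features
```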
How do we find our W? The objective function remains the
same: the sum of squared distances between the instances x and
their reconstructions x′. To satisfy the constraints, we can pro-
ceed in two different ways. We’ll call these the combined problem
and the iterative problem.
The combined problem is simply to add the matrix constraint
above and stick it into our optimization function. This gives us

    argmin_W Σx ‖WWT x - x‖²
    such that WT W = I

The iterative problem defines optima for the vectors wr in se-
quence. We use the same one-dimensional approach as before,
and we find the principal components one after the other. Each
step we add the constraint that the next principal component
should be orthogonal to all the ones we’ve already found.

To put it more formally, we choose each w1, . . . , wk in sequence
by optimizing

    wr = argmin_w Σx ‖xT w · w - x‖²
         such that wT w = 1,
         and wT wi = 0 for i ∈ [1 . . . r - 1].
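One way to handle these extra constraints with projected gradient descent is to project w back onto the allowed set after every update: subtract its components along the earlier vectors, then rescale to unit length. A minimal sketch, reusing the gradient from the one-dimensional case:

```python
import numpy as np

def next_component(X, prev, lr=0.1, steps=1000, seed=0):
    """Next principal component of mean-centered X, constrained to be
    orthogonal to the columns of prev (use shape (m, 0) for the first)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(steps):
        z = X @ w
        E = np.outer(z, w) - X                        # reconstruction errors
        grad = 2 * (E.T @ z + X.T @ (E @ w)) / len(X)
        w -= lr * grad
        w -= prev @ (prev.T @ w)                      # project out earlier components
        w /= np.linalg.norm(w)                        # back onto the unit sphere
    return w

def pca(X, k):
    """Stack the components found in sequence into the columns of W."""
    W = np.zeros((X.shape[1], 0))
    for _ in range(k):
        W = np.column_stack([W, next_component(X, W)])
    return W
```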

These approaches are very similar. In fact, they're sometimes
confused as equivalent in the literature. Let's look at how they
relate in detail.
The vector w1 defined in the iterative problem is the same
vector we found in the one-dimensional setting above: the first
principal component. If the two problems are equivalent (i.e.
they have the same set of solutions), this vector should always be
one of the columns of W in the combined problem.
To show that this isn’t guaranteed, we can look at the case
where k = m. That is, we use as many latent features as we
have features in our data. In this case, the first vector w returned
by the iterative approach is still the first principal component, as
we’ve defined it above. However, for the combined approach, we
can set W = I for a perfect solution: clearly the columns of I are
orthogonal unit vectors, and WWT x - x = IIT x - x = x - x = 0,
so the solution is optimal.
In short, a solution to the combined problem may not be a so-
lution to the iterated problem. What about the other way around,
does solving the iterated problem always give us a solution to the
combined problem? Certainly the vectors returned are always
mutually orthogonal unit vectors, so the constraint is satisfied.
Do we also reach a minimum? It turns out that we do.

Optimality of the iterative approach A solution to the iterative
problem is also a solution to the combined problem.

We will prove this in the second chapter. For now, you’ll have to
take my word for it. The combined problem has a large set of
solutions, and the iterative approach provides a kind of unique
point of reference within that space.

[Figure: the solution to the iterative problem lies inside the larger set of solutions to the combined problem.]

We can say more about this later, but for now we will equate the
iterative solution with PCA: the W which not only minimizes the
reconstruction error as a whole, but also each column of W min-
imizes the reconstruction error in isolation, constrained to the
subspace orthogonal to the preceding columns. The combined
problem does not give us the principal components.
Using the iterative approach, and solving it by projected gra-
dient descent, we can have a look at what the other principal
components look like. Let’s start with our income data, and de-
rive the second principal component.

This is a bit of an open door: in two dimensions, there is only
one direction orthogonal to the first principal component. Still,
plotting both components like this gives some indication of what
the second component is doing. The first component captures
the main difference between the people in our dataset: roughly
their monthly salary. The second component captures whatever
is left; all the noise we introduced like end of year bonuses and
alternative sources of income.
Note how PCA reconstructs the original data points. Given the
components w1 , . . . , wk , we represent each x as a weighted sum

Reconstruction by principal components in the case where
k = 3. Each latent variable zr tells us how much of the r-th
principal component to add to our reconstruction.

over these vectors, where the latent features z1, . . . , zk are the
weights:

x′ = z1 w1 + z2 w2 + . . . + zk wk
That’s it for the income dataset. We’ve reached k = m, so we
can go no further.
Let’s turn to the dataset of faces, where there are many more
principal components to explore.
If we compute the first 30 principal components, we get the
following reconstructions.

You can still tell the originals from the reconstructions, but many
of the salient features now survive the compression process: the
direction of the gaze, the main proportions of the face, the basic
lighting, and so on. By looking at the first five principal compo-
nents, we can see how this is done.

The first one we’ve seen already. It’s flipped around, this time,
with the blues and reds reversed, but it defines the same line in
space. Note that the magnitude of the higher PCs is much lower:
the first principal component does most of the work of putting
the image in the right region of space, and the more PCs we add,
the more they fine-tune the details.
The second PC captures mostly lighting information. Adding
it to a picture adds to the left side of the image, and subtracts
from the right side. We can see this by applying it to some
faces from the data.

The third PC does the same thing, but for top-to-bottom lighting
changes. The fourth is a bit more subtle. It’s quite challenging
to tell from the plot above what the effect is. Here’s what we see
when we apply it to some faces.

The PC seems to capture a lot of the facial features we associate
with gender. The faces on the left look decidedly more "female",
and the faces on the right more "male". It's a far cry from the
face manipulation methods that are currently popular, but con-
sidering that we have only 400 examples, we are only allowed
a linear transformation, and that the method originated in 1901
(Pearson, 1901), it’s not bad.

1.4 Applications of PCA


Before we finish up, let’s look at two examples of how PCA is
used in research.
Let’s start with a problem which crops up often in the study of
fossils and other bones. A trained anatomist can look at, say, a
shoulder bone fossil, and tell instantly whether it belongs to a
chimpanzee (which is not very rare) or an early ancestor of hu-
mans (which is extremely rare). Unfortunately such judgements
are usually based on a kind of unconscious instinct, shaped by
years of experience, which makes it hard to back it up scientifi-
cally. ”This is is Hominid fossil, because it looks like one to me,”
isn’t a very rigorous argument.
PCA is often used to turn such a snap judgement into a more
rigorous analysis. We take a bunch of bones that are entirely
indistinguishable to the layperson, and we measure a bunch of
features, like the distances between various parts on the bone.
We then apply PCA and plot the first two principal components.
Here is what such a scatterplot looks like for a collection of
scapulae (shoulder bones) of various great apes and hominids.

This particular figure is from Young et al. (2015) (reproduced
with permission), which is available online, but the literature is
full of images like these. Here, the authors took scans of about
350 scapulae. We can clearly see different species forming sep-
arate clusters. If we find a new scapula, we can simply mea-
sure it, project it to the first two principal components, and show
that it ends up among the Homo ergaster to prove that our find
is special. What’s more, not only can we tell the Hominin fos-
sils apart, we see that they seem to lie on a straight line from
chimpanzees to modern humans, giving a clue as to how hu-
man evolution progressed.

This analysis is based on full 3D scans of the bones, but it also
works if you measure a number of features by hand.

Let’s add one final example, to really hammer home the magic of
PCA. When large-scale genome databases began to be gathered,
one of the first things researchers did was to perform princi-
pal component analyses.
All you need to do is to extract some features from each DNA
sequence. For instance, you can identify a few hundred thousand
important genetic markers, and use the presence or absence of
each as a binary feature.

[Figure: a scatterplot of the first two principal components of genomic data for 5,550 individuals from 47 European countries/regions in the UK Biobank (UKBB), with each sample plotted at its PC coordinates and labeled by its region of birth; reproduced from a PNAS article. Axes: Principal Component 1 and 2; legend: Geographic Region, Sample Count.]
SC
SCWA
NI
NI
SEWA
SC
BE
DE
MT
SC
NI
OK
WA
NI
NI
WAAT
NI
WA
EN
IE
SC
WA
SC
WA
IE EN
DE
SC
IM
DEEN
EN CY
CI
IE
NI
NI
MT
WAWA
IE
GI
CY
IE
SC
MT
WA
SC
EN
SC
IE
WA
DE
SC
NI
OK
EN
TR
IE
SC
EN
NO
DE
IM
IM SC
MT
BG
CY
WA
NI
WA
IE
NI
MT
CY
EN
TR
IE
ES
DEMT
MT
EN SC
OK
WA
NI SC
IE
EN
NI
GI
EN
SC
MT
IE EN
WA
GR
TRIE
DE
IM
OK
CH
IE
CH
DE MT
SC
NI WA
NL
EN
SC
MT
MT
IE
WA
NI
IM
NI GI
NI
MT
IM
EN
MT
EN
CI
EN
NI
WA
SC
CI
IE WA
NI
EN
DE
IM
MT
CY
NLWA
SC EN
WA
PT
CY
NL
MT
SC
AT
NL
WA
NI
EN IM
NL
DE
CI
EN
EN
NI
DE NL
WA
BE
GI
GI
MT
SC
NL
NI
DE
NL
CH
IM
NL NL
NL
DK
EN NL
CY
NL
BE NO
FI NLNL DE DE
AT
ENDE DE DE DE
DE
DK DE
DE DE
DKDECZ FI
DEDE SE
DEDE CZ
DE CZ CZPL
FR PL
CZPL CZ
SK PLPL
PL
PL
UA PL
RU
PL RURU
UA
UA
RU
RU
UAGR
IE WA
SC DEMT GICY NL EN UA
Lithuania
E Europe MT MTMT
IT MT
MT GR UA
HU
GRSKDE TR TR
EN IE
IEIE
SC
IE
IE
AT
SC
NI
SC
IE
CH
DEWA
IE
WAEN
DE
IE
CH
EN
WA
DE BE
IE
SC
NI
IE
NINI
OK
CY
CI
IE
CI
EN
SC
MT
WA
EN
WA
SC
GI
SC
NI SE
EN
SC
EN
AT
WA
NI
IE
NO
IM
MT
NL
SC
EN
WA
NO
AT
IE
NI
IENI
WA
IM
WASE
WA
WA
SC
NI
NI
WA
IE
DE
MT
NI
MT
CI
CY
SC
SC
FR
EN
SE
OK
EN
NI
WA
IE
SC
AT
MT
GI
IM
CY
MT
WA
NI
WA
SC
MT
CY
FR
IE SC
IE
WA
GI
FR
EN
CI
DE
IECY
CI
MT
IM
SC
DE
WA
BE
SC
EN
IM
FIIM
EN
CY
CY
WA
SC EN
WA
NI
WA
NI
NI
CY
PT
MT
DE
MT
DE
CI
IE
EN
WA
DE
NI
SC
MT
EN
EN
SC
SCWA
IM
CY
EN
EN
AT
WA
WA
CY
CY
WA
CH
CY
MT
SC
NI
MT
BE
SC
MTWA
CY
CI
EN
MT
WA
CI
NI
NL
GI
MT
WA
DE
EN
CYIE
SC
MT
SC
CY
WA
EN
EN
CI
MT
ENIM
SC
CH
GI
AT
EN
BE
NI
CY
BE
MT
IM
SC
AT
EN
GI
CI
SC
IE
WA
GR
MT
BEEN
EN
IE
NI
NL
OK
IM
CI
WA
CY
DEEN
BE
WA NL
EN
GI
GI
NI
CY
NIMT
CI
EN
WA
CY
EN
AT
EN
DE
NI
IM
AT
CY
DE
EN
WA
BE
SC
MT
MT
GI
NL
WA
MT
MTMT
AT
CH
EN
CZ
IM
EN
BE
DE
NL
EN
OK
MT
OK
NI
NI
CH
NL
GR
SC
CY
CI
NL
FR NL
BE
OK
CY
WA
NL
BE
MT
WA
FR
IM
MT
AT
NI
SC
NL
EN NL
NL
OK
WA
DE
IM
CI
SC
MT
EN
NI
CY
DE
NL
CI
GI
NL
EN
EN
NL
NO
NL BE
NL
NL
EN
NL
DE
NL
SE
NL
NL
NL
BE
DE
MT
EN MT
SC
NL
NL
NLDE
NL
NLDK DK NL
DE
DK
CH DE DE DK
DE SE
CZ
CZEN
AT
DE CZ PL
DE
DE AT CZCZ
CZ
CZ
CZ
CZ
SK CZ
CZDE
CZ
CZ
CZ
PL
PL
CZ
CZCZ
CZ PLPL
CZ
CZ
SK
PL
PL
PLPLUA
PL
UA UA
UA
PL
UA
PL RU
RU RU
Latvia MT MT MT
MT IT
MT RO
SK CY
HU EN
RU
TR
PL
BE GR TR TR TR CY
DE
SC
SC
SC
NINI
OK
WA
NI SC
WA
NI
SC
DE
MT
WA
FR
WA
EN
CY
WA
CI
PT
EN
WA
DE
SE
FRIM
NI
MTAT
SC
SC
NI
WA
EN
SC
DE
DE
WA
GI
WA
DE
TR
CY
WA
BE
AT
SC
MT
NI
EN
CY
CI
CY
SC
FR
CI
EN
SC
PT
IM
NL
SC
NI
NI
EN
SC
NI
MT
CI
SC
GI
NL
AT
CY
WA
GI
DE
DE
SC
MT
FR
CI
WA
IM GI
CI
WA
DE
NI
CY
WACI
OK
AT
WA
DEMT
BE
GI
NI
WA
MT
CI FR
BGCI
WA
WA
BE
MT
MT
WA
EN
DE
CI
CI
CH
MT
EN
EN
CY
EN
FR
IM
CY
EN
SC
DE
MT
CI
FR
SC
CY
MT
CI
CI
AT
ATMT
OK
CI
CY
NL
BE
SE
CY
WA
EN
NL
MT
SC
EN
DE
OK
EN
BE
SC
DE
WA
IM
WA
EN NL
GI
CY
EN
BE
DE
EN
BE
IE
CY
CY
WA
DE
BE
EN
IM
GI
CY
IM
MT
NL
CI
NLCY
SC
EN
CI
NL
MT
MT
SC
DE
CI
CY
IM
NL
INL
CI
CY DE
NL NL
NL
NL
WA
NLBECY
NL
WA
CYNL BEDE
NODEDE
DE
AT
NLAT
CH AT
CY
DE CYDE
DK
DE CZ SEAT
ATATNL
RO CZ
CZ PLCZ
ATAT DECZ
DE
AT
CZCZ
CZ
DECZ
CZ
CZCZ
CZ
CZ
CZ
CZ CZCZ
CZCZ
CZ
CZ
CZ
DECZ
HR PL
PL
CZ AT
CZPLHUCZ
PLHU
SK
RS
PL CZ
SK SK
CZ
SKRU
UA
SK
PLSK UA RUUALT UA
RU
RU
TR TR
CI NL CZ
MT MT MT
MT MT
MT MT
MT
IT MTMT
GR CZ RU
HU
RU
AT PL
CI GR TR TR
TR TR TR 10 WACY EN FR
IE
CYNI
IM
WANI
WA
GI
CI
OK
MT
SC
IMWA
EN
FR
NI
DE
CH
NO
MT
MTMT
CY
EN
WA
AT
DE
BE
EN
CI
FR
MT
FR
GI
CI
WA
BE
CI
NL
EN
CI
CI
MTEN
EN
WA
DE
CI
CI
GI
TR
EN
BE
IM
CI
WA
FRTR
CI
C
FR
MT
EN
BE
IT
MT
EN
GINLNL
WA
CY
BE
BE
CHCI
MT
CI
MT
CI
BE
PT
DE
CI
NL
IE
EN
BE
BE
AT
ENGR
NI
BE
NL
CI
MT
NO
CY
NLCI
BE
DE
BE
TR
BE
BE
NL
CY
BE
BE
NL
CH
NL
NL
AT
CH
CI
NL
BE DEAT
NL
NL
AT DE
DEDE
HU
DEDK
DE
AT
AT
DE ATAT
DE
DEAT
CH DE
ATCZ NO
AT ATPLAT
AT
IM AT PL
HU
CZCZ
AT
CZ AT
CZ
CZ PL
CZ
CZAT
CZ
CZ
CZCZ
CZ
SK
CZ
SK SK
SK
SK
CZ
AT
SK
SK PL
UA
UA
SK
CZBA UA
DE UA
UA
PL UADK
RU
Estonia MT HUTR
FRCY
HU
HR
HURO TRTR WACH
EN
WA CI
CI
DEOK
FR
SCDEIM
FR
CIFR
BE NL
CI CI
EN
CI BE
BE
DE NL
BE
BEBE
FR
DE
BE
BE EN
BE
CI
NL
CIBEBEDK
NL
NO
BE DEAT AT
AT DE
DE BE
SEAT IT
DE
BE
AT AT AT
ATCZ
CZ AT
CZ CZ
CZDKHU
SK
SK SK
HU
HR
CZ SK
CZ HU SK UA
PL
RU LV LT
MTIT
CI CI CI
CICI WAEN
BE CI BE BE CIFR DE HU CZAT CZ
HU SK PL
PL
Belarus ITMT
MT
MT MT
HUCY FR
CZ
CH GR EN
HU TRTRTRTR TRTR
25
FR
CH
CI FR
CI
ATCY
CI
FR
CI CY
IE
BE
CI
BE
BE
CI
NL
CI
CI
BE
EN
FR
IT
CI
BE
CH
BE BE
CI
NL
FR
MT
FR
CH CIWA
BE
BE
NL
DE
BE
FR
BE
BE
BE BE
BE
BE BE
BE
CH
BECHNL BEDK
DE
MT
AT CH
CZ
FR
AT AT PL
HU ATAT
SEAT
AT
RS
AT
AT
DE
ATCZ
AT
ATAT
CH
AT
AT
HU AT
AT
AT
AT AT
AT
AT
HR
AT
ATAT
HR
HU
AT
HU
HU
HU
HRSI
HR
HU
SI HU
SI
CZ
ROHR
SI
BA
SK
SK
HR
HU
HUHUHU
HR
CZ
CZ
LV
HU UARU RU
MT MT
MT MT CHHU
GR SK
GR SECY TR TRTR
TR TR
CY CI BE
FR FR GI
IE
FRBE
CI
BE
BE
BE
BE
FR
FR
FR
CI
WA
CHGI
BE FRCH
DE
BE
NL
BE
CY
BE
FR
FR
CH
BE
BE
BE
MT
FR
IT
BE
BE
FR
BEFR
BE
CH FR
CICI
BE
CH ATDE
CH NLNL
DESE BECZ
AT
DEAT AT
AT
SC
AT
AT
AT
DE
AT ATAT
AT
AT
CZ
AT
AT
ATAT
AT
NL AT
AT
HU
AT
HU
AT
BA
AT
CH
AT
HU
HUHUHU
AT CZ
CZ
AT
AT
AT
SI
AT
HU
AT
CZSI
HU
SI
HU
SI
HU
SI SI
SK
SK
HU HRHU
SK RO
SK PL
BA PL GR
UARUBG
RU
Switzerland MT MT MTMT
DK CZ
MT CY
HUHU CY TR
TR TRTR 50 0.00 FR FRFR
FR
PT
FR BE
BEPT
FR
CH
RU
BE
BE
FR
CHBE
IM
BE
FR
NL
MT
BE
BE
MT
WA
CH
AT
BE
BE
BE
BEBE
CH
BE
NL
FR EN
CI
NI
CH
NI
GR AT
DE
ATMTAT
CHDE
AT
CH
AT
AT ATHU RS AT
DKAT
HU AT AT
AT
AT
CH
DE
HU HR HUHU
NL
AT HU HU
CZ
HU
HU
HU
HU
AT
HR
HU HU HU
HUHU
HR
HU HR
HUHU
SI
SI UA
MT MTNLCZCZGR TR TR TR FR FRFR BE
FR FR
BECHBECH CH MT
DE DE DERS DK ATAT
AT AT
AT AT
HU HUHU RO
HR
RO HU
RO CZ BA
TRTRTR
CH AT HU BARO
Slovakia
GI MTTR
MT TR
CY FR FR
FRES
ES SC
FRFR FR
FR
FR
GI
DE
FR
AT
CH
CY
FR
BE
CH
IM
DE
FR BE
FR
FR
FRCH
FR
CH
CH
FR
BE
CHSC
CH
WA
PT
CH
FR
CH
CH
CH
CH
DE
CH
CH
CH
CH
CH
CH
CH
CH
CH
FR
CH
DE
CH
GI
CH
CH
CH
CH
BE CH
ATIT
AT
CH CH
AT
AT
AT
ATSE DE
AT NO ATAT
IT
RO
AT
ATCH AT
AT HU
HURO HU
HURS
HU
HR
HU
HUIT BA HU
HU
HR RO RO
RO FI
HR
CY TRTR
Poland 100 ES FR
CH FR
BE
CHATFR
BE FR
CHCH
CH
BE CH
CH
CH CHCH
CH
CH AT AT
DE HU
IT ITBA
HR HU BARORU BABY
MT CY CYCY TR TR FR
ENFR FR
GI FR
CH FR
CH
BE
CH
CH
CH
CH
FR
GI CH
FR
IT FR
CH
DE
CH CH
CH
MT
CH
CH CH
DECH
FRNL
DE CHDEHR
AT AT HU
ATRO HURS BA
RS
HR BA
RS HR

Principal Component 1
Hungary C Europe ES FRFR FR FR MT
CH
ATCY CH
CHCHIT SE HU
ES HR RO
BA BA
BABA RURO
CY GR CY TR
CY TRTRTR 150 FR ES
FRFR
FR DE
GI CHFRFR
FR
FR
FR
FR
FRFR
FR
FR
CH
FR
FR
CH
EN
FR
CH
CH
FR
CHDK
CHIT AT
NL BE SE AT AT AT TR RS
HR
HR
BARO
BA
BA
RSBA
HR
HR
RS
HR
BA
HR
BA
RS BA
BA
RS
BA BARU
Germany CY CY
CY CY
CY
CY CYCY
CY
CYCYCY
CYCY CY
ES FR
FR
FR
FR
FR
CH
FR
FR
FR
GI
FR
GIFR
FR
CH
GI
GI
CHFR
FRCH
FR
FR
IT CHCH BE ATCYCH FR IT
IT HUHU
CZ
HR
RO
RS
BARO
BA
RS
BA
RSRS
RS
RS
HR
HR
RS
BA RU
Czech Republic CYCY
CY
CYCY
CY
CY
CY
CY
CY CY
CY
CY CY
CY
CYTR ESES ES
ES ESFR FRFR
FR FRFR FR
IT CH
MTCH
FR
SCWAEN
FR
FR CH TRAT NL GR CH AT RO
RO
RORS
RS
RO
HR HR
BA RSBA
HR
PLRU
MK
BA HRHR
BA CZRU
−0.04 CY CY
CY
CY CY
CY
CY
CY CY
CY CY
CY
CYTR CY
TR
TR
200 ESESES ES FR FR
FR FRCH DE
FR
CH CHGR GR CH RORSATRORS
HRBARS
RS
HR
RS
RO
BA
RS
RO
HR
RS
BG
RS
RO
RS
RSRO
HR
RS
HR
HR
HR
RO RO
MKRO
RU
Austria CY CY
CY CY
CY
CYCY
CY CY
CY
CY CY
TR BG
TR ESES ES
ES ESES
ES ES FR FR FR CH SEGR
FR CHTR TRIT
TR MK
BG
RO RSMK
RO
BGRS
BA HR
RSMK RU
CYCY
ES RS BG
Sweden CYCY TRTRTR ES
ES
ES
ES
ESES
ES
ESES ES
FR
PT
ES GIFR ES CH
ESES FR
ITCH
GIFR FR
FRITIT IT
NO RO
RORORORSHR
RS
BG
MK
RO
RO
RS
BG
RU HURU
Norway CY CY CYGR TR ES ESES
ES
ES
ES ES
PTES
ES
ES ES
ES
ES ES
ES
ES
ESESES
GI
ESPT
ES FR
ESPT FR SC ITFR
IT GR IT
ITHU CH RO
RO RSROBA
RO
BG
BG
RS
RS
BG
BG
BGRO
CH
HUPL
BG
MK
PL RURU
UA
RU

CYCYCY
ESES
ESES ES
ESES
ES
PTPT
ES ES
GI
CH
ES PT
ES
PTES
ES
PTPT
ES
ES
PT ES
ES PT
CH
PT
ES ES
PT
PT PT
ES MT AT
NI IT NL IT BG
BG RU
RO
BG
BG
BG
RO
BGRO
BG
BGRO RUPLRU
ES PT
ES ES PT
PT PT
PT
ES
PT PT
ES
ES
ES
PTPT
ES
ES
ES
PT PT
PT
PT ES
ES PT PT
PT
DE ES
PTMTIT IT
SCMTBE
CYMTES
BE SC IT RO RO RO
BGBG
Iceland ES ES PTES
PT
PT
ES
ES
ES
PT PT
ES
ES
PT
ESPT
PT PT
ESES
PT
PT
PT PT
PT
ES
ES
PTPT
PT ES
ES
PT
ES
PT PT MTATAT
NLIE
IT
IT CH NL CH FR RO BABG
RO
BG BG
BG
BGBG
BG BG
ROBG TR
B
ES MT
Finland N Europe Turkey
Near East ES
PT
PT
ES
ESES
PT
ES
PT
PT
PT
ES
ES
PT ES
PT
FR
ES
ES
GI
PT
ES
ES
PT
ES
ES
ES
ES
GI
PT
PT
PT
PT
PTGI
PT
ES
PT
ES
PT
ES
GI
ES
GI
PT
PT
PT
PTPT
PT
FR
GI
PTPT
ES
PT
PT
ES
PT
ES
ES
PT
PT
ES
PT
PT
PT
PTPT
ES
ES
PT
GIPT
ES
PT
PT PT
PT
PT
PT
GI
PT
PT
GI
GI
ITFR
PT
FR
FR
PT
PT
GI
PT
GI
PT FR
MT
FR
IT
IT IT
IT
MT
PT
FR
PT
CH
BE
IT
IT
IT
IT
ITIT
IT
IT
DEITWA BE
AT
IT CZ
HUHU BG
RSMK BA
BG
BG
MKBG
BG
BG
RO
BG
BG
BG
BG
BG
TR
BG
BG
RO
BG
BGRU
BG
BG
TR
BG
BGBG
TR TR
Cyprus ES PT
PT PTPTESPT
PT GI
PT PT
PT
PTGI ES
PTPT
GI
IT ES
ITIT
PT IT
PT MT
ITITMT
GR CY NL
MT BE BGBG GR
TRBGTRHU
Faroe Islands PT ES
ES
PTPTGIPT
GI IT IT
IT IT
FR
IT ITCY ITIT
IT
IT ITPT
CY PTAT CZ BG MK BG
MK
GI FRFRIT TR
HU TR UA
Slovenia PT GR CZ BG
Denmark Serbia/Montenegro GIIT IT ITIT
IT
ITBEIT CH GR IT KO MKKOKOMK
BGBG MK
BG
Romania ES
ES PT
ESES
ES GI GI PTCY
IT IT
IT ITCYIT IT TRRS RO BG
GR HU
Netherlands ES IT GRGR
MK RO
GR TRTR
North Macedonia ITIT
IT IT
FR IT
GR KOKO
MK
GR TRTRBG
France W Europe −0.02 0.00 0.02 Kosovo
Greece 0.04 SE Europe −0.02 IT
GI IT
IT
ITIT IT
FR
ITIT
IT
IT
IT ITFR
RSAL
GRGR
GRHU GR
ALGR GR
GR
Belgium Croatia CH ITIT IT
IT
IT
FR IT KO GR AL GR
AL GR
GR
GR
Principal Component 2 MT IT GR RSGR
GRGR TR TR
Wales
Bulgaria
IT
IT
ALGR
ALGRGR
GR
GR
GR
GR
GR CH
GR GR TRTR
Bosnia and Herzegovina ITFR ITITIT IT AL AL
GRAL
MK
AL KO
GR GR
GR
GR GR TR
IT IT MK
GR GR HU GR
AL GR TR TR CZ
Scotland Albania
Spain
FR IT FR
ITBE
IT
IT ITIT
ITIT BE GR CY
CY GR GRGR
GR
GR
GR
GR TR TR TR TR TR
Republic of Ireland Portugal MT ITIT IT
CH ITGR TRGR GR
GR GRTR UA
TR
TRTR TR
CY MT
MT GRGR
BG TR RU
TR
Malta S Europe IT IT GI IT
ITIT
IT IT
IT
IT ITIT
IT ITIT ITGR GR GR
GR GR GR TR TR
Orkney ITIT ITITITIT IT GR
GR GR CZ TR TR
Northern Ireland Brit. & Ire. Italy
Gibraltar ITIT
IT ITIT
IT
ITIT
IT
IT
MT IT
IT
IT
IT
IT
IT
ITITIT
IT
ITIT
GR
ITGR
GRGR
GR
GR
GR
GR GR
GR
BY
GR
GRFI
CZ
TR
TRTR
Ukraine IT IT BE
IT
IT IT MT BY GR
GRRUSE
GR GR TR TRTRTR TR TR
Isle of Man Russia IT ITIT MT
IT IT
MTMTCZDK
MT
MT RU
RU
GR
RO RO
RU
UA
BE GR TR
TR
TR TR
Lithuania IT
GR MTIT
MTIT
MT
MT GR
ENGR
SC
UAGRGRGR TRTR
England Latvia E Europe MT MTMT IT
MT
MT
MT IT
MT RO
SK UA
HU
SK
GR
CY
HU
DE
EN
RU
TR
PL
BE GR TR TR TR
TR TR
MT MT MT
MTMT
MT MT
MT
IT MTMT
GR RU
HU
RU
CZ
AT PL
CI GR TR TR
TR TR TR TR
TR
Channel Islands Estonia
Belarus MT MT
IT MTIT
MT
MT
MT
HU
HU TR
MT
CY
HU
GR
CY
HU
HR
HU
RO
FRFR
CZ
CH GR EN
HU TR
TR
TRTRTR
TRTR TRTR
Switzerland MT MT MTMT
MT
DK CH SK
GR
CY SECY TR TR
TR
TR TR TR
CY
MT MT CZ
MT HU
HU TR
0 50 100 150 200 Slovakia
GI
MT MT
MTTR
MT CZ
NL CZGR TR CY
CYTR
TR
TR
TR
TR TR
TR

A PCA analysis of a European population, similar to the one


Poland
Hungary C Europe MT CY CYCY CY TRTR
TR TR
CY TR
Count Germany
Czech Republic
Austria −0.04 CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
GR
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CY
CYTR
CY
CY
CY
CY
CY
CY
CY
CY
TR
CY
CY
TR
CYCY
TR
TR BG
TR
TRTRTR

Sweden CY
CY
CY
CY CY
CY
CYCYCY
TR TR
TR
CY CY
Norway CY CY GR TR
Iceland
N Europe CYCYCY
Finland

ig. 1. A sample of European structure in the UKBB. (A) The number of individuals included from each European country analyzed. Countries are grouped by Faroe Islands

by Novembre et al. (2008), colored by ancestry. Adapted from


Denmark
Netherlands
W Europe −0.02 0.00 0.02 0.04 France

eographic region; these regions are chosen as a means of group representation and do not necessarily imply historical links. Sample sizes from each region Principal Component 2
Belgium
Wales
Scotland
Republic of Ireland

re also shown. Abbreviations are as follows: SE Europe (southeastern Europe), S Europe (southern Europe), E Europe (eastern Europe), C Europe (central Brit. & Ire. Orkney
Northern Ireland
Isle of Man

Gilbert et al. (2022).


England

urope), N Europe (northern Europe), W Europe (western Europe), Brit. & Ire. (Britain and Ireland). (B) The sample counts for each European 0 region.
50 (C)
100 The first
150 200
Channel Islands

wo PCs calculated by PLINK of 5,500 European individuals. Individual genotypes are shown by letters that encode the alpha-2 ISO 3166 internationalCount standard
Fig. 1. A sample of European structure in the UKBB. (A) The number of individuals included from each European country analyzed. Countries are grouped by
odes and are color coded according to geographic region. The median PC for each country/region of birth is shown as a label.geographic
Plots were generated using the
region; these regions are chosen as a means of group representation and do not necessarily imply historical links. Sample sizes from each region
gplot2 package (65) in the R statistical computing language (59). are also shown. Abbreviations are as follows: SE Europe (southeastern Europe), S Europe (southern Europe), E Europe (eastern Europe), C Europe (central
Europe), N Europe (northern Europe), W Europe (western Europe), Brit. & Ire. (Britain and Ireland). (B) The sample counts for each European region. (C) The first
two PCs calculated by PLINK of 5,500 European individuals. Individual genotypes are shown by letters that encode the alpha-2 ISO 3166 international standard
codes and are color coded according to geographic region. The median PC for each country/region of birth is shown as a label. Plots were generated using the

PCA has been used this way in genetics for apnas.org


long time, but it
ggplot2 package (65) in the R statistical computing language (59).
1 https://doi.org/10.1073/pnas.2119281119
2 of 11 https://doi.org/10.1073/pnas.2119281119 pnas.org

wasn’t until large datasets, annotated with country of origin, be-


came available, that a peculiar pattern was noticed.
This pattern was first found by Novembre et al. (2008), using
a dataset of 1 387 Europeans, measuring about 200 000 genetic
markers. They applied PCA, and plotted the first two principal
components. They then colored the instances by the person’s an-
cestral country of origin (the country of origin of the grandpar-
ents if available, otherwise the person’s own country of origin).
What they saw was something like a blurry picture of Europe.
The first principal component corresponds roughly to how far
north the person lives (or their ancestors did) and the second
principal component to how far east they live. This means that a
scatterplot shows up as a fuzzy map of Europe. If we sent a thou-
sand DNA samples off to aliens on the other side of the galaxy,
they could work out a rough image of the geography of Earth.

Note that while this result is impressive, it’s easy to misinterpret


what this means. These two principal components together ex-
plain no more than a few percent of the variance. That is, while
we can make a crude prediction of a person’s origin from their
DNA, we can predict almost nothing about their DNA from their
origin. We saw something similar in the example of the Olivetti
data: we could predict people’s gender and age from the first
few principal components, but when we reconstructed the photo-
graph from just this information, the result removed almost all
relevant details about the subject of the photograph.

Wrapping up, what have we learned so far? We have defined


PCA as an iterative optimization problem designed to compress
high dimensional data into fewer dimensions, and to minimize
the resulting reconstruction loss. We’ve shown one simple way
to find this solution, a laborious and inaccurate way, but enough
to get the basic idea across. We then looked at various practical
uses of PCA: analyzing human faces, human fossils, and human
DNA. We showed that in many cases, PCA magically teases out
high-level semantic features hidden in the data: the species of the
fossil, the location of the subject in Europe, or the subject’s age.
What we haven't discussed fully is where this magical property
comes from. We've shown that it isn't just the compression
objective, since optimizing just that in one single optimization
doesn't lead to the PCA solution. Among the set of all solutions
that minimize the reconstruction error, the PCA solution takes a
special place. Why that's the case, and why it should emerge
from optimizing the principal components one by one, greedily if
you will, instead of all together, we will discuss in the next
chapter. To do so, we'll need to dig into the subject of eigenvectors,
the underlying force behind almost everything that is magical
about linear algebra.
CHAPTER 2 · EIGENVECTORS AND EIGENVALUES

In the first chapter, we took what I think is the most intuitive


route to defining PCA: framing the method in terms of recon-
struction error. The solution method we used wasn’t very effi-
cient or stable, and some parts of the “why” question were left
unresolved, but we answered the “how” question well enough to
show the method in action and hopefully convince you that it’s
worth digging a little deeper.
We’ll start by tying up one of the loose ends from the last
chapter. There, we defined PCA in terms of reconstruction error,
but most other explanations instead define it in terms of vari-
ance maximization.
I started with the reconstruction error, since it requires fewer
assumptions, and the required assumptions feel more intuitive.
However, to explain the details of what happens under the hood,
the variance perspective is more helpful, so we’ll start by adding
that to our toolbelt.

2.1 Minimal reconstruction error is maximal variance


In the previous chapter, we defined PCA as a solution to the fol-
lowing problem: for a mean-centered set of instances xi find
a sequence of k unit vectors w1 , . . . , wk where each unit vec-
tor is defined by

$$
\mathbf{w}_r = \begin{cases}
\operatorname*{argmin}_{\mathbf{w}} \sum_i \lVert \mathbf{x}_i^T\mathbf{w}\,\mathbf{w} - \mathbf{x}_i \rVert^2 & \\
\text{such that } \mathbf{w}^T\mathbf{w} = 1 & \text{(a)} \\
\text{and } \mathbf{w}^T\mathbf{w}_j = 0 \text{ for } j \in [1 \ldots r-1] & \text{(b)}
\end{cases}
$$

That is, each unit vector defines an approximation of xi , and the


objective is to keep the distance between the reconstruction and

the original instance as small as possible, while ensuring that (a)


the vector is a unit vector, and (b) the vector is orthogonal to
all preceding vectors.
The variance maximization objective takes the same form, but
instead of minimizing the reconstruction error, it chooses w to
maximize the variance of the data projected onto w.
As in the previous chapter, we’ll start with the one-dimensional
case. We will choose a single unit vector w and project the data
onto it orthogonally. That means that for every instance $\mathbf{x}_i$, we
get a single number $z_i$, defined by $z_i = \mathbf{x}_i^T\mathbf{w}$.
This time, our objective is to choose w so that the variance
among the set of zi is maximized. That means that in this image

we want to choose w so that the red points are spread out as


widely as possible.
Why is this the same as choosing the w that minimizes the
reconstruction error? It's easy to see this if we draw a diagram
for a single instance.

Note that w is a unit vector, so for the length of the bottom edge
of the triangle we have $\lVert\mathbf{x}_i'\rVert = \lVert\mathbf{w}z_i\rVert = z_i$.
By the Pythagorean theorem, $\lVert\mathbf{x}_i\rVert^2 = \lVert\mathbf{r}\rVert^2 + z_i^2$. The vector $\mathbf{x}_i$ remains
constant, since it is given by the data. The only thing
we change is the direction of the vector w. If we change that to
decrease the reconstruction error $\lVert\mathbf{r}\rVert$, the distance $z_i$ must increase.
The sum of the squares of all the $z_i$ (up to the constant factor
$1/n$) is the variance of the projected data.
Thus, the first principal component is the vector w for which
the data has maximum variance when projected onto w. For the
sake of completeness, let’s work this into a proper proof. There’s
some technical stuff coming up, so we had better get into a more
formal mindset.

Equivalence of error and variance optimization. The vector w


that minimizes the reconstruction error among the reconstructed
instances $\mathbf{x}_i'$ maximizes the variance among the projections $z_i$.

Proof. The maximal variance objective can be stated as

$$ \operatorname*{argmax}_{\mathbf{w}} \frac{1}{n}\sum_i (\bar{z} - z_i)^2 . $$

The objective inside the argmax is simply the definition of vari-


ance. We can drop the constant multiplier 1/n, since it doesn’t

affect where the maximum is. We can show that z̄, the mean of
the projections is 0:
$$ \bar{z} = \frac{1}{n}\sum_i \mathbf{w}^T\mathbf{x}_i = \mathbf{w}^T \left( \frac{1}{n}\sum_i \mathbf{x}_i \right) = \mathbf{w}^T\mathbf{0} = 0 . $$

The last step follows from the assumption that the data is mean-
centered.

Thus our objective simplifies to

$$ \operatorname*{argmax}_{\mathbf{w}} \sum_i z_i^2 . $$

From our diagram above, we know that $\lVert\mathbf{x}_i\rVert^2 = \lVert\mathbf{r}\rVert^2 + z_i^2$, or,
equivalently

$$ \lVert\mathbf{x}_i\rVert^2 = \lVert\mathbf{x}_i - \mathbf{x}_i'\rVert^2 + z_i^2 $$
$$ z_i^2 = \lVert\mathbf{x}_i\rVert^2 - \lVert\mathbf{x}_i - \mathbf{x}_i'\rVert^2 . $$

Filling this into the optimization objective, we get

$$ \operatorname*{argmax}_{\mathbf{w}} \sum_i \left( \lVert\mathbf{x}_i\rVert^2 - \lVert\mathbf{x}_i - \mathbf{x}_i'\rVert^2 \right) , $$

where $\lVert\mathbf{x}_i\rVert^2$ is a constant we can remove without affecting the
maximum, and removing the minus sign turns the maximum into a
minimum. Thus, we end up with

$$ \operatorname*{argmin}_{\mathbf{w}} \sum_i \lVert\mathbf{x}_i - \mathbf{x}_i'\rVert^2 , $$

which is the objective for minimizing the reconstruction loss.
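
To make the equivalence tangible, here is a small numerical check, a sketch in Python with numpy (the data and the candidate directions are made up for illustration). For any unit vector w, the projected variance term and the reconstruction error add up to the same constant, the total sum of squares, so maximizing one is minimizing the other.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[3.0, 0.0], [1.0, 0.5]])
X = X - X.mean(axis=0)                  # mean-center the data

for _ in range(5):
    w = rng.normal(size=2)
    w = w / np.linalg.norm(w)           # a random unit vector
    z = X @ w                           # projections z_i = x_i^T w
    X_rec = np.outer(z, w)              # reconstructions x_i' = z_i w
    variance_term = np.sum(z ** 2)
    rec_error = np.sum((X - X_rec) ** 2)
    # sum_i ||x_i||^2 = sum_i z_i^2 + sum_i ||x_i - x_i'||^2:
    print(variance_term + rec_error, np.sum(X ** 2))
```

Running this prints the same pair of numbers for every w.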

The rest of the procedure is the same as before. Once we’ve cho-
sen w1 to maximize the variance, we choose w2 to maximize the

variance and to be orthogonal to w1 , we choose w3 to maximize


the variance and to be orthogonal to w1 and to w2 and so on.
In the previous chapter, we also defined a combined prob-
lem, which combined all the vectors together in one optimiza-
tion objective. We can work out an equivalent for the variance
perspective (the proof is in Section B.2 of the appendix).

Equivalence of combined optimization. The combined problem
for reconstruction error minimization,

$$ \operatorname*{argmin}_{W} \sum_i \lVert\mathbf{x}_i - \mathbf{x}_i'\rVert^2 \quad \text{such that } W^TW = I , $$

is equivalent to the following variance maximization problem:

$$ \operatorname*{argmax}_{W} \sum_{i,r} z_{ir}^2 \;\; \text{with } z_{ir} = \mathbf{w}_r^T\mathbf{x}_i , \quad \text{such that } W^TW = I . $$

If we want to optimize a matrix W with mutually orthogonal unit


vectors for columns in one go, then maximizing the sum of the
variances in all latent directions is equivalent to minimizing the
reconstruction loss defined in the combined approach.
One consequence is that since W = I was a solution of the
error minimization problem (with k = m), it must be a solution
for the variance maximization problem as well.
This tells us something interesting. If we set W = I, then
$\sum_{i,r} z_{ir}^2$ is just the sum of all the variances of all the features
in our data. For all solutions at k = m, this must be the total
variance among the latent features. For solutions to the problem
at some k < m the variance is less than or equal to this value.
Each principal component adds a little variance to the total sum
of variance, and when we have enough principal components to
reconstruct the data perfectly, the sum of the variances along the

principal components equals the sum total variance in the data.


Here’s what that looks like for the Olivetti faces data.

We plot the sum total of the variances of the data as a green


vertical line and the sum total of the PCA solution for increas-
ing k as a red line.

In this case, we have data that is wider than it is tall, so we reach


the ceiling before k = m, when k = n. This is to do with the
rank of the matrix X. We’ll shed some light on this in Chapter 4.
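
Here is a sketch of the same effect in numpy (with random data standing in for the Olivetti faces, and using a fact we'll arrive at later in this chapter: the eigenvalues of the covariance matrix are the variances along the principal components). The cumulative sum of the sorted eigenvalues climbs to the total variance in the data.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 10)) @ rng.normal(size=(10, 10))
X = X - X.mean(axis=0)                  # mean-center the data

S = X.T @ X / len(X)                    # covariance matrix
eigvals = np.linalg.eigh(S)[0][::-1]    # eigenvalues, largest first

total_variance = X.var(axis=0).sum()
for k in range(1, 11):
    # variance retained by the first k principal components
    print(k, eigvals[:k].sum().round(4), total_variance.round(4))
```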

This shows that we can think of the total variance in all direc-
tions in the data space as an additive quantity in the data, which
we can call its information content. The data contains a certain
amount of information and the more latent dimensions we allow,
the more of that information we retain.

Don’t read too much into the word information here. It’s just a
convenient phrase to use. We could relate it to formal notions
of information content, like Shannon’s, but not without some
serious extra work.

If we keep all the dimensions, we retain all the information. If we


start with the first principal component and add components one
by one, we add to the running total of variance, a sum that
eventually reaches the total variance in the data
(much like the reconstruction loss eventually goes to zero). Both
perspectives—retaining variance and minimizing reconstruction
loss—are formalizations of the same principle: that we want to


minimize the amount of information lost in the projection.

2.2 Eigenvectors
Let’s return to this image.

[Figure: the set of solutions to the combined problem, with the solution to the iterative problem as one point inside it.]

These solutions are the same for both perspectives: variance max-
imization and reconstruction error minimization. We have two
unresolved questions about this image.
First, how is it that the solution to the iterative problem (the
PCA solution) reaches the same optimum as the solutions to the
combined approach? Take a second to consider how strange this
is. Solving the iterative problem is a greedy search: once we have
chosen w1 we can’t ever go back. The process for the combined
approach solves all the vectors in sync. How is it that this abil-
ity to tune the vectors in concert doesn’t give us any advantage
in the optimum we find?
The second question, and the question we will answer first,
is what is the meaning of the PCA solution among the combined
solutions? How can we characterize this solution?
The answer to the second question can be summarized in one
phrase:
The principal components are eigenvectors.
Depending on your background, this will raise one of two ques-
tions. The eigenvectors of what? or, more simply What are eigen-
vectors? Let’s start with the second question, and work our way
back to the first.

2.2.1 What are eigenvectors?


The most common, and probably the most intuitive way to think
about matrices is as transformations of points in space. If we have
some vector x and we multiply it by a matrix A, we get a new
point y = Ax. If A is square, then x and y are in the same space.
A good way to visualize this is by domain coloring. We take
a large number of points, arranged in a grid, and we color them
by some image. This could be a simple color gradient, but we
can also choose a photograph or some other image. Following
Wikipedia’s example, we’ll use a picture of the Mona Lisa.

An increasingly fine-grained domain coloring using the Mona


Lisa.

If we apply the transformation A to each of these points, we can


tell what effect the matrix has on this space.


All the points are mapped to a new position by A and poor Lisa

ends up squished and stretched in various directions. Transfor-


mations expressible in a matrix are linear transformations. These
are the transformations for which a line in the original image is
still a line in the transformed image. This means that we can ro-
tate, stretch, squish and flip the image in any direction we like,
but we can’t warp, bend or tear it.
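
In code, this picture is nothing more than a single matrix multiplication applied to every point of the grid at once. A minimal numpy sketch, with an arbitrary matrix A standing in for the transformation:

```python
import numpy as np

A = np.array([[1.0, 0.5],
              [0.2, 1.2]])                  # some linear transformation

# a grid of points covering the unit square, one point per column
g = np.linspace(0, 1, 20)
points = np.stack(np.meshgrid(g, g)).reshape(2, -1)

transformed = A @ points                    # y = Ax for every point at once
print(points.shape, transformed.shape)      # (2, 400) (2, 400)
```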
In this language of transformation, we can very naturally de-
fine what eigenvectors are. The eigenvectors of a square matrix
A are defined as those vectors (i.e. points in the image) for which
the direction doesn’t change under transformation by A.
It’s simplest to see what this looks like for a diagonal matrix.
For instance in the transformation
$$ \mathbf{y} = \begin{pmatrix} 2 & 0 \\ 0 & \tfrac{1}{2} \end{pmatrix} \mathbf{x} $$
the matrix acts independently on the first and second dimensions,
squishing one, and stretching the other.


In this image we’ve also drawn two vectors: one to the middle
of Mona Lisa's left eye, and one to the middle of the right. Since
Leonardo put the right eye dead center in the painting (not by
accident, I imagine), the red vector shrinks, but doesn’t change
direction. The green vector is affected by both the squishing

and the stretching, so its direction and magnitude both change.


Hence, the red vector is an eigenvector, and the green vector isn’t.
In a diagonal matrix, the eigenvectors are always the vectors
that point in the same directions as the axes, so they’re easy
to identify. In general square matrices, finding the eigenvec-
tors is more tricky.

See if you can find an eigenvector for the first transformation,


shown above. One trick is to start with a random vector, and
if it changes direction, transplant it back to the untransformed
image and iterate until the vector doesn’t change anymore.

Formally, a vector v is an eigenvector of A if the following holds


for some scalar $\lambda$:

$$ A\mathbf{v} = \lambda\mathbf{v} . $$

This is just a symbolic version of what we said in words above:


if v is an eigenvector of A, then transforming it changes its mag-
nitude but not its direction, which is the same as saying we can
multiply it by a scalar instead of by a matrix. The scalar corresponding
to the eigenvector, in this case $\lambda$, is called an eigenvalue
of the matrix.

It’s not at all clear from the definition why these vectors should
be meaningful or special. For now, just trust me that eigenvec-
tors are worth knowing about.
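
Here is the definition as a quick numpy check, using the diagonal transformation from the example above: for an eigenvector, Av is just a scaled copy of v; for other vectors, the direction changes.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 0.5]])   # the diagonal transformation from the example

v = np.array([1.0, 0.0])     # axis-aligned, so an eigenvector of A
u = np.array([1.0, 1.0])     # not an eigenvector

print(A @ v)   # [2.  0. ]  -> 2 * v: same direction, eigenvalue 2
print(A @ u)   # [2.  0.5]  -> not a scalar multiple of u
```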

To build your intuition, consider for a second the question of


whether a pure rotation matrix has eigenvectors (the zero vec-
tor doesn’t count). It shouldn’t take long to convince yourself
that a rotation in two dimensions doesn’t. At least, not usually.
There are two exceptions: rotating by 360 degrees (which is just
I) and rotating by 180 degrees. In both cases, every vector is
an eigenvector. In the first with eigenvalue 1, and in the sec-
ond with eigenvalue -1.
In three dimensions, rotation is a different story: try point-
ing straight up in the air and spinning around. The direction
of your arm doesn’t change as you rotate, so your arm is an
eigenvector. Your nose does change direction, so your nose is
not an eigenvector.
We saw that when a matrix is diagonal, its eigenvectors are
aligned with the axes, so they’re easy to find. For other matrices,
we need to do some more work. One trick is to simply transform
the matrix so that the eigenvectors are aligned with the axes,
and then to transform it back again.
This is easiest to understand if we work backwards: given
some eigenvectors, find the transformation for them.
Here are some vectors drawn on top of the Mona Lisa. What
is the transformation for which these vectors do not change direc-
tion?

We have made things easy for ourselves by making the eigen-


vectors orthogonal. This means we can rotate the image to put
the eigenvectors on the axes. We can then do any stretching and
squishing we like along the axes, and rotate the image back.
[Figure: the three steps of the transformation: rotate by 27° ($P^{-1}$), scale along the axes (D), rotate by −27° (P).]

Note how, comparing the first and the last image, the red and
blue vector change their shape, but not their direction.
Any change we make to the direction of the vectors in the
first step is reversed exactly in the last step: the only permanent
change to any directions is made in the middle step. Therefore,
those vectors which don’t change direction in the middle step,
never change direction at all, and must therefore be eigenvec-
tors. These are the vectors that align with the axes in the middle
figure, since the middle transformation is along the axes. There-
fore, the vectors which are mapped to align with the axes by
the first step are the eigenvectors of the complete transformation
consisting of all three steps.
All three of these steps can be expressed as matrix transfor-
mations. We will call the matrix for the last step P. The first step
is then $P^{-1}$, since it needs to undo P exactly. This means that
we must require that P is invertible.

We could also call the first step P and the last step $P^{-1}$, of course,
but it works out more neatly if we call the last step P.

In the example, P is a rotation matrix, but the principle gener-


alizes to all invertible matrices.
We have seen the approach for the middle step already: it is
expressed by a diagonal matrix, which we’ll call D. The compo-
sition of these three steps is our transformation matrix A.

We compose transformations by multiplying their matrices, so we


have:

$$ A = PDP^{-1} $$

Note that for a transformation $A\mathbf{x} = PDP^{-1}\mathbf{x}$, the rightmost
matrix is applied first, so the order of the matrices is reversed
from the steps in the image above.
Now we can work backwards: For any given matrix A, if we
can decompose it in this way, as the product of some invertible
matrix, some diagonal matrix and the inverse of the invertible
matrix, then we know three things:

• The eigenvectors of A are the vectors that are mapped by
$P^{-1}$ to align to the axes. Any change of direction introduced
by $P^{-1}$ is undone by P, so the only vectors whose
direction is unchanged are those mapped to the eigenvectors
of D (i.e. to the axes).

• The eigenvectors are also those vectors to which axis-aligned
vectors are transformed by P.

• The eigenvalues of A are the elements along the diagonal
of D. Any change of magnitude introduced by $P^{-1}$ is undone
by P, so only the changes made by D remain. An
eigenvector mapped to axis i by $P^{-1}$ is scaled by $D_{ii}$, which
is therefore the corresponding eigenvalue.

For a given matrix A, finding a diagonal matrix D and an invert-


ible matrix P so that $A = PDP^{-1}$ is called a diagonalization or
an eigendecomposition of A.
So, given a diagonalization of A, which are the eigenvectors?
We use the second bullet point above. If we take an axis-aligned
vector and transform it by P, the result is an eigenvector.
We have one eigenvalue per axis, so we’ll look for one eigen-
vector for each. For the vectors to feed to the transformation,
we can just take axis-aligned unit vectors (also known as one-
hot vectors). Each will transform to an eigenvector. We can do
the transformation for all vectors in one go by concatenating the
vectors as the columns of one matrix. For the unit vectors this

simply results in the identity matrix I, and for the eigenvectors,


this results in a matrix we will call E. So we are looking for the
matrix E for which

$$ PI = E . $$

Removing I tells us that the eigenvectors we are looking for are


simply the columns of P.
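
Here is that recipe as a numpy sketch (the rotation angle and the scaling factors are made up): we build A = PDP⁻¹ from a rotation P and a diagonal D, and check that a column of P is indeed an eigenvector of A, with the corresponding diagonal entry of D as its eigenvalue.

```python
import numpy as np

theta = np.deg2rad(27)
P = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation, the last step
D = np.diag([2.0, 0.5])                           # scaling along the axes

A = P @ D @ np.linalg.inv(P)                      # A = P D P^-1

v = P[:, 0]                    # first column of P, the claimed eigenvector
print(A @ v)                   # equals 2 * v ...
print(D[0, 0] * v)             # ... so Av = D_11 v
```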

2.2.2 The spectral theorem


Note that we were careful above to say if A can be diagonal-
ized. Not all square matrices can be diagonalized. A theorem
showing if and when a particular class of transformations can be
diagonalized is called a spectral theorem.

The set of eigenvalues of a matrix is sometimes called its spec-


trum, so methods and results using these principles often use
the adjective spectral. For instance, we have spectral clustering,
spectral graph theory and spectral ODE solvers.

There are many spectral theorems, but we'll only need the simplest:
the spectral theorem for symmetric matrices. A symmetric
matrix is a square matrix A which is symmetric across the diagonal.
That is, it has $A_{ij} = A_{ji}$, or equivalently $A^T = A$. We'll call
this the spectral theorem in the context of this book.
To state the theorem, we first need to define orthogonal ma-
trices. These are square matrices in which all columns are mu-
tually orthogonal unit vectors.
This should sound familiar: it's the constraint we placed on our
principal components in the previous chapter. In the combined
problem, the constraint $W^TW = I$ is simply a requirement that
the matrix W be orthogonal.
Why? Remember that matrix multiplication is just a collec-
tion of all dot products of rows on the left with columns on the
right, so in this case all columns of W with all other columns
of W. On the diagonal of W T W, we find all dot products of
columns of W with themselves, which are all 1, because they are
all unit vectors. Off the diagonal we find all dot products of all
columns with other columns. These are all zero, because they

are all mutually orthogonal.

Geometrically, orthogonal matrices represent those transforma-


tions that do not change the magnitude of any vector: that is,
rotations and reflections.

A very useful property of orthogonal matrices is that their inverse


is equal to their transpose: $W^{-1} = W^T$. This follows directly
from the fact that $W^TW = I$, because the inverse of W is defined
as a matrix $W^{-1}$ such that $W^{-1}W = I$.
This property makes orthogonal matrices especially nice to
work with, since we can take the inverse—usually a costly and
numerically unstable operation—by flipping the indices around,
which we can do in constant time, with no numerical instability.
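
A quick numerical illustration (a numpy sketch with a rotation matrix): QᵀQ is the identity, so the transpose acts as the inverse without ever calling a matrix-inversion routine.

```python
import numpy as np

theta = np.deg2rad(40)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthogonal: a rotation

print(np.allclose(Q.T @ Q, np.eye(2)))            # True: Q^T Q = I
print(np.allclose(Q.T, np.linalg.inv(Q)))         # True: Q^T = Q^-1
```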
We can now state the spectral theorem.

The spectral theorem. Call a matrix A orthogonally diagonaliz-


able if it is diagonalizable with the additional constraint that P is
orthogonal:

$$ A = PDP^T . $$
A matrix A is orthogonally diagonalizable if and only if A is sym-
metric.

Proving this now would require us to discuss too many extra con-
cepts that aren’t relevant for this part of the story. On the other
hand, this theorem is very much the heart of PCA: everything it
is and can do follows from this result. We’ll take it at face value
for now, and answer the rest of our questions. The next chapter
will be entirely dedicated to proving the spectral theorem.
For now, just remember that if we have a square, symmet-
ric matrix, we can diagonalize it with an orthogonal matrix P
and a diagonal matrix D. The diagonal elements of D will be
the eigenvalues and the columns of P will be the correspond-
ing eigenvectors.
Note that the spectral theorem implies that there are n eigen-
values (since D has n diagonal values). Some of them might be
zero, but we need not worry about that at the moment. In up-

coming chapters, we’ll develop some concepts that help us char-


acterize what it means for eigenvalues to be zero.
Finally, notice that for any such diagonalization, we can shuf-
fle the eigenvalues around and get another diagonalization (we
just have to shuffle the columns of P in the same way). Since the
ordering of the eigenvalues in D is arbitrary, we usually sort them
from largest to smallest, calling the eigenvector with the largest
eigenvalue the first eigenvector and the one with the smallest the
last. As you may expect, we'll
see later that these match the ordering of the principal compo-
nents. We’ll call the decomposition with the eigenvectors ordered
like this, the canonical orthogonal diagonalization.
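
In practice, we don't prove anything to compute this decomposition; we call a library routine. A numpy sketch with a made-up symmetric matrix: np.linalg.eigh returns the eigenvalues in ascending order, so we flip them (and the columns of P with them) to get the canonical ordering.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])              # symmetric, so the theorem applies

eigvals, P = np.linalg.eigh(A)          # ascending eigenvalues, orthogonal P
eigvals, P = eigvals[::-1], P[:, ::-1]  # reorder: largest eigenvalue first

D = np.diag(eigvals)
print(np.allclose(A, P @ D @ P.T))      # True: A = P D P^T
print(np.allclose(P.T @ P, np.eye(2)))  # True: P is orthogonal
```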

2.2.3 The eigenvectors of which matrix?


Let’s get back to the PCA setting. Where do we find eigenvectors
in this picture? We have a matrix, the data matrix X, but it isn’t
square, and it’s never used as a transformation.
In fact, the eigenvectors that end up as the principal compo-
nents are the eigenvectors of the covariance matrix of our data X.

The principal components are the eigenvectors of the


covariance matrix.

Let’s start by reviewing what a covariance matrix is. When we


talk about one-dimensional data, we often discuss the variance:
a measure for how spread out the numbers are. We can think
of this as a measure of predictability. The more spread out the
points are, the more difficult it is to predict where a randomly
sampled point might be. If the variance is very small, we know
any point is very likely to be near the mean of the data. If it’s
very large, we are less sure.
The covariance matrix is the analogue to this for m-dimensional
data, like our dataset X. It tells us not just how spread out the
points are along the axes (the variance of each feature) but also
how spread out the points of one feature are, given the value
of another feature.

Consider the following 2D dataset:

The variance for both features is 1, so the data is pretty spread


out. It has high variance, and is therefore relatively unpredictable.
However, if I know the value of feature 1, suddenly the likely val-
ues of feature 2 become much narrower.
This is because the data has high covariance between the two
features: knowing the value of one, tells us a lot about the value
of another. Another way of saying this is that the two features
are highly correlated. Here are the different ways data can be
linearly correlated in 2D.

Three ways data can be correlated. The bottom row shows the
covariance matrices of each dataset (explained below).

Pay particular attention to the middle example: perfectly decorre-


lated data. In such data, the features are independent: knowing
the value of one tells us nothing about the value of the other. This
is an important property of good latent features. For instance,
in the Olivetti data from the last chapter, many of the observed
features (the pixel values) were highly correlated, but the latent
features we extracted by PCA (the gender, the age, the lighting)
were largely decorrelated. If the data is not biased, knowing the
age of a subject shouldn’t tell you anything about the way they
were lit or how feminine they appear.
The formula for the variance of feature j, as we've seen before,
is

$$ \operatorname{Var}_X(j) = \frac{1}{n}\sum_i (\bar{x}_j - X_{ij})^2 , $$

where $\bar{x}_j$ is the mean of feature j, which is 0 if the data is mean-centered.
The covariance between features j and k is defined as

$$ \operatorname{Cov}_X(j,k) = \frac{1}{n}\sum_i (\bar{x}_j - X_{ij})(\bar{x}_k - X_{ik}) . $$

These are both estimates. The distribution from which the data
was sampled has some invisible (co)variance, which we estimate
from the data by these formulas. For a maximum likelihood
estimate, we divide by n, for an unbiased estimate by n - 1.
For large data, the difference is negligible, so I’ll use the first to
keep the formulas simple.

For mean-centered data, these simplify to

$$ \operatorname{Var}_X(j) = \frac{1}{n}\sum_i X_{ij} X_{ij} $$
$$ \operatorname{Cov}_X(j,k) = \frac{1}{n}\sum_i X_{ij} X_{ik} . $$

Note two things about these equations:

1. The variance is just the covariance of a feature with itself:
$\operatorname{Var}_X(j) = \operatorname{Cov}_X(j, j)$.

2. If we ignore the multiplier $\frac{1}{n}$, the covariance is the dot
product of one column of X with another.

This means that if we make a big matrix with all covariances


between features j and k, we can compute that matrix by a sim-
ple matrix multiplication:

$$ \operatorname{Cov}(X) = \frac{1}{n} X^T X $$
[Figure: computing the covariance matrix as the matrix product $X^T X$ (up to the factor $\frac{1}{n}$); the diagonal of S holds the variances, the off-diagonal entries the covariances.]

This matrix is symmetric, since Cov(j, k) = Cov(k, j), and it has


the variances of the various features along the diagonal.

For any matrix X of any size, $M = X^TX$ is square and symmetric:
$M_{ij} = M_{ji}$, because both values are the dot product of
columns i and j in the data.
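
As a numpy sketch (with random data for illustration), the covariance matrix computed by this single matrix multiplication matches the entry-by-entry definition:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 3))
X = X - X.mean(axis=0)            # mean-center first

S = X.T @ X / len(X)              # Cov(X) = (1/n) X^T X

# matches numpy's covariance (bias=True divides by n rather than n - 1)
print(np.allclose(S, np.cov(X, rowvar=False, bias=True)))   # True
print(np.allclose(np.diag(S), X.var(axis=0)))               # True
```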

We’ll denote the covariance matrix of our dataset X by S . This is


the matrix that we’re interested in: the eigenvectors of S coincide
with the principal components of X.
I expect that that doesn’t immediately make a lot of intuitive
sense. We’ve developed eigenvectors in terms of matrices that
transform points in space. We don’t usually think of S as trans-
forming space. It’s not common to see a vector multiplied by S.
Yet, we can easily diagonalize S. In fact, since it’s symmetric,

the spectral theorem tells us that we can diagonalize it with an


orthogonal matrix, and we can be sure that it has n eigenvalues.
To develop some intuition for what these eigenvalues mean we
can have a look at the common practice of data normalization.

2.3 Data normalization and basis transformations


Data normalization is a very common data pre-processing step
in many data science processes. For many applications we don’t
much care about the natural scale of the data, and we instead
want the data to lie in a predictable range. For one-dimensional
data, one way to do this is to rescale the data so that its mean
equals 0, and its variance equals 1. We achieve this easily by
first shifting the data so that the mean is at 0, and then scaling
uniformly until the variance is 1.
To work out how to make this transformation, we can imag-
ine that our data originally had mean 0, and variance 1, and
was then transformed by scaling and then adding some constant
value. That is, every point x we observed was derived from an
unseen point z by two parameters s and t as

x = sz + t
with the z’s having mean 0 and variance 1. We will call z the
hidden or latent variable behind the observed variable x.
After scaling by s, the mean of the ”original” data is still 0, so
we should set t = x̄ to end up with the correct mean for our ob-
served data. To work out s, we move this term to the other side:

x - x̄ = sz .
The left-hand side is the mean-centered data, and the right hand
side is a scaled version of the latent data. Since variance isn’t
affected by the additive term, we get

$$ \operatorname{Var}(\{x\}) = \frac{1}{n}\sum_z (sz)^2 = s^2 \frac{1}{n}\sum_z z^2 = s^2 \cdot \operatorname{Var}(\{z\}) = s^2 $$

So, to end up with the correct variance for x, we should set the
square of s equal to the data variance, or equivalently, we should

set s equal to the data standard deviation (the square root of


the variance). So, the correct normalization is:
x - x̄
x0 = .

This may seem like an overly elaborate way to derive a pretty


intuitive normalization, but we will generalize this approach to
higher dimensions later, so it pays to understand the steps.
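
The one-dimensional recipe, as a short numpy sketch (the scale and shift of the data are made up):

```python
import numpy as np

rng = np.random.default_rng(3)
x = 4.2 * rng.normal(size=1000) + 10.0   # scaled and shifted data

x_norm = (x - x.mean()) / x.std()        # x' = (x - mean) / std. dev.

print(x_norm.mean())   # ~0
print(x_norm.var())    # ~1
```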

Instead of thinking of this operation as moving the data around,


we can also think about it as keeping the data where it is, but just
expressing it in different axes. We move the origin to coincide
with the data mean, and then scale the unit (the length of the
arrow from 0 to 1) so that its tip lies at the point $\bar{x} + \sigma$. On this
new axis, the data is normalized.

[Figure: the dataset {x} shown on the standard axis and on a new axis whose origin is the data mean and whose unit is the standard deviation.]

We can see the operation of normalizing our one-dimensional


data as simply expressing the same points on a different axis.
We change the location of the origin and the length of the
unit, and our data is normalized.

In higher dimensions, the units we use to express points in space


are often called a basis. We take a bunch of vectors b1 , . . . , bk ,
called the basis vectors and express points as a sum of the basis
vectors, each multiplied by a particular scalar, where the scalars
are unique to the point we are expressing.

Strictly speaking, for the set b1 , . . . , bk to be a basis, the vectors


should also be linearly independent. For our current purposes,
we don’t need to define this so precisely. We’ll discuss linear
independence in more detail in a later chapter.

This is how our standard coordinate system works as well: in


three dimensions, the basis vectors are e1 = (1, 0, 0), e2 =
(0, 1, 0) and e3 = (0, 0, 1). When we refer to a point p with
the coordinates (7, 3, 5), we are implicitly saying that
$$ \mathbf{p} = 7 \times \begin{pmatrix}1\\0\\0\end{pmatrix} + 3 \times \begin{pmatrix}0\\1\\0\end{pmatrix} + 5 \times \begin{pmatrix}0\\0\\1\end{pmatrix} $$
This is called the standard basis. It’s a little elaborate for some-
thing so familiar, but it shows a principle we can apply for other
sets of basis vectors. With any set of vectors B = {b1 , . . . , bk }, we
can describe a point by writing down a vector pB , and computing

$$ \mathbf{p} = (p_B)_1 \mathbf{b}_1 + \ldots + (p_B)_k \mathbf{b}_k $$

Here, p are the coordinates of the point in the standard basis,


and pB are the coordinates in the basis B.

$$ \begin{pmatrix}3\\2\end{pmatrix} = 3\begin{pmatrix}1\\0\end{pmatrix} + 2\begin{pmatrix}0\\1\end{pmatrix} = 2\begin{pmatrix}0.9\\0.4\end{pmatrix} + 1.7\begin{pmatrix}0.7\\0.7\end{pmatrix} $$

The point (3, 2) as expressed in our standard basis becomes
the point (2, 1.7) when expressed in the basis defined by vectors
$\mathbf{b}_1 = (0.9, 0.4)$ and $\mathbf{b}_2 = (0.7, 0.7)$.

If we concatenate the basis vectors into the columns of a matrix



B, we can express this transformation as a simple matrix multipli-


cation:

$$ \mathbf{p} = B\mathbf{p}_B $$

This also suggests how to transform in the other direction, from a
point in the standard basis to a point in the basis B: we require
that B is invertible and use

$$ \mathbf{p}_B = B^{-1}\mathbf{p} . $$
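
A numpy sketch of both directions, using the basis from the figure above. Note that it solves a linear system rather than forming B⁻¹ explicitly, for the numerical reasons discussed just below.

```python
import numpy as np

B = np.array([[0.9, 0.7],
              [0.4, 0.7]])          # columns are the basis vectors b1, b2

p_B = np.array([2.0, 1.7])          # coordinates in the basis B
p = B @ p_B                         # standard-basis coordinates: ~(3, 2)
print(p)

print(np.linalg.solve(B, p))        # back to basis-B coordinates: (2, 1.7)
```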

The set of points you can express in a particular basis is called


its span. In the image above, the span is the same as that of
the standard basis, but if you define two basis vectors in a three-
dimensional standard basis, their span would usually be some
plane crossing the origin.
If you want to actually compute the transformation into the
basis, the computation of a matrix inverse is finicky and very
likely to be numerically unstable. It’s nice if you can ensure
that $B^{-1} = B^T$. We've seen matrices with this property already:
they’re called orthogonal matrices. To refresh your memory, or-
thogonal matrices are square matrices with columns that are all
unit vectors and all mutually orthogonal.

A basis expressed by an orthogonal matrix is called an orthonor-


mal basis. It’s a rotated or flipped version of the standard basis,
but the basis vectors are still all orthogonal and they are still
all unit vectors.

We can now say that our data normalization was nothing more
than a simple basis transformation in $\mathbb{R}^1$. We mean-center the
data, and replace the standard basis vector by one that matches
the variance of our data. This is not an orthonormal basis, but
we’ll see a fix for that later.
More importantly, we can translate the idea of data normal-
ization to higher dimensions.
In $\mathbb{R}^1$, we were after a basis in which the variance was 1. In
$\mathbb{R}^n$ we will look for a basis in which the covariance is I.

This requirement has two consequences:


• In our new coordinates, the variance is 1 along all axes.
• In our new coordinates all covariances are 0. That is, the
data is perfectly decorrelated.

This kind of normalization is called whitening (because stan-


dard normally distributed noise is sometimes called white noise).
It’s not usually necessary in data science, but it can be a very
powerful preprocessing method if you can spare the required re-
sources. We’re primarily discussing it as a way of making intu-
itive what is happening under the hood of PCA.

We’ll proceed the same way we did before: we will imagine that
our data was originally of this form, and has been transformed by
an affine transformation. We’ll call the matrix for this imagined
“original” data Z. This means that we assume that X was produced
by sampling Z from a standard normal distribution and
transforming it as

$$ X^T = AZ^T + \mathbf{t} $$
with some A and t.
As before, we will first figure out which A and which t will
take us from the latent data Z to our observed data X, and then
we will invert this transformation to find the transformation that
normalizes our data.
The logic for t is the same as it was before: since Z has zero
mean, it still has zero mean after being transformed by A. If we
set t = x̄, we transport this to the mean of the observed data.
We move this term to the left-hand side

$$ X^T - \bar{\mathbf{x}} = AZ^T $$
and observe that the mean-centered data on the left is equal to
our A-transformed latent data.
Now, we need to set A to achieve the right covariance. The co-
variance is unaffected by the additive term -x̄, so we can ignore
that. The covariance of the transformed data is:
$$ \operatorname{Cov}(X) = \frac{1}{n} AZ^T (AZ^T)^T = A \left( \frac{1}{n} Z^T Z \right) A^T = A\operatorname{Cov}(Z)A^T = AA^T . $$
Where previously we needed to choose our scalar s so that its
square was equal to the data variance $\sigma^2$, we now need to choose
our transformation matrix A so that its “square” $AA^T$ is equal
to the data covariance S.
If we find such an A, we know that its transformation is what
maps the decorrelated data to the data we’ve observed. So even
though we never transform any points by the covariance ma-
trix, we see that internally, it does contain a very natural trans-
formation matrix.
There are a few ways to find A for a given S. The Cholesky
decomposition is the most natural analogue to the square root
we used in the 1D case. This road leads to a technique known
as Cholesky whitening.
But this is not a book about whitening, it's a book about PCA.
We're trying to build some intuition for what PCA is doing. So
instead, we'll solve $S = AA^T$ using the orthogonal diagonalization
we developed earlier, which will lead us to a method called PCA
whitening (a kind of byproduct of the PCA analysis).
We know that S is square and symmetric, so we know it can
be orthogonally diagonalized:

$$ S = PDP^T . $$
To turn this into a solution to $S = AA^T$ we need two factors, with
the second the transpose of the first. We can do this easily by noting
two things about the diagonal matrix in the middle. First, the
square root $D^{\frac{1}{2}}$ of a diagonal matrix D is just another diagonal
matrix with the square roots of the original elements along the
diagonal. This gives us $D = D^{\frac{1}{2}} D^{\frac{1}{2}}$. Second, the transpose of a
diagonal matrix is the same matrix, so that $D = D^{\frac{1}{2}} (D^{\frac{1}{2}})^T$. Thus

$$ S = P D^{\frac{1}{2}} (D^{\frac{1}{2}})^T P^T = AA^T \quad\text{with } A = PD^{\frac{1}{2}} . $$

Finally, to whiten our data, we reverse the transformation from


Z to X and get
$$ Z = A^{-1}(X - \mathbf{t}) = D^{-\frac{1}{2}} P^T (X - \mathbf{t}) . $$

So, to map our data to a zero-mean, unit-variance, decorrelated


form, we map to the basis formed by the eigenvectors of S and
then divide along each axis by the square root of the eigenvalues.
We can see here that the eigenvalues of the covariance matrix are
the variances of the data along the eigenvectors (remember that
we divided by the square root of the variance before).
Taking the alternative perspective we took above, we can also
keep the data where it is and change our basis. We scale the
standard basis along the axes by $D^{\frac{1}{2}}$, rotate by P, and translate by
$\bar{\mathbf{x}}$. In the resulting axes, our data has mean 0 and covariance I.

Note how we’re having our cake and eating it too. We are scaling our axes to control the variance, so we can’t have an orthonormal basis, but the eigendecomposition breaks the basis transformation into two steps: first an orthonormal basis transformation, which allows us to use P^T instead of P^{-1}, and then a scaling along the new axes by the square roots of the eigenvalues.

2.4 Quadratic forms


Have we fully married our first intuition about eigenvectors in
transformation matrices with the role eigenvectors play in PCA,
as the eigenvectors of S? Not quite. We’ve shown that S is
in some sense composed of a very important transformation A,
which transforms decorrelated data with unit variance to have
covariance S, but the eigenvectors we’re using are not the eigen-
vectors of A. Rather, A is made up of our eigenvectors and may
itself have different eigenvectors, or no (real) eigenvectors at all.

We will see in Chapter 4 that the eigenvectors of S are called the singular vectors of A.

To develop an intuition for how S operates on space, it’s more helpful not to look at the linear form

Sx

but at the quadratic form

x^T S x .
This may look mysterious, but it’s just a concise way of writing second-order polynomials in n variables (just like Mx is a concise way of writing a linear function from n to m variables). For instance,

x^T \begin{pmatrix} 2 & 3 \\ 4 & 5 \end{pmatrix} x = 2 x_1^2 + 3 x_2 x_1 + 4 x_1 x_2 + 5 x_2^2 .
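As a quick numeric sanity check of this expansion (a sketch, with an arbitrary test vector):

import numpy as np

M = np.array([[2.0, 3.0],
              [4.0, 5.0]])
x = np.array([0.7, -1.2])            # an arbitrary test vector

quadratic = x @ M @ x                # x^T M x
polynomial = 2*x[0]**2 + 3*x[1]*x[0] + 4*x[0]*x[1] + 5*x[1]**2
print(quadratic, polynomial)         # the two values agree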
The simplest quadratic is x^T I x, or just x^T x. If we set this equal to 1, the points that satisfy the resulting equation are the unit
vectors. In 2 dimensions, these form a circle called the bi-unit circle. In higher dimensions, the resulting set is called the bi-unit (hyper)sphere.
There are two ways to use quadratic forms to study the eigenvectors of S. The first is to look at x^T S x and to study what this function looks like when constrained to the bi-unit sphere. Looking only at the points for which x^T x = 1, what happens to the parabola x^T S x?

The bi-unit circle is deformed by the parabola x^T \begin{pmatrix} 2 & 3 \\ 4 & 5 \end{pmatrix} x.
If we diagonalize S = PDP^T, the quadratic becomes x^T PDP^T x. The first two and last two factors are just a change of basis, so we can also write z^T D z with z = P^T x. Since P is orthogonal, the change of basis doesn’t change the length of the vectors, and the constraint that x should be a unit vector is equivalent to requiring that z be a unit vector.
The quadratic form z^T D z is particularly simple, because D is diagonal. We simply get

z^T D z = z_1 z_1 D_{11} + z_2 z_2 D_{22} + ... + z_m z_m D_{mm} .

This sum is very important. Note that all the factors z_r z_r are not only nonnegative (because they’re squares), but they also sum to one, since ‖z‖^2 = z_1^2 + ... + z_m^2 is the squared length of the vector z, which we constrained to 1.
We’ll call this a weighted sum: a sum over some set of numbers where each term is multiplied by a nonnegative weight, so that the weights sum to 1.
In the next section, we will use this sum to prove just about
every open question we have left. For now, just notice what hap-
pens when x is an eigenvector. In that case, z is a one-hot vector,
because Pz = x, and only one of the terms in the sum is non-zero.
This is one way to think of the quadratic form of S: it defines m orthogonal directions in space (the eigenvectors), at which the quadratic takes some arbitrary value (the eigenvalues). For all other directions x, the quadratic is a weighted mixture of the eigenvalues, with the weights determined by how much of x projects onto the corresponding eigenvectors.

Looking at the sum above, you may be able to figure out what
the minimum and maximum values are of the quadratic form
xT Sx under the constraint that xT x = 1. Note the similarity of
this optimization problem to the PCA problem. If you don’t see
it yet, don’t worry. We’ll dig deeper into this in the next section.
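If you want to see the weighted-sum behavior numerically, here is a sketch (with an arbitrarily chosen symmetric S) that evaluates x^T S x over the bi-unit circle; the minimum and maximum land on the smallest and largest eigenvalues, attained at the eigenvectors.

import numpy as np

S = np.array([[3.0, 1.0],
              [1.0, 2.0]])                       # a symmetric matrix
angles = np.linspace(0, 2*np.pi, 10_000)
xs = np.stack([np.cos(angles), np.sin(angles)])  # points on the bi-unit circle

values = np.einsum('it,ij,jt->t', xs, S, xs)     # x^T S x for each point
print(values.min(), values.max())                # close to the eigenvalues

evals, evecs = np.linalg.eigh(S)
print(evals)                                     # [smallest, largest]
print(xs[:, values.argmax()], evecs[:, -1])      # max attained at an eigenvector (up to sign)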

For another geometric interpretation of the eigenvectors of S, think back to the one-dimensional example of normalizing data.
In the normalized version of the data, the variance is equal to
1. This means that, for most distributions, we can be sure that
the majority of the data lies in the interval (-1, 1). This is called
the bi-unit interval, since it is made up of two units around the
origin. If our data is normally distributed, this interval captures
about 68% of it after normalization. The transformation by t and s maps this interval to an interval that captures the same proportion of the unnormalized data.
In higher dimensions, the analogue of the bi-unit interval is the bi-unit sphere, the set of all points that are at most 1 away from the origin. To follow the analogy, we can transform the bi-unit sphere, which captures the majority of Z, by A so that we capture the majority of the observed data X.

The bi-unit circle captures the majority of the normalized data. The ellipse we create by transforming the circle by A (and translating to the mean) captures the same majority of the unnormalized data.

In two dimensions, the transformation by A and t that we derived above is the transformation that maps the bi-unit circle to
an ellipse which captures the majority of the data. In more than
two dimensions, we’re mapping a hypersphere to an ellipsoid.
Note that the standard basis vectors are mapped to the eigenvec-
tors of S. We call this new basis, in which the data is normal-
ized, the eigenbasis of S.
To work out the shape of the ellipsoid in quadratic form, we just start with the set of all unit vectors C = {u : u^T u = 1} and transform each by A individually (much like we transformed the Mona Lisa earlier).

AC = {Au : u^T I u = 1}
   = {y : u^T u = 1 and y = Au}
   = {y : (A^{-1} y)^T (A^{-1} y) = 1}
   = {y : y^T S^{-1} y = 1}

That is, to transform the bi-unit circle to an ellipsoid that covers the same proportion of X as the circle did of Z, we turn the equation u^T u = 1 into the equation y^T S^{-1} y = 1. This gives us a quadratic form that describes the ellipsoid that covers the majority of our data.
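A small sketch to confirm the inversion numerically (our own example matrices): we transform points u on the bi-unit circle by an arbitrary A and check that the results satisfy y^T S^{-1} y = 1.

import numpy as np

A = np.array([[2.0, 0.5],
              [0.3, 1.5]])
S = A @ A.T
S_inv = np.linalg.inv(S)

angles = np.linspace(0, 2*np.pi, 7)
U = np.stack([np.cos(angles), np.sin(angles)])  # unit vectors u
Y = A @ U                                       # points on the transformed ellipse

print(np.einsum('it,ij,jt->t', Y, S_inv, Y))    # all (close to) 1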
Why do we get an inversion from S to S^{-1}? Follow the transformation along the first eigenvector of S. In this direction, A stretches the input unit vector by the square root of the first eigenvalue (remember A = PD^{1/2}), so its squared length grows by a factor of the first eigenvalue. For the transformed vector to still satisfy the equation, the quadratic form should therefore divide by the first eigenvalue: the square of that stretch. In other words, if we want to define a quadratic form for which the arguments transformed by A stay solutions, then the more the value of x^T S x grows, the more our quadratic form should shrink, and vice versa.

Note that the inverse of a symmetric matrix has a particularly simple expression in terms of its orthogonal diagonalization: S^{-1} = (PDP^T)^{-1} = PD^{-1}P^T. That is, the eigenvectors stay the same, and we just take the reciprocals of the eigenvalues.
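In numpy, this identity is a one-line check (sketch):

import numpy as np

S = np.array([[3.0, 1.0],
              [1.0, 2.0]])
evals, P = np.linalg.eigh(S)

S_inv = P @ np.diag(1.0 / evals) @ P.T       # same eigenvectors, reciprocal eigenvalues
print(np.allclose(S_inv, np.linalg.inv(S)))  # True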

2.5 Why is PCA optimal?

With the quadratic form added to our toolbox, we can finally start answering some of the deeper questions, both visually, using transformations of ellipses, and formally, using the language of eigenvectors.
Let’s start simply: why does the first principal component co-
incide with an eigenvector of S? And, while we’re at it, which
eigenvector does it coincide with?
Visually, this is easy to show. In the image above, we plotted the bi-unit circle, and its transformation into an ellipse that covers the same part of the observed data. The standard basis vectors are mapped to the eigenvectors. Note that these become the axes of the ellipse. One of the standard basis vectors is mapped to the ellipse’s major axis, the direction in which it bulges the most. The direction in which the data bulges the most is also the direction of greatest variance, and therefore the first principal component. The proof of this fact is not very complex.

First eigenvector. The first principal component is the eigenvector of the covariance matrix S with the largest eigenvalue.

Proof. The first principal component w_1 is defined as

argmax_w Σ_i (w^T x_i)^2 such that w^T w = 1,

that is, the direction in which the variance of the projected data
is maximized.
Rewriting the objective function, we get

Σ_i (w^T x_i)^2 = Σ_i w^T x_i x_i^T w
= w^T (Σ_i x_i x_i^T) w
= w^T X^T X w
= N w^T S w .

This means that the direction in which the sum of squared projections is maximized is the direction, represented by a unit vector w, for which the quadratic form w^T S w is maximal.
If we orthogonally diagonalize S, with the eigenvalues canonically arranged, we get

w^T S w = w^T PDP^T w = z^T D z with z = P^T w .

In the last step, we’ve simplified to a diagonal quadratic form in the eigenbasis of S. This quadratic form simplifies to

z^T D z = z_1^2 D_{11} + ... + z_m^2 D_{mm}

where the constraint that z is a unit vector means that z_1^2 + ... + z_m^2 = 1. In other words, this is a weighted sum over the
diagonal elements of D. To maximize a weighted sum, we assign
weight 1 to the largest element and weight 0 to the rest. Since we
took D11 to be the largest eigenvalue, the vector ẑ = (1, 0, . . . , 0)
maximizes the quadratic form.
Mapping back to the standard basis, we get w1 = Pẑ. That is,
the first column of P, which is the first eigenvector of S.
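We can verify this numerically. The sketch below (our own, with hypothetical data) brute-forces the variance maximization over many random unit vectors and compares the winner with the top eigenvector of S.

import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((500, 3)) @ rng.standard_normal((3, 3))  # some correlated data
X -= X.mean(axis=0)
S = (X.T @ X) / len(X)

# brute force: try many random unit vectors w, keep the best
W = rng.standard_normal((3, 100_000))
W /= np.linalg.norm(W, axis=0)
variances = np.einsum('it,ij,jt->t', W, S, W)    # w^T S w for each candidate
w_best = W[:, variances.argmax()]

evals, P = np.linalg.eigh(S)                     # eigenvalues in ascending order
print(w_best, P[:, -1])                          # agree approximately, up to sign
print(variances.max(), evals[-1])                # best variance ≈ largest eigenvalue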

We can extend this proof to show that all the other PCs are eigen-
vectors as well.

PCs as eigenvectors. The k-th principal component of X is the k-th eigenvector of the covariance matrix S.

Proof. For the first principal component w_1, the previous theorem provides a proof. For w_2, follow the previous proof until the weighted sum

z^T D z = z_1^2 D_{11} + ... + z_m^2 D_{mm} .
First, note that any vector w′ that is orthogonal to w_1 must also be orthogonal after transformation by P^T:

0 = w_1^T w′ = w_1^T P P^T w′ = z_1^T z′ .

Thus, the second principal component is orthogonal to the first (as required) if and only if their projections by P^T are orthogonal as well (and similarly for higher principal components).
Recall that the z-vector of the first principal component is (1, 0, ..., 0), so to be orthogonal, the z-vector corresponding to the second principal component must have 0 as its first element. Since the D_{jj} are arranged in decreasing order, we maximize the sum under this constraint with the vector ẑ = (0, 1, 0, ..., 0). Pẑ selects the second column of P, so the second principal component coincides with the second eigenvector.
The same logic holds for the other principal components. For each component r, we must set all weights z_i^2 with i < r to zero in order for the z-vector to be orthogonal to all principal components already chosen. In the remainder of the sum, we maximize the sum with the one-hot vector for position r, which selects the r-th eigenvector.

We have finally shown that the eigenvectors are the vectors that
maximize the variance.
There is one question left to answer: Why is the set of the first
k principal components a solution to the combined problem at k?

Optimality of PCA. For any k, a solution to the iterative problem is a solution to the combined problem.

Proof. Let W be a candidate solution to the combined problem at k. Let z_1, ..., z_k be the columns of P^T W, that is, the solution vectors expressed in the eigenbasis of S. The total variance captured by all these z is

Σ_r z_r^T D z_r = Σ_{r,j} z_{rj}^2 D_{jj} ,

where z_{rj} is the j-th element of vector z_r. These are k weighted sums, summed together. We can group the weights that each z_r contributes to each eigenvalue D_{jj} into a single sum:
Σ_{r,j} z_{rj}^2 D_{jj} = Σ_j D_{jj} Σ_r z_{rj}^2 .

Now, since the different z_r’s are orthogonal, and each of unit length, their combined contribution to each D_{jj} must be no more than 1, and the weights of each individual z_r sum to 1.
Why? Imagine the matrix Z with the z_r for its columns. What we’re saying is that the squared elements of Z should sum to 1 over the columns and not exceed 1 when summed over the rows.
If we take Z^T Z, the squared and summed columns end up along the diagonal. We know Z^T Z = I, so these are all 1.
If we take ZZ^T, the squared and summed rows end up along the diagonal. Extend Z with orthogonal unit vectors until it is square and orthogonal. Then ZZ^T = I. Each 1 on the diagonal of I is the result of a sum of squared values. Some come from the original Z, some from the columns we added, but all are squares, so it’s a sum of nonnegative terms. Therefore, the terms contributed by the original vectors cannot sum to more than 1.

In Z^T Z = I, the squared elements of each column of Z are summed on the diagonal, giving 1. After extending Z with arbitrary orthogonal unit columns to a square, orthogonal Z′, we also have Z′Z′^T = I: the squared elements of each row of Z′ are summed on the diagonal, again giving 1.

So, we have a weighted sum where the total weight allowed is k, and the maximum weight per element is 1. The optimal solution is to give a maximum weight of 1 to each of the largest k elements—that is, the first k eigenvalues—and zero weight to everything else.
One way to achieve this is by setting {zr } to be the first k
one-hot vectors, which yield the first k eigenvectors when we
transform back to the standard basis.

Note that when we say “one way” in the last paragraph, we do not mean the only way. For instance, if we set k = 2, we get an optimum with z_1 = (1, 0, 0, ...), z_2 = (0, 1, 0, ...) (the PCA solution), but rotating the vectors by 45 degrees in their shared plane gives us z_1 = (√(1/2), √(1/2), 0, ...), z_2 = (√(1/2), -√(1/2), 0, ...). Filling these values into the sum, we see that they also result in a weight of 1 for D_{11} and a weight of 1 for D_{22}, which means that this is also a solution to the combined problem at k = 2.
More broadly, given the PCA solution, any other W whose columns span the same space as the span of the PCA solution is also a solution to the combined problem.
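The following sketch verifies this numerically for k = 2 (hypothetical data of our own): rotating the two leading eigenvectors by 45 degrees within their plane captures the same total variance.

import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((1000, 4)) @ rng.standard_normal((4, 4))
X -= X.mean(axis=0)
S = (X.T @ X) / len(X)

evals, P = np.linalg.eigh(S)
W_pca = P[:, [-1, -2]]                     # the two leading eigenvectors

c = np.sqrt(0.5)                           # rotate them by 45 degrees in their plane
R = np.array([[c, -c], [c, c]])
W_rot = W_pca @ R

def captured(W):                           # total variance captured by the columns of W
    return np.trace(W.T @ S @ W)

print(captured(W_pca), captured(W_rot))    # the same, up to floating point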
We have finally proved our Venn diagram correct, and we have
illustrated what the rest of the light blue circle is made of.

The Venn diagram: the solutions to the combined problem form the larger circle, with the solution to the iterative problem inside it.

2.5.1 Characterizing the PCA solution


Since the eigenvectors are the solution to the PCA problem, you may be forgiven for thinking of the eigenvectors themselves in terms of error minimization or variance maximization. In that case, we should guard against a misconception.

Look back at the ellipse we drew above. The first eigenvector was the major axis of the ellipse, the direction in which the data bulged out the most. However, the other eigenvector is its minor axis: the direction in which the data bulges the least. This makes it the direction in which the variance is minimized.
To study this a bit more formally, we take the first two proofs
of the previous section and turn them around. If we start with
the last eigenvector and work backward, we are choosing the
directions that minimize the variance (and hence maximize the
reconstruction error).

Last eigenvector. The direction in which the variance is minimized is the eigenvector of S with the smallest eigenvalue.

Proof. Follow the proof of the first eigenvector theorem until the sum

z^T D z = z_1^2 D_{11} + ... + z_m^2 D_{mm} .

A weighted sum is minimized when all the weight is given to the smallest term. Following the same logic as before, this leads to a one-hot vector ẑ = (0, ..., 0, 1) that selects the last column of P, which is the last eigenvector.

Did we make a mistake somewhere? We defined the principal components as directions for which variance is maximized. Then
we showed that all principal components are eigenvectors. Now
we learn that at least one eigenvector actually minimizes the vari-
ance. What gives?
The solution lies in the fact that the sum of all variances is fixed to the sum of the variances of the data, z_total. Imagine solving the combined problem for k = m - 1. The resulting variances along the columns of the solution W should be as high as possible. However, since all these columns are orthogonal, there is only one direction v left which is orthogonal to all of them. The variance along this direction, z_v, is whatever variance we haven’t captured in our solution:

z_1 + ... + z_{m-1} + z_v = z_total .

Since z_total is fixed, maximizing the first m - 1 terms is equivalent to minimizing the last.
We can define a kind of reverse iterative problem, where we define the last principal component as the direction that minimizes the variance, the last-but-one principal component as the direction orthogonal to the last principal component that minimizes the variance, the last-but-two principal component as the direction orthogonal to the last two that minimizes the variance, and so on.
We can show that optimizing for this problem gives us exactly
the same vectors as optimizing for the original iterative problem
which maximized the variance.

Reverse iteration. Under the reverse iterative problem, the last-but-r principal component chosen coincides with the k-th eigenvector of S, with k = m - r, and therefore with the k-th principal component.

The proof is the same as that of the PCs as eigenvectors theorem, except starting with the smallest eigenvalue instead of the largest and choosing ẑ to minimize at every step.
This shows us that it’s not quite right to think of the eigen-
vectors as maximizing or minimizing some quantity like variance
or reconstruction error (even though we’ve defined the principal
components that way). The eigenvectors of S simply form a very
natural orthonormal basis for the data, from which we can derive
natural solutions to optimization objectives in both directions.
There is one question that we haven’t answered yet. How do
we refine the combined problem so that it coincides with the iter-
ative problem? The one property that we use in our derivations
above that is not stated in the combined problem, is that in the
new basis, the data are decorrelated. If we add this requirement
to the optimization objective, we get:
argmax_W Σ_i ‖x_i^T W‖^2
such that W^T W = I
and (1/N) W^T X^T X W is diagonal.
For this problem, there is only one solution (up to negations of the principal components): the PCA solution. With this, we can finally let go of our iterative view of PCA, and embrace methods that compute all principal components in a single operation.
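As a sketch (again with hypothetical data), we can confirm numerically that the PCA solution satisfies both constraints, while a rotated optimum of the combined problem fails the diagonality constraint.

import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal((1000, 4)) @ rng.standard_normal((4, 4))
X -= X.mean(axis=0)
N = len(X)

evals, P = np.linalg.eigh((X.T @ X) / N)
W = P[:, [-1, -2]]                          # PCA solution for k = 2

c = np.sqrt(0.5)
W_rot = W @ np.array([[c, -c], [c, c]])     # same span, rotated 45 degrees

for M in (W, W_rot):
    print(np.allclose(M.T @ M, np.eye(2)))          # orthonormal columns: True for both
    C = M.T @ X.T @ X @ M / N
    print(np.allclose(C, np.diag(np.diag(C))))      # diagonal: True for W, False for W_rot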
We have come home from a long walk. Let’s settle by the
fireplace and talk about all the things we’ve seen.
We started with a broad idea of PCA as a method that iter-
atively minimizes the reconstruction error, while projecting into
a lower dimensional space. For some reason, we saw last time,
this works amazingly well and exposes many meaningful latent
dimensions in our data. In this chapter, we showed first that mini-
mizing reconstruction error is equivalent to maximizing variance.
We then looked at eigenvectors, and we showed that the eigen-
vectors of the data covariance S arise naturally when we imagine
that our data was originally decorrelated with unit variance in
all directions. To me, this provides some intuition for why PCA
works so well when it does. We can imagine that our data was
constructed by sampling independent latent variables z and then
mixing them up linearly. In our income dataset, in the first chap-
ter, there was one important latent variable: each person’s salary.
From this, we derived the monthly salary, and the majority of
their quarterly income. The other latent variable captured ran-
dom noise: whether people had some extra income, bonuses, etc.

We can imagine the same approach with the Olivetti faces. We get 4096 features, but under the hood, most of the work is done by a few latent dimensions which are largely independent of each other: the subject’s age, the direction of the light source, their apparent gender and so on. All of these can be chosen independently from each other, and are likely mostly decorrelated in the data. That is, if we don’t light all women from the left, or only choose old men and young women.

If these assumptions are violated, it may point to undesirable biases in our data, a very relevant topic at the moment. This shows that bias can be defined in terms of the assumed latent variables. Unfortunately, once the data is biased, it reduces our ability to extract the latent features, which makes it more difficult to counteract the bias.

Since the assumptions behind our transformation from decorrelated data to the observed data are mostly correct, finding this transformation and inverting it retrieves the latent dimensions. The greater the variance along a latent dimension, the more variance that particular “choice” added to the data. The choice of the subject’s age adds more variance than the lighting, and the lighting adds more variance than the gender.
The heart of the method is the spectral theorem. Without
the decomposition S = PDPT , none of this would work. Prov-
ing that such a decomposition always exists for a symmetric ma-
trix, and that every matrix for which the decomposition exists
is symmetric, is not very difficult, but it takes a lot more back-
ground than we had room for here: this includes matrix de-
terminants, the characteristic polynomial and complex numbers.
In the next chapter, we will go over all these subjects carefully,
building our intuition for them, and finish by thoroughly prov-
ing the spectral theorem.
Finally, you may wonder if any of these new insights help us
in computing the principal component analysis. The answer is
yes, the eigendecomposition S = PDPT can be computed effi-
ciently, and any linear algebra package allows you to do so. This
gives you the principal components P, and the rest is just ma-
trix multiplication.
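For instance, in numpy the whole recipe is a few lines (a sketch of the procedure, not code from the book):

import numpy as np

def pca_by_eigendecomposition(X, k):
    """Return the top-k principal components and the projected data."""
    Xc = X - X.mean(axis=0)                  # center the data
    S = (Xc.T @ Xc) / len(Xc)                # covariance matrix
    evals, P = np.linalg.eigh(S)             # S = P D P^T, eigenvalues ascending
    W = P[:, ::-1][:, :k]                    # eigenvectors by decreasing eigenvalue
    return W, Xc @ W                         # principal components, projections

rng = np.random.default_rng(5)
X = rng.standard_normal((200, 10))
W, Z = pca_by_eigendecomposition(X, k=2)
print(W.shape, Z.shape)                      # (10, 2) (200, 2)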
The eigendecomposition is certainly faster and more reliable
than the projected gradient descent we’ve used so far, but it can
still be a little numerically unstable. In practice, PCA is almost
always computed by singular value decomposition (SVD). The
SVD is such a massively useful method that it’s worth looking at
in more detail. It’s inspired very much by everything we’ve set
out above, but its practical applications reach far beyond just the
computation of principal components. We’ll develop the SVD in

Chapter 4 and provide some algorithms for computing both the eigendecomposition and the SVD in Chapter 5. But before all
that, we’ll return to the spectral theorem, and see exactly what
is required to prove it.
CHAPTER 3 · PROVING THE SPECTRAL THEOREM

When I started writing this, it was not meant to be a book. I was going for a short explanation of Principal Component Analysis
that was simple, but that also didn’t skip any steps. I was frus-
trated with other explanations that leave things out, or require
the reader to take things at face value.
This chapter illustrates why that so often happens. In this
chapter we will prove the spectral theorem, which we intro-
duced in the previous chapter. This is very much the dark heart
of PCA: the one result from which everything else follows, so it
pays to understand it properly. The drawback is that the proof of
the spectral theorem adds a boatload of preliminaries to the story.
Suddenly, just to understand this one statement, we need to
understand determinants, the characteristic polynomial, complex numbers, vectors and matrices, and the fundamental theorem of algebra. All interesting, of course, and worth knowing
about, but it’s a lot of baggage if you just want to know how PCA
works. So I decided to move it all into one self-contained chapter.

3.1 Restating the spectral theorem


In the last chapter, we learned the following.
An orthogonal matrix is a square matrix whose columns are mutually orthogonal unit vectors. Equivalently, an orthogonal matrix is a matrix P for which P^{-1} = P^T.
Any square matrix A is orthogonally diagonalizable if there exists an orthogonal matrix P and a diagonal matrix D such that A = PDP^T. A matrix A is symmetric if A = A^T.

The spectral theorem. A matrix is orthogonally diagonalizable if and only if it is symmetric.

We call this “the” spectral theorem in the context of this book.


In general, there are many spectral theorems about which oper-
ators can be diagonalized under which conditions.

Previously, we saw how much follows from this one simple the-
orem. If we take this to be true, we get eigenvectors, whitening
and principal components.
In the rest of this chapter we’ll build a toolkit step by step,
with which to analyze this problem. At the end, we’ll return to
the theorem and apply our tools to prove it.
The first is a very useful function of a matrix: the determinant.

3.2 Determinants
The determinant started life long before linear algebra. As early
as the 3rd century BCE, the function was used as a property
of a set of linear equations that would allow you to determine
whether the equations had a solution.
Later, determinants were studied as functions in their own
right. In this context, they were seen as very opaque and ab-
stract: something that was useful in higher mathematics, but
hard to explain to the lay person. It wasn’t until matrices became
popular, and in particular the view of a matrix as representing a
geometric transformation, that determinants finally acquired an
intuitive and simple explanation.
That explanation—apart from some subtleties which we’ll dis-
cuss later—is that for a square matrix A, the determinant expresses how much A inflates the space it transforms.
For example, here are three different ways that a matrix might
transform space to squish and stretch in different directions.
Three linear transformations, showing the effect on the Mona Lisa, and the unit square, before and after.

In the first, everything is stretched equally in all directions by a factor of 2. That means that a square with area 1 in the original
(a unit square) has area 4 after the transformation by A (since
both its sides are doubled). This is what we mean by inflating
space: the determinant of A is 4 because transforming something
by A increases its area by a factor of 4. In the second example,
we stretch by 1.1 in one direction, and shrink to 0.5 in the other. The result is that a unit square in the original ends up smaller after the transformation: the determinant of B is 0.5 × 1.1 = 0.55.

To see that objects other than squares are inflated by the same amount, just subdivide them into small squares. Each of the squares is inflated by the same amount, so the total is as well.

The third example is a little trickier. The Mona Lisa is again squished and stretched in different directions, but these are not
aligned with the axes. The area of the unit square seems to be
getting a little smaller, but how can we tell by how much exactly?
Before we dig into the technical details, let’s first look at why
it is worth doing so. Why is it so important to know by how much
a matrix inflates space? There are many answers, but in the con-
text of this series, the most important reason to care about the
determinant is that it gives us a very convenient characteriza-
tion of invertible matrices.

An invertible matrix is simply a matrix whose transformation is invertible. That is, after we apply the transformation y = Ax, we can always transform y back to x, and end up where we started.
When is a matrix not invertible? When multiple inputs x are mapped to a single output y. In linear transformations, this happens when the input is squished so much in one direction that the resulting space has a lower dimensionality than the original.

Three transformations with increasingly small determinant.


In the third, the unit square is squeezed into a line. Note that
the two edges on the bottom left of the square are mapped to
the same part of the line, so the transformation is not invert-
ible.

We don’t need to know how to compute the determinant to know what its value is in this case. The unit square is mapped to a line
segment, so its area goes from 1 to 0. This is how the determinant
helps us to characterize invertible matrices: if the determinant is non-zero, the matrix is invertible; if the determinant is zero, the matrix is not invertible, or singular.
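A quick numpy illustration of this characterization (a sketch with arbitrary matrices): a matrix whose second column is a multiple of the first flattens the plane onto a line, and its determinant is zero.

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 0.5]])       # squishes and stretches, but keeps 2 dimensions
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # second column is twice the first: flattens to a line

print(np.linalg.det(A))          # 1.0, non-zero: invertible
print(np.linalg.det(B))          # 0.0 (up to rounding): singular, no inverse exists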

3.2.1 Computing the 2 × 2 determinant


Using this definition, it’s pretty straightforward to work out what
the formula is for the determinant of a matrix A that transforms
a 2D space. We’ll start by drawing a unit square, and labeling
the four corners:

The unit square, with its corners labeled by the column vectors (0,0), (0,1), (1,0) and (1,1).
The corner points (0,0), (0,1), (1,0) and (1,1) can be transformed by multiplying them by A. We know that under a linear operation like this, line segments stay line segments, so the
ear operation like this, line segments stay line segments, so the
four edges of the square are transformed to line segments, and
the resulting figure between the four points must be a quadrilat-
eral. We also know that parallelism is preserved: two line seg-
ments that were parallel before the transformation are parallel
after. Lastly, we know that the origin stays where it is, unless we
apply a translation, so corner (0, 0) is not affected by the transfor-
mation. All this means that the picture after the transformation
will look something like this.

A parallelogram with one corner touching the origin. The determinant tells us the ratio between the area of the parallelogram and the original square. Since the original square has area 1, the area of the parallelogram is the determinant of A.
Working this out requires only a small amount of basic geom-
etry. Here’s the simplest way to do it.
We first name the four elements of our matrix:
A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}
We can now name the four corners of our parallelogram in terms
of these four scalars, by multiplying the corner points of the
unit square by A:
\begin{pmatrix} 0 \\ 0 \end{pmatrix} \mapsto \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad \begin{pmatrix} 1 \\ 0 \end{pmatrix} \mapsto \begin{pmatrix} a \\ c \end{pmatrix}, \quad \begin{pmatrix} 0 \\ 1 \end{pmatrix} \mapsto \begin{pmatrix} b \\ d \end{pmatrix}, \quad \begin{pmatrix} 1 \\ 1 \end{pmatrix} \mapsto \begin{pmatrix} a+b \\ c+d \end{pmatrix}

This gives us the following picture.
The parallelogram with corners (0,0), (a,c), (b,d) and (a+b, c+d), drawn inside a rectangle of area ad, with a box of area bc at the top right and matching pairs of triangles marked.
Here we can see the area of the parallelogram clearly: there is one large rectangle of area ad. To get from this area to the area of our parallelogram, we should subtract the area of the green triangle at the bottom, which is part of the rectangle but not the parallelogram.
But then, there’s a green triangle at the top, of the same size, which is (mostly) part of the parallelogram but not of the

rectangle, so these cancel each other out. We follow the same logic for the red triangles.
Putting all this together, the rectangle with area ad has the same area as the parallelogram, except that we are overcounting three elements (outlined in blue): the two small triangles in the box at the top, which are not part of the parallelogram, and the overlap between the green and the red triangles, which we’ve counted twice. These three overcounted elements add up precisely to the box at the top-right, which has area bc.
So, the area of the parallelogram, and therefore the determinant of the matrix A, is ad - bc. Or, in words: the determinant of a 2 × 2 matrix is the diagonal product minus the antidiagonal product.
We will write the determinant of a matrix A with two vertical
bars around the matrix. When we write out the values of the ma-
trix explicitly, we will remove the matrix parentheses for clarity:

|A| = \begin{vmatrix} a & b \\ c & d \end{vmatrix} = ad - bc .
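In code, the 2 × 2 formula is a one-liner; the sketch below compares it against numpy’s general-purpose determinant.

import numpy as np

def det2(A):
    """Determinant of a 2x2 matrix: diagonal product minus antidiagonal product."""
    return A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]

A = np.array([[2.0, 3.0],
              [1.0, 4.0]])
print(det2(A), np.linalg.det(A))   # both 5.0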

3.2.2 Negative determinants


In the picture we drew to derive this, ad was bigger than bc, so
the determinant was positive. But this is not guaranteed. Look
at the two column vectors of our matrix. If we flip them around
then the determinant becomes:

\begin{vmatrix} b & a \\ d & c \end{vmatrix} = bc - da .

Assuming the values in the matrix are the same, this is the same
quantity as before, but negative. Areas can’t be negative, so how
do we interpret this?
The magnitude remains the same, so the simplest solution
is just to adjust our definitions: the absolute value of |A| is the
amount by which A inflates space.
However, in many situations, the idea of a “negative area”
actually makes a lot of sense. Consider, for instance, this graph
of the velocity of a train along a straight track from one station
to another and back again:
A graph of a train’s velocity over time: positive while traveling from station A to station B, and negative on the way back to station A.
Here, we’ve used a negative velocity to represent the train traveling backwards. If you’ve done some physics, then you’ll know
that the area under the speed curve represents distance traveled.
Here we have two options: we can look at the absolute value
of the area, and see that the train has, in total, traveled twice
the distance between the stations. We can also take areas be-
low the horizontal axis to be negative. Then, their sum tells us
that the total distance between the train’s starting point and its
final position is exactly zero.
All this is just to say, if you need positive areas, just take the
magnitude of the determinant, but don’t be too quick to throw
away the sign. It may have some important meaning in your par-
ticular setting. For our purposes, we’ll need these kinds of areas
when we want to think about determinants for larger matrices.
We’ll call this kind of positive or negative area a signed area, or
signed volume in higher dimensions. You can think of the paral-
lelogram as a piece of paper. If A stretches the paper, but doesn’t
flip it around, the signed area is positive. If the paper is flipped
around, so that we see the reverse, the area is negative. If you
flip the paper around twice, the sign becomes positive again.

3.2.3 Towards n × n determinants


Let’s think about what we’ll need to generalize this idea to 3 × 3 matrices and beyond, to general n × n matrices. The basic intuition generalizes: we can start with a unit (hyper)cube in n dimensions. A square matrix transforms this into an analogue of a parallelogram, called a parallelotope.
For a given dimensionality we can define a notion of n-volume.
The 3-volume is simply the volume we already know. The n-
volume of an n-dimensional “brick”, the analogue of a rectangle,
is the product of its extent in each direction: height times width
times length and so on in all directions. This means that the unit hypercube, which has sides of length 1 in all directions, always has n-volume 1.
We will assume, by analogy with the 2 × 2 case, that the determinant of an n × n matrix A is the n-volume of the parallelotope that results when we transform the unit hypercube by A.

Note that there are 3 × 3 matrices that will flatten the unit cube into a parallelogram. In this case, we are not interested in the area of the parallelogram as we were before. The matrix is 3 × 3, so we care about the resulting volume, which in such cases would be 0.
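Here is one such matrix (an example of our own): its third row is all zeros, so every point lands in the plane where the third coordinate is 0, the image of the unit cube is flat, and the determinant is 0:

$$\begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 0 \end{pmatrix}\,.$$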

We can generalize a few useful properties from the 2 × 2 case. The columns of A are those vectors that describe the edges that touch the origin. We’ll call these the basis vectors of the parallelogram/-tope.

The edges of the unit cube that touch the origin are the stan-
dard basis vectors. These are mapped to the edges of the par-
allelotope that touch the origin. These are the column vectors
of A. We call these the basis vectors of the parallelotope.
The proof we gave for the 2 × 2 determinant is very neat, but it isn’t very easy to generalize to the n × n case in an intuitive way.
Instead, let’s re-prove our result for the 2 × 2 case in a way that’s easier to generalize. We’ll need to convince ourselves of three properties of the area of a parallelogram.

These are not difficult to prove, but we’ll focus here on the geo-
metric intuition. If you want a more rigorous proof, it’s easier to
let this intuition go, and work purely symbolically.

The first property we need is that if we move one of the sides of the parallelogram without changing its direction, the area of the parallelogram remains the same. This is easy to see visually.

A skew transformation preserves the area of a parallelogram.


This means we can align one of the edges with the axes with-
out changing the area.

Note that shifting one of the sides always adds a triangle to the parallelogram and takes away a triangle of the same size, so the total stays the same. The last example is particularly relevant: we can shift the parallelogram so that one of its edges is aligned with one of the axes. If we do this twice, we’ll have a rectangle with an area equal to that of the original parallelogram.
What does this look like in the original matrix? Remember
that the columns of the matrix are the two edges of the par-
allelogram that touch the origin. Shifting one of them in this
way is equivalent to adding or subtracting a small multiple of
the other to it.
[Figure: skewing the parallelogram by adding r times the column (a, c) to the column (b, d), taking the matrix]

$$\begin{pmatrix} a & b \\ c & d \end{pmatrix} \quad\text{to}\quad \begin{pmatrix} a & b + ra \\ c & d + rc \end{pmatrix}\,.$$
To axis-align a parallelogram defined by a matrix, we take one of the columns, and add or subtract some multiple of the other column.

That is, if we take one of the columns of A, multiply it by any non-zero scalar, and add it to another column, the area of the resulting parallelogram is unchanged. If we name the column vectors v and w, and write | v, w | for the determinant of the matrix with these column vectors, then we have

| v, w | = | v + rw, w |   for any nonzero r.
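As a quick check with numbers of our own choosing: take v = (2, 1), w = (1, 3) and r = 1. Then

$$| v, w | = \begin{vmatrix} 2 & 1 \\ 1 & 3 \end{vmatrix} = 5 \qquad\text{and}\qquad | v + w, w | = \begin{vmatrix} 3 & 1 \\ 4 & 3 \end{vmatrix} = 9 - 4 = 5\,.$$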


This kind of transformation is called a skew or a shear, so we’ll
call this property skew invariance: the area of a parallelogram
is skew invariant.
The second property we need is that if we take one of the column vectors of the matrix, say v, and write it as the sum of two other vectors v = v1 + v2, then the area of the parallelogram made by basis vectors v, w is the sum of the areas of the two smaller parallelograms with basis vectors v1, w and v2, w respectively.
This is easy enough to see if v1 and v2 point in the same direction. Then the two smaller parallelograms together simply combine to form the larger one.
If they don’t point in the same direction, we can skew them until they do. Since we’ve already shown that the area is skew invariant, none of this changes the area of the parallelogram.
[Figure: three panels showing the parallelograms on v1, w and v2, w combining into the parallelogram on v, w, where v = v1 + v2.]
If we break one of the basis vectors into the sum of two other
vectors, the original area is the sum of the two parts. This is
easy to see if the sub-vectors point in the same direction as the
original. If they don’t, we simply skew them until they do.

Symbolically, this means that if we break one of the vectors in a matrix into a sum of two other vectors, then the determinant distributes over that sum:

| v1 + v2, w | = | v1, w | + | v2, w | .
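Continuing our small numeric example, split v = (2, 1) as v1 = (2, 0) plus v2 = (0, 1):

$$| v_1, w | = \begin{vmatrix} 2 & 1 \\ 0 & 3 \end{vmatrix} = 6, \qquad | v_2, w | = \begin{vmatrix} 0 & 1 \\ 1 & 3 \end{vmatrix} = -1,$$

and indeed 6 + (-1) = 5 = | v, w |. Note that one of the two parts can contribute a negative, signed area.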
We won’t need it here, but we can also show that multiplying one of the vectors by some scalar scales the determinant by the same value. These two properties together are called multilinearity: the area of a parallelogram is a multilinear function of the basis vectors. It’s a linear function of one of its arguments if we keep the others fixed.
We need one more property: if we start with a parallelogram with basis vectors v, w, and we flip around the vectors, w, v, what happens to the area? If we look at the picture of the parallelogram, at first, it’s difficult to see that anything changes at all. The two basis vectors are still the same. To see what happens, we need to look at the operation of the matrix with these vectors as its columns.
The matrix with columns v, w maps the horizontal unit vector eh to v, and the vertical unit vector ev to w. For the matrix with the columns swapped, we reverse this mapping. We get the same parallelogram, but it’s as if we’ve turned it over. Since the unit square has positive area, the flipped-over parallelogram has negative area.

[Figure: the unit square with sides eh and ev mapped to the parallelogram on v, w, and to the flipped-over parallelogram on w, v, labeled | v, w | and | w, v |.]

Swapping the columns of a matrix swaps the basis vectors of the parallelogram, turning a positive area into a negative area.

Put simply, swapping basis vectors around maintains the magnitude of the area of a parallelogram, but changes the sign:

| v, w | = -| w, v | .
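In our numeric example, | v, w | = 5, and swapping the columns gives

$$| w, v | = \begin{vmatrix} 1 & 2 \\ 3 & 1 \end{vmatrix} = 1 - 6 = -5\,.$$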

We call this property alternativity. As in, the area of a parallelogram is an alternating function of its basis vectors.
With these three properties: skew invariance, multilinearity and alternativity, we can work out our new proof of the determinant formula. First, using multilinearity, we write the first column of matrix A as the sum of two vectors:

$$\begin{vmatrix} a & b \\ c & d \end{vmatrix} = \begin{vmatrix} a & b \\ 0 & d \end{vmatrix} + \begin{vmatrix} 0 & b \\ c & d \end{vmatrix}\,.$$

We’ll call this kind of vector, where only one element is non-zero,
a simple vector. Here’s a visualization of that step.
[Figure: the parallelogram of A split into the parallelogram on (a, 0), (b, d) and the parallelogram on (0, c), (b, d).]

Breaking the determinant up into two terms. Note that in the second image, the smaller parallelogram has negative area, because the two basis vectors are reversed.

Next, we use the property of skew invariance to subtract multiples of these new columns from the others. We use whatever multiple is required to make the rest of the row 0. The other rows are unaffected, since these all have 0’s in the original column.

$$\begin{vmatrix} a & b \\ 0 & d \end{vmatrix} + \begin{vmatrix} 0 & b \\ c & d \end{vmatrix} = \begin{vmatrix} a & 0 \\ 0 & d \end{vmatrix} + \begin{vmatrix} 0 & b \\ c & 0 \end{vmatrix}\,.$$

Visually, that looks like this.

[Figure: the two parallelograms skewed into axis-aligned rectangles, with basis vectors (a, 0), (0, d) and (0, c), (b, 0) respectively.]

Axis-aligning the remaining basis vectors.


We have two parallelograms with one edge axis-aligned, and we simply skew them so that the other edge is axis-aligned as well. The area we’re looking for is now the sum of two rectangles, one of which is facing away from us.
For the first term, we can work out the determinant easily. A diagonal matrix transforms the unit square to a rectangle, so we just multiply the values along the diagonal: ad. For the second term, we have an anti-diagonal matrix. We can turn this into a diagonal matrix by swapping the two columns. By the property of alternativity, this changes the sign of the area, so the resulting signed area is -bc:

$$\begin{vmatrix} a & 0 \\ 0 & d \end{vmatrix} + \begin{vmatrix} 0 & b \\ c & 0 \end{vmatrix} = \begin{vmatrix} a & 0 \\ 0 & d \end{vmatrix} - \begin{vmatrix} b & 0 \\ 0 & c \end{vmatrix} = ad - bc\,.$$

This was certainly a more involved way of deriving the formula for the area of a parallelogram, but the benefit here is that this method generalizes to higher dimensions.

3.2.4 Determinants for n × n matrices


We’ll start with the three properties we used above, and see how
they generalize to higher dimensions.
Skew invariance also holds in higher dimensions. If we have an n × n matrix A with n column vectors a1, ..., an, adding a multiple of one to another does not change the volume of the resulting parallelotope. For instance:

| u, v, w | = | u + rv, v, w | .
Multilinearity also carries over in the same way. We can break one of the column vectors up into a linear combination of two (or more) other vectors and the area of the resulting parallelotope breaks up in the same way. For instance:

| u1 + u2, v, w | = | u1, v, w | + | u2, v, w | .


Finally, alternativity. This requires a little more care. You’d be forgiven for thinking that since we have a higher-dimensional space, we now have more ways for our parallelotope to orient as well. We thought of our parallelogram as a piece of paper which could lie on the table in two ways. If we hold a parallelotope up in space, we can rotate it in all sorts of directions.
But the metaphor of a piece of paper is slightly misleading. When we flip a piece of paper upside-down, we rotate it, but that’s not what really happens when we swap the basis vectors of the parallelogram. What really happens is that we turn the piece of paper inside-out: we flip it by pulling the right edge to the left and the left edge to the right.
A better metaphor is a mirror: imagine standing in front of a
mirror and holding up your right hand, palm forward.
It looks like your twin inside the mirror is holding up their
left hand, with the thumb facing in the opposite direction. But
if the mirror flips the image left-to-right, why doesn’t it flip the
image upside-down as well? Why doesn’t anything change if we turn the mirror ninety degrees? How does the mirror keep track
of which direction the floor is?
The answer is that the mirror doesn’t flip the image left-to-right.
It flips it back-to-front. In a manner of speaking, it pulls the back
of your hand forward and the front of your hand backwards until
the whole hand is flipped.
Putting the mirror to your side will flip you left-to-right, also
turning your right hand into a left hand. Putting the mirror on
the floor and standing on it turns you upside-down, and again
turns the right hand into a left hand. If you use two mirrors, one
in front of you and one below, you are flipped back-to-front and
upside-down, and if you look at this mirror-twin, you’ll see that
their hand has been flipped twice, to become a right hand again.
[Figure: A mirror in front of you seems to flip you left-to-right, but it actually flips you back-to-front. This also turns your right hand into a left hand. A second mirror on the floor flips you upside-down, turning your hand back into a right hand again.]

The result is that we still have only two orientations. Each
time anything gets mirrored, it gets pulled inside out along some
line, and the sign of the volume changes: your right hand turns
into a left hand and vice versa. If it gets flipped an even number
of times the sign stays the same, and if it gets flipped an odd
number of times, the sign changes.
This means that alternativity in higher dimensions is defined
as follows: if we swap around any two columns in a matrix, we
flip the space (along a diagonal between the two corresponding
axes). Therefore, the magnitude of the determinant stays the
same, but the sign changes. If we flip two more axes, the sign
changes back. For instance:

| u, v, w | = -| v, u, w | = | w, u, v | .
That’s our three properties in place. Finally, before we start our derivation, note that in the 2 × 2 case, our ultimate aim was to work the matrix determinant into a sum of determinants of diagonal matrices. The idea was that the determinant of a diagonal matrix is easy to work out.
That’s still true in higher dimensions: the columns of a diagonal matrix A each map one of the unit vectors to a basis vector of length A_ii that lies along the i-th axis. Together these form the sides of an n-dimensional “brick” whose volume is just these lengths multiplied together. So the plan stays the same: use our three properties to rewrite the determinant into a sum of determinants of diagonal matrices.
We’ll work this out for a 3 × 3 matrix explicitly as an example, but the principle holds for any number of dimensions.
We start by taking the first column of our matrix, and breaking it up into three simple vectors (using multilinearity).

$$\begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix} = \begin{vmatrix} a & b & c \\ 0 & e & f \\ 0 & h & i \end{vmatrix} + \begin{vmatrix} 0 & b & c \\ d & e & f \\ 0 & h & i \end{vmatrix} + \begin{vmatrix} 0 & b & c \\ 0 & e & f \\ g & h & i \end{vmatrix}$$

As before, we’ve broken our parallelotope into three parallelotopes, each with one of their edges axis-aligned. Next, we subtract multiples of the first column from each of the other columns to turn the rows into vectors with only one non-zero element.

$$\begin{vmatrix} a & 0 & 0 \\ 0 & e & f \\ 0 & h & i \end{vmatrix} + \begin{vmatrix} 0 & b & c \\ d & 0 & 0 \\ 0 & h & i \end{vmatrix} + \begin{vmatrix} 0 & b & c \\ 0 & e & f \\ g & 0 & 0 \end{vmatrix}$$
This doesn’t yet look quite as simple as it did in our 2 × 2 case, but we can go back to the first step and break the second column vector of each term into simple vectors as well. For the first term that looks like this:

$$\begin{vmatrix} a & 0 & 0 \\ 0 & e & f \\ 0 & h & i \end{vmatrix} = \begin{vmatrix} a & 0 & 0 \\ 0 & 0 & f \\ 0 & 0 & i \end{vmatrix} + \begin{vmatrix} a & 0 & 0 \\ 0 & e & f \\ 0 & 0 & i \end{vmatrix} + \begin{vmatrix} a & 0 & 0 \\ 0 & 0 & f \\ 0 & h & i \end{vmatrix}$$
In the first term, one of the column vectors is zero. This means the parallelotope becomes a parallelogram, with 0 volume, so we can remove this term. For the other two, we apply skew invariance to sweep the rows, and we get

$$\begin{vmatrix} a & 0 & 0 \\ 0 & e & f \\ 0 & h & i \end{vmatrix} = \begin{vmatrix} a & 0 & 0 \\ 0 & e & 0 \\ 0 & 0 & i \end{vmatrix} + \begin{vmatrix} a & 0 & 0 \\ 0 & 0 & f \\ 0 & h & 0 \end{vmatrix}$$
The logic is the same for the green and blue terms. If we ignore the zeros that we added, we end up with a 2 × 2 submatrix, to which we can apply the same trick again to turn each term into two more terms. If we do this we get six terms in total.

$$\begin{vmatrix} a & 0 & 0 \\ 0 & e & 0 \\ 0 & 0 & i \end{vmatrix} + \begin{vmatrix} a & 0 & 0 \\ 0 & 0 & f \\ 0 & h & 0 \end{vmatrix} + \begin{vmatrix} 0 & b & 0 \\ d & 0 & 0 \\ 0 & 0 & i \end{vmatrix} + \begin{vmatrix} 0 & 0 & c \\ d & 0 & 0 \\ 0 & h & 0 \end{vmatrix} + \begin{vmatrix} 0 & b & 0 \\ 0 & 0 & f \\ g & 0 & 0 \end{vmatrix} + \begin{vmatrix} 0 & 0 & c \\ 0 & e & 0 \\ g & 0 & 0 \end{vmatrix}$$

The result is that we have separated the determinant into several terms, so that for each, the matrix in that term has only one non-zero element in each row and each column. The structure of the non-zero values is that of a permutation matrix, with the exception that permutation matrices contain only 1s and 0s. These are like the original matrix A with a permutation matrix used to mask out certain values.
In fact, what we have done is to enumerate all possible permutations. We started by creating one term for each possible choice for the first column, and then for each term we separated this into the remaining two choices for the second column (after which the choice for the third column was fixed as well).
This idea naturally generalizes to higher dimensions. Moving
from left to right, we pick one element of each column and zero
out the rest of its column and row. At each subsequent step we
limit ourselves to whatever non-zero elements remain. We sum
the determinants of all possible ways of doing this.
We can now turn each of these matrices into a diagonal matrix, by swapping around a number of columns. By the property of alternativity, this doesn’t change the magnitude of the determinant, only the sign: for an even number of swaps, it stays the same and for an odd number it flips around. This gives us

$$\begin{vmatrix} a & 0 & 0 \\ 0 & e & 0 \\ 0 & 0 & i \end{vmatrix} - \begin{vmatrix} a & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & h \end{vmatrix} - \begin{vmatrix} b & 0 & 0 \\ 0 & d & 0 \\ 0 & 0 & i \end{vmatrix} + \begin{vmatrix} c & 0 & 0 \\ 0 & d & 0 \\ 0 & 0 & h \end{vmatrix} + \begin{vmatrix} b & 0 & 0 \\ 0 & f & 0 \\ 0 & 0 & g \end{vmatrix} - \begin{vmatrix} c & 0 & 0 \\ 0 & e & 0 \\ 0 & 0 & g \end{vmatrix}\,.$$

And since the determinant of a diagonal matrix is simply the diagonal product, as we worked out earlier, we get

$$\begin{vmatrix} a & b & c \\ d & e & f \\ g & h & i \end{vmatrix} = aei - afh - bdi + cdh + bfg - ceg\,.$$
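As a sanity check, with numbers of our own choosing:

$$\begin{vmatrix} 2 & 0 & 1 \\ 1 & 3 & 0 \\ 0 & 1 & 1 \end{vmatrix} = 2\cdot 3\cdot 1 - 2\cdot 0\cdot 1 - 0\cdot 1\cdot 1 + 1\cdot 1\cdot 1 + 0\cdot 0\cdot 0 - 1\cdot 3\cdot 0 = 7\,.$$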

To generalize this to n dimensions, we represent a permutation of the first n natural numbers with the symbol σ. For instance, if n = 5 we may have σ = ⟨1, 5, 4, 3, 2⟩. We’ll call the sign of a permutation, sign(σ), -1 if the permutation can be placed in the correct order with an odd number of swaps and 1 if this can be done with an even number of swaps.
Then, the process we described above leads to the following formula for the determinant:

$$|A| = \sum_{\sigma} \operatorname{sign}(\sigma) \prod_{i} A_{\sigma(i),\,i}$$

where the sum is over all permutations of the first n natural num-
bers, and the product runs from 1 to n.
Note that for each term in our sum, corresponding to the permutation σ, the value A_{σ(i),i} marks out one of the elements that we haven’t zeroed out. The value ∏_i A_{σ(i),i} is the product of all these values.
This is called the Leibniz formulation of the determinant.
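To make this concrete, here is a small computational sketch, our own illustration rather than anything from the text: it computes a determinant directly from the Leibniz formulation. Since it sums over all n! permutations it is hopelessly slow for anything but tiny matrices, but it follows the definition to the letter.

    from itertools import permutations

    def sign(perm):
        # Count the swaps needed to sort the permutation;
        # an even count gives +1, an odd count gives -1.
        perm = list(perm)
        swaps = 0
        for i in range(len(perm)):
            while perm[i] != i:
                j = perm[i]
                perm[i], perm[j] = perm[j], perm[i]
                swaps += 1
        return -1 if swaps % 2 else 1

    def leibniz_det(A):
        # |A| = sum over permutations s of sign(s) * product_i A[s(i)][i]
        n = len(A)
        total = 0
        for s in permutations(range(n)):
            term = sign(s)
            for i in range(n):
                term *= A[s[i]][i]
            total += term
        return total

    # The 3-by-3 matrix from the sanity check above:
    print(leibniz_det([[2, 0, 1], [1, 3, 0], [0, 1, 1]]))  # prints 7

Note that A[s[i]][i] picks out the element in row s(i) of column i: exactly the element that our derivation left un-zeroed in the term for the permutation s.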

We got here by defining the determinant as the volume of a parallelotope and then deriving the Leibniz formulation from that.
This is useful for building a visual intuition, but if you want to be rigorous it’s not the most efficient approach.
For this reason, the determinant is usually first defined as any multilinear, alternating function which yields 1 for the identity matrix. You can then show that there is only one such function, and that it’s the Leibniz function above. From there, you can then prove all the other interpretations of the determinant, including the geometrical one.

The determinant is a powerful tool, with many uses. Here, we only care about one of them: it lets us characterize whether a matrix is invertible or not. The determinant is precisely zero if and only if the matrix is not invertible. The reason we care about this is that it will lead to a very useful way of characterizing the eigenvalues: the characteristic polynomial of a matrix.

3.3 The characteristic polynomial


Let’s start with how we originally characterized eigenvalues in Chapter 2. There, we said that an eigenvector of a matrix A is any vector v whose direction doesn’t change under operation of the matrix. The magnitude can change, and the increase of that magnitude we call the eigenvalue corresponding to the eigenvector.
To summarize, for any eigenvector v and its eigenvalue λ of A, we have

$$Av = \lambda v\,.$$

Moving the right-hand side over to the left, we get

$$Av - \lambda v = 0\,,$$

where both sides are vectors, with a vector of zeros on the right. To allow us to manipulate the left-hand side further we rewrite λv as (λI)v. This gives us:

$$Av - \lambda I v = 0$$
$$(A - \lambda I)v = 0\,.$$

This last line is basically a linear problem of the form Mv = 0, with M = (A - λI). The solutions to this problem, the set of all vectors v that we can multiply by M to get the null vector 0, are called the null space of M. What we have just shown is that any eigenvector of A with eigenvalue λ must be in the null space of the matrix A - λI.
This is where invertibility and the determinant come in: if M
is invertible, it can only map one point to any other point. Every
matrix maps 0 to 0, so for any invertible matrix the null space
consists only of the point 0. The only matrices with more in-
teresting null spaces are non-invertible matrices. Or, matrices
with determinant 0.
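For example (a matrix of our own choosing), the matrix below has determinant 1·4 - 2·2 = 0, and indeed it maps every multiple of (2, -1) to 0, so its null space is a whole line rather than just the point 0:

$$\begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} \begin{pmatrix} 2 \\ -1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}\,.$$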

Note that we’re not talking about the invertibility of A itself, only of the derived matrix M = A - λI.

So, now we can tie it all together. Choose some scalar value λ. If we have

$$| A - \lambda I | = 0$$

then the matrix A - λI has a non-trivial null space, and λ is an eigenvalue. We want to study the left-hand side of this equation as a function of λ, taking the values in A as constants.
As we’ve seen, expanding the determinant into an explicit form can get a little hairy for dimensions larger than 3, but we don’t need to make it explicit, so long as we can tell what kind of function it is. To illustrate, say we have a 2 × 2 matrix

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}.$$

In that case, the value | A - λI | works out as

$$(a - \lambda)(d - \lambda) - bc\,.$$
We can multiply out these brackets, and we would see that this is a polynomial with λ as its variable. This polynomial has λ² as the highest power. The values for which this polynomial equals zero, its roots, are the eigenvalues of the original matrix A.
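Multiplying out, just to make the shape of the function explicit:

$$(a - \lambda)(d - \lambda) - bc = \lambda^2 - (a + d)\lambda + (ad - bc)\,,$$

a quadratic in λ. Its constant term is the determinant of A, and the coefficient of λ is minus the sum of the diagonal, so for a 2 × 2 matrix the two eigenvalues follow directly from the quadratic formula.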
For an n × n matrix, as we saw, the Leibniz form of the determinant gives us one term for each possible permutation of length n, each of which consists of n elements of the matrix multiplied together. If any of these elements come from the diagonal, they contain λ. That means each term contains at most n λs, giving us an n-th order polynomial.
As you may have guessed, this function is what we call the
characteristic polynomial of A. The points where this function
is 0, the roots of the polynomial, are the eigenvalues of A.
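As a small numeric illustration of our own (assuming numpy is available), we can build the characteristic polynomial of a 2 × 2 matrix by hand and check that its roots agree with a standard eigenvalue routine:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    (a, b), (c, d) = A

    # Characteristic polynomial of a 2x2 matrix:
    # lambda^2 - (a + d) lambda + (ad - bc)
    coeffs = [1.0, -(a + d), a * d - b * c]

    print(np.roots(coeffs))      # roots of the characteristic polynomial
    print(np.linalg.eigvals(A))  # eigenvalues computed directly
    # Both print, up to ordering, approximately 3.618 and 1.382.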
And this means that we can apply a whole new set of tools from the analysis of polynomials to the study of eigenvectors. We never have to work out the characteristic polynomial explicitly; we can just use the knowledge that the determinant is a polynomial, and use what we know about polynomials to help us further along towards the spectral theorem.
And one of the richest and most versatile tools to come out
of the analysis of polynomials, is the idea of complex numbers.
This is a counterintuitive idea, so we’ll take some time to set it
up carefully, before we dig into the mathematics.

3.4 Complex numbers


Complex numbers spring from the idea that there exists a number i for which i² is -1. We don’t know of any such number, but we simply assume that it exists, and investigate the consequences.
For many people this is the point where mathematics becomes
too abstract and they tune out. The idea that squares can be
negative clashes too much with our intuition for what squares
are. The idea that we just pretend that they can be negative and
keep going, seems almost perverse.
And yet, this approach is one that humanity has followed
again and again in the study of numbers. If you step back a
bit, you start to see that it is actually one of the most logical and
uncontroversial things to do.
The study of numbers started somewhere before recorded his-
tory, in or before the late stone age, when early humans began
counting things in earnest, and they learned to add. I have five
apples, I steal three apples from you, now I have eight apples.
That sort of thing.

At some point, these early humans will have solidified their con-
cept of “numbers.” It is a set of concepts (whose meaning we
understand intuitively) which starts 1, 2, 3, . . . and continues. If
you add one number to another, you always get another number.
If the number was big, they may not have had a name for it, but a
patient paleolithic human with enough time could certainly have
carved the required number of tally marks into an animal bone.
The operation of addition can also be reversed. If 5 + 3 gives
8, then taking 5 away from 8 gives 3. If I steal 5 apples from
your collection of 8, you still have 3 left. Thus, subtraction was
born. But subtraction, the inverse of addition, required some care.
Where adding two numbers always yields a new number, sub-
tracting two numbers doesn’t always yield a new number. You
can’t have 5 - 8 apples, because if you have 5 apples I can’t
steal more than 5 of them.
As societies grew more complicated, financial systems devel-
oped and debt became an integral part of daily life. At some
point, the following thought experiment was considered. What
if 5 - 8 is a number after all? Maybe it’s just a number we don’t
have a name for yet.
So, we’ll just give it a name and see if we can make some
sense of how it behaves. No doubt many people were outraged
by such a suggestion, protesting that it was unnatural, and an
insult to whatever God they believed had designed the numbers.
But simple investigation soon showed that if these numbers were
assumed to exist, they followed simple rules and, it made sense to
think of them as a kind of mirror image of the natural numbers,
extending to infinity in the opposite direction. 5 - 8 was the
mirror image of 3, so it made sense to call it “-3”.
The skeptics might argue that this made no sense, because
there is no such thing as having -3 apples, but the mathemati-
cians will have countered that in other areas, such as finance,
there were concepts that could be expressed very beautifully by
the negative numbers. If I owe you 3 apples, because my earlier
theft was found out, but you also stole 8 apples from me, I now
owe you -5 apples, or rather, you owe me 5.
The same principle can be applied to multiplication. If your
tribe has 8 families, and every family is entitled to 5 apples, you
need to find 8 × 5 apples. Again, an operator, and any two numbers you care to multiply will give you a new number (even if you believe in negative numbers).
And again, you can reverse the operation: if the harvest
has yielded 48 apples, you can work out that every family in your
tribe gets 6 of them. But again, you have to be careful about
which numbers you apply the inverse to. Sometimes you get a
known number, and sometimes you don’t. If you have 50 apples,
suddenly there is no known number that is the result of 50/8.
But what if there was? What if we just gave 50/8 a name
and started investigating? We’d find out pretty quickly that it
would make sense to think of these numbers as lying in between
the integers. We call these the rational numbers. Whoever it was
that invented the rationals must have run into less resistance than
the inventor of the negative numbers; it’s much easier to imagine
half an apple than to imagine -3 of them.
The pattern is hopefully becoming clear. Let’s have one more
example, to really drive the point home, and also to bring us far
enough into recorded history so we can actually see how people
dealt with these revelations. Adding is repeated counting, and
multiplication is repeated adding, so raising to a power, repeated
multiplication, is the next operator in the hierarchy.
The story should be familiar at this point. Any two natural
numbers a and b can be "exponentiated" together as aᵇ and the
result is another natural number.
The inverse operation is a b-th root, but we can stick with square roots to illustrate our point. In fact the square root of 2, the length of the diagonal of a unit square, is all we need. In this case, there is nothing abstract or perverse about the quantity √2: in a square room with sides of 1 meter it's the distance from one corner to the corner opposite.

<latexit sha1_base64="cONAky5IWOfAHNZRZWeGyDaAoRo=">AAAKJHicfVZNb9tGEGXStI3cpk3aYy9EhAJFIRikGEsyigCJJTc5NLFrWHYAUwmWqxFFaPnR3aUkZsH/kWt66K/preihl/6WLklJJrlUFwY82PdmOPvmYbVORDzGDeOfO3c/uffpZ5/fbx188eWDr75++OibKxbGFMMYhySkbxzEgHgBjLnHCbyJKCDfIXDtLIYZfr0EyrwwuORJBBMfuYE38zDicuutaf8k/2z2G+Wim7572DYOjXzpamBugra2WefvHrU0exri2IeAY4IYuzGNiE8EotzDBNIDO2YQIbxALtzEfDaYCC+IYg4BTvXvJTaLic5DPetMn3oUMCeJDBCmnqyg4zmiCHPZ/0G1FIMA+cA606UXsSJkS7cIOJKHn4h1Lk76oJIpXIqiuYfXldYE8pmP+FzZZInvVDchJkCXfnUza1M2WWOugWKPZSKcS2XOokxwdhmeb/B5Es0hYKmIKUnLiRIASmEmE/OQAY8jkZ9GTnnBnnIaQycL872nI0QXFzDtyDqVjWo7MxIiXt1yascIPAwzqXd6IDULYIVD30fBVNhRKmwOay7szmGaK1pGL1Ih7Ew+x9EvMriCvi6hr9O0Cp6WwFMJVtHxDp3p43rqVQm8Ur56XUKv66lOXEJjBV2W0KVS2VmV4JUCr0voWkGTEpoo6PsS+l7VGUmz3HQnophFPmpxRrwlvKAAQSra3bR+FipdcGNWUzJniLaZ5nJPYSYvjgLwk4wuXl6++iUVw0H3yOildYZDYthSDKt3NDQUilt0s+EYg0H3ROGEFAXurtDotPfcVAtFMY3IjtTvWz8fq5USICRc7SoNT0Zdq0aSglR7MvumYdRPv3LxltDr90dGvZ8VuSU8H46sfv0zK7rDHdOCbl28FbklTK3BE7WAs8Mto9c7VnByS+hbqD9VCIsdfpyvOh7ucOgdHxUaVOyCFbdsTKG3TV1xl9tE30jdmOA0JRSWauQvVP4LipI97LCp+tZpjRlRU8bWdo0ZSVPG1oPbjGrKqkGm3GyNH8htptDJfn7DzHIn7qneRCf7+Q0Ty226p3oTneznN8w393Azu2G+uaN3stfME2XX6QLLMWfPEUSK0YxAPlQovJLX7Jn8cUU8pD/Ku5W6vidtK//bnSz6PyJab4kykm8ms/5CUoOr7qHZO7R+fdJ+drJ5Pd3XvtMeaz9optbXnmkvtXNtrGGNah+0j9rvrT9af7b+av1dUO/e2eR8q1VW69//AJjHu5E=</latexit>

p
11 2

<latexit sha1_base64="cONAky5IWOfAHNZRZWeGyDaAoRo=">AAAKJHicfVZNb9tGEGXStI3cpk3aYy9EhAJFIRikGEsyigCJJTc5NLFrWHYAUwmWqxFFaPnR3aUkZsH/kWt66K/preihl/6WLklJJrlUFwY82PdmOPvmYbVORDzGDeOfO3c/uffpZ5/fbx188eWDr75++OibKxbGFMMYhySkbxzEgHgBjLnHCbyJKCDfIXDtLIYZfr0EyrwwuORJBBMfuYE38zDicuutaf8k/2z2G+Wim7572DYOjXzpamBugra2WefvHrU0exri2IeAY4IYuzGNiE8EotzDBNIDO2YQIbxALtzEfDaYCC+IYg4BTvXvJTaLic5DPetMn3oUMCeJDBCmnqyg4zmiCHPZ/0G1FIMA+cA606UXsSJkS7cIOJKHn4h1Lk76oJIpXIqiuYfXldYE8pmP+FzZZInvVDchJkCXfnUza1M2WWOugWKPZSKcS2XOokxwdhmeb/B5Es0hYKmIKUnLiRIASmEmE/OQAY8jkZ9GTnnBnnIaQycL872nI0QXFzDtyDqVjWo7MxIiXt1yascIPAwzqXd6IDULYIVD30fBVNhRKmwOay7szmGaK1pGL1Ih7Ew+x9EvMriCvi6hr9O0Cp6WwFMJVtHxDp3p43rqVQm8Ur56XUKv66lOXEJjBV2W0KVS2VmV4JUCr0voWkGTEpoo6PsS+l7VGUmz3HQnophFPmpxRrwlvKAAQSra3bR+FipdcGNWUzJniLaZ5nJPYSYvjgLwk4wuXl6++iUVw0H3yOildYZDYthSDKt3NDQUilt0s+EYg0H3ROGEFAXurtDotPfcVAtFMY3IjtTvWz8fq5USICRc7SoNT0Zdq0aSglR7MvumYdRPv3LxltDr90dGvZ8VuSU8H46sfv0zK7rDHdOCbl28FbklTK3BE7WAs8Mto9c7VnByS+hbqD9VCIsdfpyvOh7ucOgdHxUaVOyCFbdsTKG3TV1xl9tE30jdmOA0JRSWauQvVP4LipI97LCp+tZpjRlRU8bWdo0ZSVPG1oPbjGrKqkGm3GyNH8htptDJfn7DzHIn7qneRCf7+Q0Ty226p3oTneznN8w393Azu2G+uaN3stfME2XX6QLLMWfPEUSK0YxAPlQovJLX7Jn8cUU8pD/Ku5W6vidtK//bnSz6PyJab4kykm8ms/5CUoOr7qHZO7R+fdJ+drJ5Pd3XvtMeaz9optbXnmkvtXNtrGGNah+0j9rvrT9af7b+av1dUO/e2eR8q1VW69//AJjHu5E=</latexit>

p
<latexit sha1_base64="cONAky5IWOfAHNZRZWeGyDaAoRo=">AAAKJHicfVZNb9tGEGXStI3cpk3aYy9EhAJFIRikGEsyigCJJTc5NLFrWHYAUwmWqxFFaPnR3aUkZsH/kWt66K/preihl/6WLklJJrlUFwY82PdmOPvmYbVORDzGDeOfO3c/uffpZ5/fbx188eWDr75++OibKxbGFMMYhySkbxzEgHgBjLnHCbyJKCDfIXDtLIYZfr0EyrwwuORJBBMfuYE38zDicuutaf8k/2z2G+Wim7572DYOjXzpamBugra2WefvHrU0exri2IeAY4IYuzGNiE8EotzDBNIDO2YQIbxALtzEfDaYCC+IYg4BTvXvJTaLic5DPetMn3oUMCeJDBCmnqyg4zmiCHPZ/0G1FIMA+cA606UXsSJkS7cIOJKHn4h1Lk76oJIpXIqiuYfXldYE8pmP+FzZZInvVDchJkCXfnUza1M2WWOugWKPZSKcS2XOokxwdhmeb/B5Es0hYKmIKUnLiRIASmEmE/OQAY8jkZ9GTnnBnnIaQycL872nI0QXFzDtyDqVjWo7MxIiXt1yascIPAwzqXd6IDULYIVD30fBVNhRKmwOay7szmGaK1pGL1Ih7Ew+x9EvMriCvi6hr9O0Cp6WwFMJVtHxDp3p43rqVQm8Ur56XUKv66lOXEJjBV2W0KVS2VmV4JUCr0voWkGTEpoo6PsS+l7VGUmz3HQnophFPmpxRrwlvKAAQSra3bR+FipdcGNWUzJniLaZ5nJPYSYvjgLwk4wuXl6++iUVw0H3yOildYZDYthSDKt3NDQUilt0s+EYg0H3ROGEFAXurtDotPfcVAtFMY3IjtTvWz8fq5USICRc7SoNT0Zdq0aSglR7MvumYdRPv3LxltDr90dGvZ8VuSU8H46sfv0zK7rDHdOCbl28FbklTK3BE7WAs8Mto9c7VnByS+hbqD9VCIsdfpyvOh7ucOgdHxUaVOyCFbdsTKG3TV1xl9tE30jdmOA0JRSWauQvVP4LipI97LCp+tZpjRlRU8bWdo0ZSVPG1oPbjGrKqkGm3GyNH8htptDJfn7DzHIn7qneRCf7+Q0Ty226p3oTneznN8w393Azu2G+uaN3stfME2XX6QLLMWfPEUSK0YxAPlQovJLX7Jn8cUU8pD/Ku5W6vidtK//bnSz6PyJab4kykm8ms/5CUoOr7qHZO7R+fdJ+drJ5Pd3XvtMeaz9optbXnmkvtXNtrGGNah+0j9rvrT9af7b+av1dUO/e2eR8q1VW69//AJjHu5E=</latexit>

p 11 2
11 2

And yet, when people investigated, it caused great upset.


The man who gave his name to the theorem we would use to
work out the above picture, Pythagoras, was the head of a cult. A
cult dedicated to mathematics. They lived ascetically, much like
monks would, centuries later, and dedicated themselves to the
study of nature in terms of mathematics. When asked what the
purpose of man was, Pythagoras answered “to observe the heav-
ens.” One fervent belief of the Pythagoreans was that number
and geometry were inseparable: all geometric quantities could
be expressed by (known) numbers.
The story of the Pythagoreans is a mathematical tragedy. It was one of their own, commonly identified as Hippasus of Metapontum, who showed that no rational number corresponded exactly to √2. Some aspects of geometry were outside the reach of the known numbers. According to legend, he was out at sea when he discovered this, and was promptly thrown overboard by the other Pythagoreans.
Of course, with the benefit of hindsight, we know how to manage such upsetting discoveries. We simply give the new number a name, "√2," and see if there's some place among the numbers where it makes sense to put it. In this case, somewhere between 141/100 and 142/100, in a space we can make infinitely small by choosing better and better rational approximations.
With this historical pattern clearly highlighted, the discovery of the complex numbers should be almost obvious. In fact, we don't even need a new operation to invert; we are still looking at square roots, but instead of applying the square root to a positive integer, we apply it to a negative integer. To take the simplest example, we'll look at √-1. No number we know gives -1 when we multiply it by itself, so our first instinct is to dismiss the operation. The square root is only allowed for a subset of the real-valued numbers. Just like subtraction was only allowed for a subset of the natural numbers, and division was only allowed for a subset of the integers.
But, what if the number √-1 did exist? What would the consequences be?
As the previous paragraphs should illustrate, this kind of in-
vestigation is usually born out of necessity. Like a fussy child
given a new food, people are consistently reluctant to accept new types of numbers. In this case, what pushed us over the edge was the study of polynomials; functions of the form:

f(x) = ax³ + bx² + cx + d

where the highest exponent in the sum indicates the order of the polynomial.
The problem of finding the roots of a polynomial, the values of x for which f(x) is equal to 0, crops up in all sorts of practical problems. In some cases, this leads to square roots of negative numbers, as we see when we try to solve x² + 1 = 0. This didn't worry anybody, of course, since this function lies entirely above the horizontal axis—it has no roots—so it's only natural that solving for the roots leads to a contradiction.

The function f(x) = x² + 1 has no roots: it doesn't cross the horizontal axis. Therefore, it makes sense that x² + 1 = 0, or x = √-1, has no solutions.

However, when people started to work out general methods for finding the roots of third-order polynomials, like x³ - 15x - 4, which does have roots, it was found that the methods worked if one temporarily accepted √-1 as an intermediate value, which later canceled out. This is where the phrase imaginary number originates. People (Descartes, to be precise) were not ready to accept these as numbers, but no one could deny their utility.
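To make this concrete, here is the classic worked example, usually attributed to Bombelli (added here for illustration; the text above does not work it out). For x³ - 15x - 4 = 0, Cardano's method yields

x = ∛(2 + √-121) + ∛(2 - √-121).

Since (2 + √-1)³ = 2 + 11√-1 = 2 + √-121, the two cube roots can be taken as 2 + √-1 and 2 - √-1. Added together, the offending square roots cancel, leaving x = 4, which is indeed a root: 4³ - 15 · 4 - 4 = 0.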
Eventually, people followed the pattern that they had followed for the integers, the rationals and all their successors. We give the new number a name, i = √-1, and we see if there's any way to relate it, geometrically, to the numbers we know.
Let’s start with addition. What happens if we add i to some
number, say 3? The simple answer is that nothing much happens.
The most we can say about the new number is that it is 3 + i.
Multiplication then. Again, 2 × i doesn't simplify in any meaningful way, so we'll just call the new number 2i. What if we
combine the two? With a few subtleties, we can rely on the basic
rules of algebra to let us multiply out brackets and add things
together. So, if we start with i, add 3 and multiply by 2, we get:

2(i + 3) = 2 · 3 + 2 · i = 6 + 2i

This is a very common result: we've applied a bunch of operations, involving the imaginary number i, and the result can be written as the combination of a real value r, another real value c and i as:

r + ci .

We will call any number that can be written in this way a complex
number. The set of all complex numbers is written as

C.

At this point you may be worried. What if we come up with another operation that is not defined for all complex numbers?
Are we going to have to make another jump? Are we going to
find ever bigger families of numbers to deal with? It turns out
that in many ways, C is the end of the line. So long as we stick
to algebraic operations, we can do whatever we like to complex
numbers, and the result will always be well defined as another
complex number.

To illustrate, let's show this for a few simple examples. Let's say we have two complex numbers a + bi and c + di. If we add them, we get

(a + bi) + (c + di) = a + c + bi + di = (a + c) + (b + d)i

If we multiply them, we get

(a + bi)(c + di) = ac + adi + bci + bdi²
                 = (ac + bdi²) + (ad + bc)i
                 = ac - bd + (ad + bc)i .
That is, one real-valued number, added to i times another real-valued number. Note that in the second line of the derivation for the multiplication, we can use i² = -1, since we know that i = √-1. In short, multiplying or adding together any two complex numbers gives us another complex number.
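As a quick sanity check (a minimal sketch, not from the text), we can implement these two rules on plain (real, imaginary) pairs and compare them against Python's built-in complex type:

    # Complex arithmetic on (real, imaginary) pairs.

    def add(x, y):
        (a, b), (c, d) = x, y
        return (a + c, b + d)             # (a + bi) + (c + di)

    def mul(x, y):
        (a, b), (c, d) = x, y
        return (a*c - b*d, a*d + b*c)     # (a + bi)(c + di), using i^2 = -1

    x, y = (1.0, 2.0), (3.0, -1.0)        # 1 + 2i and 3 - i
    print(mul(x, y))                      # (5.0, 5.0), i.e. 5 + 5i
    print(complex(*x) * complex(*y))      # (5+5j): Python agrees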
Since every complex number can be written as the combina-
tion of two real-valued numbers, it makes sense to visualize them
in a plane. We plot the value of the real term along the horizontal
axis and the value of the imaginary term along the vertical.

[Figure: the complex plane. A number r + ci is drawn at horizontal coordinate r and vertical coordinate c; the number 1 + 2i, for example, lies one unit to the right and two units up from the origin.]

The real-valued numbers that we already knew are a subset of the complex numbers: those complex numbers for which the imaginary part is zero. In this picture, the real-valued numbers are on the horizontal axis.
Note that this is just a visualization. There is nothing inherently two-dimensional about the complex numbers, except that there is a very natural mapping from C to R². At heart, it's just a set of numbers with a bunch of operations defined for them.
The nice thing about the mapping to the plane, however, is
that we can take operations like multiplication, addition and so
on, and see what they look like in this picture. This way, we
can build a very helpful visual intuition for how the complex
numbers behave.
Let’s look at the most important concepts we’ll need going
forward. For addition, we can build on our existing intuitions.
Adding two complex numbers works the same as adding two vec-
tors in the plane: we place the tail of one on the head of the other.

<latexit sha1_base64="q5VEWnpV9bzjITauIOMgrm+tht8=">AAAKMnicfVZNb9tGEGWStoncpnGSW3shKhRIW8EgpViSDwESS25yaGLXsOwApmAsVyOJ0PIDy6UkZkGgvybX9NA/096KXvsjuiQlmuSuyoM92PdmdvbNw2rtgDghM4w/79y999nnX9x/0Nj78quHXz/af/zkMvQjimGEfeLT9zYKgTgejJjDCLwPKCDXJnBlLwYpfrUEGjq+d8HiAMYumnnO1MGIiaWb/W8sG3Ok/6TjRPyxKObPbBFMfkicm/2mcWBkny4H5iZoapvv7OZxQ7MmPo5c8BgmKAyvTSNgY44oczCBZM+KQggQXqAZXEds2h9zxwsiBp7Y/HuBTSOiM19P+9QnDgXMSCwChKkjKuh4jijCTJxmr1oqBA+5ELYmSycI8zBczvKAISHFmK8zqZKHlUw+oyiYO3hdaY0jN3QRm0uLYeza1UWICNClW11M2xRN1phroNgJUxHOhDKnQSp/eOGfbfB5HMzBCxMeUZKUEwUAlMJUJGZhCCwKeHYaMfNF+ILRCFppmK29GCK6OIdJS9SpLFTbmRIfseqSXTuG52CYCr2TPaGZByvsuy7yJtwKEm4xWDNutQ6STNEyep5wbqXy2bZ+nsIV9F0JfZckVfCkBJ4IsIqOCnSqj+qplyXwUtr1qoRe1VPtqIRGErosoUupsr0qwSsJXpfQtYTGJTSW0A8l9IOsMxJmuW6PeT6LbNT8lDhLeE0BvIQ320n9LFS44NqspqTO4E0zyeSewFRcIzngximdv7l4+0vCB/32odFN6gybRLClGJ3u4cCQKLO8mw3H6PfbxxLHp8ibFYWGJ91XplwoiGhAClKv1/n5SK4UAyH+qqg0OB62OzWSEKTak9kzDaN++tUMbwndXm9o1PtZkVvCq8Gw06tvs6IFbpsdaNfFW5FbwqTTfy4XsAu8Y3S7RxJObgm9DupNJMKiwI+yr477BQ7do8Ncg4pdsOSWjSn0pqlL7pqp6BuplQm2KiG3lJK/kPmvKYp3sH1V9a3TlBmBKmNrO2VGrMrYenCbUU1ZKWTKzKbcILOZRCe7+YqZZU7cUV1FJ7v5iollNt1RXUUnu/mK+WYeVrMV880cXcheM0+QXqcLLMacPkcQyUczBPFQofBWXLOn4scVMZ/+KO5WOnMdYVvx32ql0f8R0XpLFJF4M5n1F5IcXLYPzO5B59fnzZfHm9fTA+1b7TvtmWZqPe2l9kY700Ya1n7TPmqftN8bfzT+avzd+Cen3r2zyXmqVb7Gv/8Bnam/IQ==</latexit>

a + c + (b + d)i

c + di
<latexit sha1_base64="Un3k/EOJ1+myQP1glEK72XqsD7k=">AAAKK3icfVZNb9tGEGXSr8htWqc55kJUKFC0gkGKsSgfAiSW3OTQxK5h2QEswVguRxSh5QeWS1HMgr8l1/bQX9NTi177P7okJZnkUt2LBvvezM6+eaDWCokbMU3768HDTz797PMvHnUOvvzq8dffHD759joKYophggMS0PcWioC4PkyYywi8DykgzyJwYy1HOX6zAhq5gX/F0hBmHnJ8d+5ixMTW3eFTdZoEmONM/amM7My9O+xqR1qxVDnQN0FX2ayLuycdZWoHOPbAZ5igKLrVtZDNOKLMxQSyg2kcQYjwEjlwG7P5cMZdP4wZ+OLc7wU2j4nKAjVvULVdCpiRVAQIU1dUUPECUYSZuMZBvVQEPvIg6tkrN4zKMFo5ZcCQ0GDG14VG2eNaJncoChcuXtda48iLPMQW0maUelZ9E2ICdOXVN/M2RZMN5hoodqNchAuhzHmY6x5dBRcbfJGGC/CjjMeUZNVEAQClMBeJRRgBi0Ne3EYMexm9YDSGXh4Wey/GiC4vwe6JOrWNejtzEiBW37Ia1/BdDHOhd3YgNPMhwYHnId/m0zDjUwZrxqe9o6xQtIpeZpxPc/ksS73M4Rr6roK+y7I6eFYBzwRYRyc7dK5OmqnXFfBaOvWmgt40U624gsYSuqqgK6mylVTgRILXFXQtoWkFTSX0QwX9IOuMhFlu+zNezqIYNT8n7gpeUwA/491+1rwLFS641espuTN4V88KuW2Yi+9HCXhpTudvrt7+kvHRsH+sDbImwyIxbCmaMTgeaRLFKbvZcLThsH8qcQKKfGdXaHw2eKXLhcKYhmRHMk3j5xO5UgqEBMmu0uh03DcaJCFIvSfd1DWtefvEwVvCwDTHWrOfhNwTXo3Ghtk8JqE73NIN6DfFS8g9wTaGz+UC1g43tMHgRMLJPcE0kGlLhOUOPylWEw92OAxOjksNanbBkls2plC7uiq5y2mjb6RuTbDaEkpLtfKXMv81RekedtBWfeu01oywLWNru9aMtC1j68FtRj0laZGpMFvrAYXNJDrZz2+ZWeHEPdXb6GQ/v2VihU33VG+jk/38lvkWHm5nt8y3cPRO9oZ5wvxzusRizPlzBJFyNGMQDxUKb8Vn9lz8uSIW0B/Ft5U6nitsK36nvTz6PyJab4kiEm8mvflCkoPr/pE+ODJ+fd59ebp5PT1SninfKT8oumIqL5U3yoUyUbCSKh+V35TfO390/uz83fmnpD58sMl5qtRW59//AEMSvgk=</latexit>

0
<latexit sha1_base64="74DnoI2VxywZbxYpGWdO1x9Hdfs=">AAAKF3icfVZNb9tGEGU+2kZu0yTNsRciQoGiEAxSjEX5ECCx5CaHJnYMyw5gCcFyNaIILT+wuxTFEPwFvbaH/prcgl577L/pkpRokkt1LxrsezOcffOwWisgDuOa9u+du/fuf/X1Nw86B99+9/D7R4+f/HDF/JBimGCf+PSDhRgQx4MJdziBDwEF5FoErq3VKMOv10CZ43uXPA5g5iLbcxYORlxsvdc+Pu5qh1q+VDnQt0FX2a7zj086ynTu49AFj2OCGLvRtYDPEkS5gwmkB9OQQYDwCtlwE/LFcJY4XhBy8HCq/iSwRUhU7qtZM+rcoYA5iUWAMHVEBRUvEUWYi5YP6qUYeMgF1puvnYAVIVvbRcCROO8s2eR6pA9rmYlNUbB08KbWWoJc5iK+lDZZ7Fr1TQgJ0LVb38zaFE02mBug2GGZCOdCmbMg05hd+udbfBkHS/BYmoSUpNVEAQClsBCJeciAh0GSn0YMdsVecBpCLwvzvRdjRFcXMO+JOrWNejsL4iNe37Iax/AcDAuhd3ogNPMgwr7rIm+eTIM0mXLY8GTaO0xzRavoRZok00w+y1IvMriGvqug79K0Dp5WwFMB1tFJiS7USTP1qgJeSV+9rqDXzVQrrKChhK4r6FqqbEUVOJLgTQXdSGhcQWMJ/VRBP8k6I2GWm/4sKWaRjzo5I84aXlMAL026/bR5FipccKPXUzJnJF09zeWew0LcFQXgxhk9eXP59rc0GQ37R9ogbTIsEsKOohmDo5EmUeyimy1HGw77JxLHp8izy0Lj08ErXS4UhDQgJck0jV+P5UoxEOJHZaXRybhvNEhCkHpPuqlrWvP0kY13hIFpjrVmPxG5JbwajQ2z+ZmIlrilG9BviheRW8LcGD6XC1glbmiDwbGEk1uCaSBzLhFWJX6crybulzgMjo8KDWp2wZJbtqZQu7oquctuo2+lbk2w2hIKS7XyVzL/NUXxHrbfVn3ntNaMoC1jZ7vWjLgtY+fBXUY9JWqRKTdb6wdym0l0sp/fMrPciXuqt9HJfn7LxHKb7qneRif7+S3zzT3czm6Zb+7oUvaGeYLsOl1hMebsOYJIMZoxiIcKhbfimj0Tf66I+/QXcbdS23WEbcXvtJdF/0dEmx1RROLNpDdfSHJw1T/UB4fG++fdlyfb19MD5UflmfKzoium8lJ5o5wrEwUroPyu/KH82fmr87nzpfN3Qb17Z5vzVKmtzj//AbhJtl8=</latexit>

a + bi + c + bi
<latexit sha1_base64="rSo57I/cIY0Bz7E3gY149cEoFvE=">AAAKP3icfVbLbttGFGXSV+Q2jdMuuyEqFC1awSDFWJQXARJLbrJoYtew7ACWYAxHVxSh4QPDoShmwE/o13TbLvoZ/YLuim6765CUaJJDlRtd3HPunZlzD0ZjBcQJmab9+eDhBx9+9PEnjzoHn372+PMnh0+/uA79iGKYYJ/49J2FQiCOBxPmMALvAgrItQjcWKtRht+sgYaO712xJICZi2zPWTgYMZG6O/x2GtuYo1T9Qc0jK3Xy0Mccp7tIJO8Ou9qRln+qHOjboKtsv4u7px1lOvdx5ILHMEFheKtrAZtxRJmDCaQH0yiEAOEVsuE2YovhjDteEDHwxLrfCGwREZX5arZpde5QwIwkIkCYOqKDipeIIszE0Q7qrULwkAthb752grAIw7VdBAwJXWZ8k+uWPq5VcpuiYOngTW1rHLmhi9hSSoaJa9WTEBGga7eezLYpNtlgboBiJ8xEuBDKnAfZLMIr/2KLL5NgCV6Y8oiStFooAKAUFqIwD0NgUcDz0wgDrMLnjEbQy8I893yM6OoS5j3Rp5aob2dBfMTqKatxDM/BsBB6pwdCMw9i7Lsu8uZ8GqR8ymDD+LR3lOaKVtHLlPNpJp9lqZcZXEPfVtC3aVoHzyrgmQDr6KREF+qkWXpdAa+lVW8q6E2z1IoqaCSh6wq6ljpbcQWOJXhTQTcSmlTQRELfV9D3ss5ImOW2P+PFLPJR83PirOEVBfBS3u2nzbNQ4YJbvV6SOYN39TSXew4LcacUgJtkdP766s1PKR8N+8faIG0yLBLBjqIZg+ORJlHsYjdbjjYc9k8ljk+RZ5eNxmeDl7rcKIhoQEqSaRo/nsidEiDEj8tOo9Nx32iQhCD1PemmrmnN04t7cUcYmOZYa+4nJveEl6OxYTaXiWmJW7oB/aZ4MbknzI3hM7mBVeKGNhicSDi5J5gGMucSYVXiJ/nXxP0Sh8HJcaFBzS5YcsvWFGpXVyV32W30rdStBVZbQWGpVv5K5r+iKNnD9tu675zWWhG0Vexs11qRtFXsPLirqJfELTLlZmtdILeZRCf7+S0zy524p3sbneznt0wst+me7m10sp/fMt/cw+3slvnmji5lb5gnyK7TFRZjzp4jiBSjGYN4qFB4I67Zc/HniphPvxd3K7VdR9hW/E57WfR/RLTZEUUk3kx684UkB9f9I31wZPz8rPvidPt6eqR8pXytfKfoiqm8UF4rF8pEwcovyq/Kb8rvnT86f3X+7vxTUB8+2NZ8qdS+zr//AeV5xds=</latexit>

The same logic shows that subtraction of complex numbers behaves as you'd expect. To compute x - y, we subtract the real part of y from the real part of x, and likewise for the imaginary part. Geometrically, this corresponds to vector subtraction in the plane.

To see what multiplication looks like, we can switch to a different way of representing complex numbers. Instead of giving the Cartesian coordinates (r, c) that lead to the number z = r + ci, we use the polar coordinates. We give an angle a from the horizontal axis and a distance m from the origin. The angle is also called the phase and the distance is called the magnitude or the modulus. When we write a number like this, we'll use the notation z = m∠a. To refer to the magnitude of a complex number z, which we'll be doing a lot, we use the notation |z|.

[Figure: polar notation. The number z = m∠a lies at distance m from the origin, at angle a from the positive horizontal axis.]

We call this representation of a complex number polar notation, and the earlier representation Cartesian notation.
The reason polar notation is so useful is that multiplication looks very natural in it. To see the relation, assume that we have a number z = m∠a. Then basic trigonometry tells us that in Cartesian notation, this number is written as z = m cos(a) + m sin(a)i.

Let's see what happens if we take two numbers, in polar notation, and multiply them:

(m∠a)(n∠b)
  = (m cos(a) + m sin(a)i) (n cos(b) + n sin(b)i)
  = m cos a n cos b - m sin a n sin b + (m cos a n sin b + n cos b m sin a)i
  = mn(cos a cos b - sin a sin b) + mn(cos a sin b + cos b sin a)i
  = mn cos(a + b) + mn sin(a + b)i
  = (mn)∠(a + b)

In the third line, we apply the multiplication in Cartesian notation that we already worked out earlier. Then, in the fifth line, we apply some basic trigonometric sum/difference identities.
What this tells us is that when we view complex numbers in polar coordinates, multiplication has a very natural interpretation: the angle of the result is the sum of the two original angles, while the magnitude of the result is the product of the two original magnitudes.

[Figure: multiplying m∠a by n∠b: the magnitudes multiply and the angles add, giving (mn)∠(a + b).]
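We can verify the magnitudes-multiply, angles-add rule numerically (same cmath sketch as before, with angles in radians):

    import cmath

    x = cmath.rect(2.0, 0.3)   # the number 2∠0.3
    y = cmath.rect(1.5, 0.8)   # the number 1.5∠0.8

    m, a = cmath.polar(x * y)
    print(m)                   # approximately 3.0 = 2 * 1.5
    print(a)                   # approximately 1.1 = 0.3 + 0.8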

The easiest way to define division is as the operation that cancels out multiplication. For each z, there should be a z⁻¹ so that multiplying by z and then by z⁻¹ brings you back to where you were. Put simply, zz⁻¹ = 1. Dividing by z can then be defined as multiplying by z⁻¹. Using the polar notation, we can see that the following definition of z⁻¹ does the trick:

z⁻¹ = (m∠a)⁻¹ = (1/m)∠-a .
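A quick numeric check of this inverse (cmath again, angles in radians):

    import cmath

    z = cmath.rect(2.0, 0.3)           # 2∠0.3
    z_inv = cmath.rect(1 / 2.0, -0.3)  # (1/2)∠-0.3
    print(z * z_inv)                   # (1+0j), up to rounding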

Note how this view of multiplication agrees with special cases that we already know. For positive real numbers, the angle is always 0, and the magnitude is equal to the real value. Therefore, multiplying real numbers together reduces to the multiplication we already knew.
The number i is written as 1∠90° in polar coordinates. That means that multiplying a number z by i keeps the magnitude of z the same, but rotates it by 90 degrees counter-clockwise. A real number multiplied by i is rotated from the horizontal to the vertical axis. If we multiply by i twice, we rotate 180 degrees, which for real numbers means negating them. This makes sense too, because z · i · i = zi² = z · -1.
Which brings us to exponentiation. Raising complex numbers to arbitrary values, including to complex ones, is an important topic, but one which we can sidestep here. All we will need is the ability to raise a complex number to a natural number. That follows very naturally from multiplication:

(m∠a)ⁿ = (m∠a)(m∠a) . . . (m∠a) = mⁿ∠na .

Again, let's look at some special cases. If the angle is 0, we stay on the real number line, and the operation reduces to ordinary exponentiation. If the magnitude is 1 but the angle is nonzero, then we rotate about the origin over the unit circle in n steps of angle a.

[Figure: repeated multiplication of m∠a steps around the origin, rotating by a and scaling by m each time: (m∠a)ⁿ = mⁿ∠na.]

The main thing we need, however, is not integer exponentiation, but its inverse: the n-th root. Given some complex number z = m\a, which other number do we raise to the power n so that we end up at z? The answer follows directly from our polar view of the complex plane: the magnitude should be ⁿ√m, which is just the real-valued n-th root, and the angle should be a/n.

Let's check for √-1, which started all this business. Which number should we raise to the power 2, so that we get -1? The magnitude of -1 is 1, so our number has magnitude √1 = 1. Now we need a number with magnitude one, so that twice its angle equals 180°. This is a 90° angle, so our number is 1\90°, which is exactly where we find i.

Notice how this solves the problem we had when we were constrained to the real line. Then we had negative numbers to deal with, and the real n-th root of a negative number does not always exist. Now, we are only ever applying the n-th root to magnitudes, which are positive. The rest is dealt with by rotating away from the real numbers. This means that when it comes to complex numbers, we can always find some number that, when raised to n, gives us z. We call this the complex n-th root ⁿ√z.
Note however, that this is not always a unique number. Let's say we raise 1\2.5° to the power of 4. This gives us 1\10°, so 1\2.5° is a fourth root of 1\10°. However, if we raise 1\92.5° to the power of 4, we get 1\370°, which is equal to 1\10° as well. Any angle a′ for which a′ · n mod 360 = a will give us an n-th root of m\a.

How many solutions does this give us for any given number?
<latexit sha1_base64="pBODOlY2hdjYcgHbSTq2ulLh9K0=">AAAKXnicfZZdb9s2FIbVdmsbb1nT7mbAboQZA4bBCCQrsZ2LAG3stL1YmyyIkwKRG1D0sS2Y+gBFWVYI/aP9mt0V3cV+yijJH5Iojzc+4POeI/LwBU3LJ3bANO3Lo8dPvvn26bPne43vvt//4cXBy1c3gRdSDEPsEY9+slAAxHZhyGxG4JNPATkWgVtr3k/57QJoYHvuNYt9GDlo6toTGyMmpu4P3j58NueY64l6quqqaTYePre3obENj7bh8TbsrML7g6Z2qGVDlQN9FTSV1bi8f7mnmGMPhw64DBMUBHe65rMRR5TZmEDSMMMAfITnaAp3IZv0Rtx2/ZCBixP1V8EmIVGZp6Z7Usc2BcxILAKEqS0qqHiGKMJM7LxRLhWAixwIWuOF7Qd5GCymecCQaNuIL7O2JvulTD6lyJ/ZeFlaGkdO4CA2kyaD2LHKkxASoAunPJkuUyyyolwCxXaQNuFSdObCT48quPYuV3wW+zNwg4SHlCTFRAGAUpiIxCwMgIU+z3Yj/DEPThkNoZWG2dzpANH5FYxbok5porycCfEQK09ZlW24NoaJ6HfSED1zIcKe4yB3zE0/4SaDJeNm6zDJOlqkVwnnZto+y1KvUlyiHwv0Y5KU4XkBngtYpsMNnajDaupNAd5IX70t0NtqqhUWaCjRRYEupMpWVMCRhJcFupRoXKCxRB8K9EHuMxJmuWuPeH4W2VHzC2Iv4B0FcBPebCfVvVDhgju9nJI6gzf1JGv3GCbiysmBE6dy/v76wx8J7/fax1onqSosEsJaohmd474mSab5alYarddrn0kajyJ3uik0OO+80eVCfkh9shF1u8bbE7lSDIR40aZS/2zQNioi0ZDymvSurmnV3UdTvBZ0ut2BVl1PRLaCN/2B0a1+JqIbbukGtKvNi8hWMDZ6R3IBa8MNrdM5kTjZCroG6o4lwXzDT7JR5d6GQ+fkOO9ByS5YcsvKFGpTVyV3Tevkq1bXJlh1CbmlavVzWf+OoniH2qurvnZabYZfl7G2XW1GXJex9uA6o5wS1bQpM1vtBzKbSXKyW19zZpkTd1Svk5Pd+poTy2y6o3qdnOzW15xv5uF6dc35Zo7etL1iHj+9TsWryPTT5wgi+dEMQDxUKHwQ1+yF+HNFzKO/i7uVTh1b2Fb8mq00+j8hWq6FIhJvJr36QpKDm/ah3jk0/jxqvj5bvZ6eKz8rvyi/KbrSVV4r75VLZahg5S/lb+Wr8s/ev42njf3Gi1z6+NEq50elNBo//QfNyMbT</latexit>

It’s easiest to visualize this if we plot the n-th rootszof=1.1


z1 = 1 z2 = 1
<latexit sha1_base64="pBODOlY2hdjYcgHbSTq2ulLh9K0=">AAAKXnicfZZdb9s2FIbVdmsbb1nT7mbAboQZA4bBCCQrsZ2LAG3stL1YmyyIkwKRG1D0sS2Y+gBFWVYI/aP9mt0V3cV+yijJH5Iojzc+4POeI/LwBU3LJ3bANO3Lo8dPvvn26bPne43vvt//4cXBy1c3gRdSDEPsEY9+slAAxHZhyGxG4JNPATkWgVtr3k/57QJoYHvuNYt9GDlo6toTGyMmpu4P3j58NueY64l6quqqaTYePre3obENj7bh8TbsrML7g6Z2qGVDlQN9FTSV1bi8f7mnmGMPhw64DBMUBHe65rMRR5TZmEDSMMMAfITnaAp3IZv0Rtx2/ZCBixP1V8EmIVGZp6Z7Usc2BcxILAKEqS0qqHiGKMJM7LxRLhWAixwIWuOF7Qd5GCymecCQaNuIL7O2JvulTD6lyJ/ZeFlaGkdO4CA2kyaD2LHKkxASoAunPJkuUyyyolwCxXaQNuFSdObCT48quPYuV3wW+zNwg4SHlCTFRAGAUpiIxCwMgIU+z3Yj/DEPThkNoZWG2dzpANH5FYxbok5porycCfEQK09ZlW24NoaJ6HfSED1zIcKe4yB3zE0/4SaDJeNm6zDJOlqkVwnnZto+y1KvUlyiHwv0Y5KU4XkBngtYpsMNnajDaupNAd5IX70t0NtqqhUWaCjRRYEupMpWVMCRhJcFupRoXKCxRB8K9EHuMxJmuWuPeH4W2VHzC2Iv4B0FcBPebCfVvVDhgju9nJI6gzf1JGv3GCbiysmBE6dy/v76wx8J7/fax1onqSosEsJaohmd474mSab5alYarddrn0kajyJ3uik0OO+80eVCfkh9shF1u8bbE7lSDIR40aZS/2zQNioi0ZDymvSurmnV3UdTvBZ0ut2BVl1PRLaCN/2B0a1+JqIbbukGtKvNi8hWMDZ6R3IBa8MNrdM5kTjZCroG6o4lwXzDT7JR5d6GQ+fkOO9ByS5YcsvKFGpTVyV3Tevkq1bXJlh1CbmlavVzWf+OoniH2qurvnZabYZfl7G2XW1GXJex9uA6o5wS1bQpM1vtBzKbSXKyW19zZpkTd1Svk5Pd+poTy2y6o3qdnOzW15xv5uF6dc35Zo7etL1iHj+9TsWryPTT5wgi+dEMQDxUKHwQ1+yF+HNFzKO/i7uVTh1b2Fb8mq00+j8hWq6FIhJvJr36QpKDm/ah3jk0/jxqvj5bvZ6eKz8rvyi/KbrSVV4r75VLZahg5S/lb+Wr8s/ev42njf3Gi1z6+NEq50elNBo//QfNyMbT</latexit>

z1 = 1 z2 = 1 z3 = 1
<latexit sha1_base64="pBODOlY2hdjYcgHbSTq2ulLh9K0=">AAAKXnicfZZdb9s2FIbVdmsbb1nT7mbAboQZA4bBCCQrsZ2LAG3stL1YmyyIkwKRG1D0sS2Y+gBFWVYI/aP9mt0V3cV+yijJH5Iojzc+4POeI/LwBU3LJ3bANO3Lo8dPvvn26bPne43vvt//4cXBy1c3gRdSDEPsEY9+slAAxHZhyGxG4JNPATkWgVtr3k/57QJoYHvuNYt9GDlo6toTGyMmpu4P3j58NueY64l6quqqaTYePre3obENj7bh8TbsrML7g6Z2qGVDlQN9FTSV1bi8f7mnmGMPhw64DBMUBHe65rMRR5TZmEDSMMMAfITnaAp3IZv0Rtx2/ZCBixP1V8EmIVGZp6Z7Usc2BcxILAKEqS0qqHiGKMJM7LxRLhWAixwIWuOF7Qd5GCymecCQaNuIL7O2JvulTD6lyJ/ZeFlaGkdO4CA2kyaD2LHKkxASoAunPJkuUyyyolwCxXaQNuFSdObCT48quPYuV3wW+zNwg4SHlCTFRAGAUpiIxCwMgIU+z3Yj/DEPThkNoZWG2dzpANH5FYxbok5porycCfEQK09ZlW24NoaJ6HfSED1zIcKe4yB3zE0/4SaDJeNm6zDJOlqkVwnnZto+y1KvUlyiHwv0Y5KU4XkBngtYpsMNnajDaupNAd5IX70t0NtqqhUWaCjRRYEupMpWVMCRhJcFupRoXKCxRB8K9EHuMxJmuWuPeH4W2VHzC2Iv4B0FcBPebCfVvVDhgju9nJI6gzf1JGv3GCbiysmBE6dy/v76wx8J7/fax1onqSosEsJaohmd474mSab5alYarddrn0kajyJ3uik0OO+80eVCfkh9shF1u8bbE7lSDIR40aZS/2zQNioi0ZDymvSurmnV3UdTvBZ0ut2BVl1PRLaCN/2B0a1+JqIbbukGtKvNi8hWMDZ6R3IBa8MNrdM5kTjZCroG6o4lwXzDT7JR5d6GQ+fkOO9ByS5YcsvKFGpTVyV3Tevkq1bXJlh1CbmlavVzWf+OoniH2qurvnZabYZfl7G2XW1GXJex9uA6o5wS1bQpM1vtBzKbSXKyW19zZpkTd1Svk5Pd+poTy2y6o3qdnOzW15xv5uF6dc35Zo7etL1iHj+9TsWryPTT5wgi+dEMQDxUKHwQ1+yF+HNFzKO/i7uVTh1b2Fb8mq00+j8hWq6FIhJvJr36QpKDm/ah3jk0/jxqvj5bvZ6eKz8rvyi/KbrSVV4r75VLZahg5S/lb+Wr8s/ev42njf3Gi1z6+NEq50elNBo//QfNyMbT</latexit>

z2 = 1 z3 = 1 z4 = 1
z3 = 1 z4 = 1 z5 = 1
z4 = 1 z5 = 1 z6 = 1
z5 = 1 z6 = 1
z6 = 1
z1 = 1
<latexit sha1_base64="pBODOlY2hdjYcgHbSTq2ulLh9K0=">AAAKXnicfZZdb9s2FIbVdmsbb1nT7mbAboQZA4bBCCQrsZ2LAG3stL1YmyyIkwKRG1D0sS2Y+gBFWVYI/aP9mt0V3cV+yijJH5Iojzc+4POeI/LwBU3LJ3bANO3Lo8dPvvn26bPne43vvt//4cXBy1c3gRdSDEPsEY9+slAAxHZhyGxG4JNPATkWgVtr3k/57QJoYHvuNYt9GDlo6toTGyMmpu4P3j58NueY64l6quqqaTYePre3obENj7bh8TbsrML7g6Z2qGVDlQN9FTSV1bi8f7mnmGMPhw64DBMUBHe65rMRR5TZmEDSMMMAfITnaAp3IZv0Rtx2/ZCBixP1V8EmIVGZp6Z7Usc2BcxILAKEqS0qqHiGKMJM7LxRLhWAixwIWuOF7Qd5GCymecCQaNuIL7O2JvulTD6lyJ/ZeFlaGkdO4CA2kyaD2LHKkxASoAunPJkuUyyyolwCxXaQNuFSdObCT48quPYuV3wW+zNwg4SHlCTFRAGAUpiIxCwMgIU+z3Yj/DEPThkNoZWG2dzpANH5FYxbok5porycCfEQK09ZlW24NoaJ6HfSED1zIcKe4yB3zE0/4SaDJeNm6zDJOlqkVwnnZto+y1KvUlyiHwv0Y5KU4XkBngtYpsMNnajDaupNAd5IX70t0NtqqhUWaCjRRYEupMpWVMCRhJcFupRoXKCxRB8K9EHuMxJmuWuPeH4W2VHzC2Iv4B0FcBPebCfVvVDhgju9nJI6gzf1JGv3GCbiysmBE6dy/v76wx8J7/fax1onqSosEsJaohmd474mSab5alYarddrn0kajyJ3uik0OO+80eVCfkh9shF1u8bbE7lSDIR40aZS/2zQNioi0ZDymvSurmnV3UdTvBZ0ut2BVl1PRLaCN/2B0a1+JqIbbukGtKvNi8hWMDZ6R3IBa8MNrdM5kTjZCroG6o4lwXzDT7JR5d6GQ+fkOO9ByS5YcsvKFGpTVyV3Tevkq1bXJlh1CbmlavVzWf+OoniH2qurvnZabYZfl7G2XW1GXJex9uA6o5wS1bQpM1vtBzKbSXKyW19zZpkTd1Svk5Pd+poTy2y6o3qdnOzW15xv5uF6dc35Zo7etL1iHj+9TsWryPTT5wgi+dEMQDxUKHwQ1+yF+HNFzKO/i7uVTh1b2Fb8mq00+j8hWq6FIhJvJr36QpKDm/ah3jk0/jxqvj5bvZ6eKz8rvyi/KbrSVV4r75VLZahg5S/lb+Wr8s/ev42njf3Gi1z6+NEq50elNBo//QfNyMbT</latexit>

z1 = 1 z2 = 1
<latexit sha1_base64="pBODOlY2hdjYcgHbSTq2ulLh9K0=">AAAKXnicfZZdb9s2FIbVdmsbb1nT7mbAboQZA4bBCCQrsZ2LAG3stL1YmyyIkwKRG1D0sS2Y+gBFWVYI/aP9mt0V3cV+yijJH5Iojzc+4POeI/LwBU3LJ3bANO3Lo8dPvvn26bPne43vvt//4cXBy1c3gRdSDEPsEY9+slAAxHZhyGxG4JNPATkWgVtr3k/57QJoYHvuNYt9GDlo6toTGyMmpu4P3j58NueY64l6quqqaTYePre3obENj7bh8TbsrML7g6Z2qGVDlQN9FTSV1bi8f7mnmGMPhw64DBMUBHe65rMRR5TZmEDSMMMAfITnaAp3IZv0Rtx2/ZCBixP1V8EmIVGZp6Z7Usc2BcxILAKEqS0qqHiGKMJM7LxRLhWAixwIWuOF7Qd5GCymecCQaNuIL7O2JvulTD6lyJ/ZeFlaGkdO4CA2kyaD2LHKkxASoAunPJkuUyyyolwCxXaQNuFSdObCT48quPYuV3wW+zNwg4SHlCTFRAGAUpiIxCwMgIU+z3Yj/DEPThkNoZWG2dzpANH5FYxbok5porycCfEQK09ZlW24NoaJ6HfSED1zIcKe4yB3zE0/4SaDJeNm6zDJOlqkVwnnZto+y1KvUlyiHwv0Y5KU4XkBngtYpsMNnajDaupNAd5IX70t0NtqqhUWaCjRRYEupMpWVMCRhJcFupRoXKCxRB8K9EHuMxJmuWuPeH4W2VHzC2Iv4B0FcBPebCfVvVDhgju9nJI6gzf1JGv3GCbiysmBE6dy/v76wx8J7/fax1onqSosEsJaohmd474mSab5alYarddrn0kajyJ3uik0OO+80eVCfkh9shF1u8bbE7lSDIR40aZS/2zQNioi0ZDymvSurmnV3UdTvBZ0ut2BVl1PRLaCN/2B0a1+JqIbbukGtKvNi8hWMDZ6R3IBa8MNrdM5kTjZCroG6o4lwXzDT7JR5d6GQ+fkOO9ByS5YcsvKFGpTVyV3Tevkq1bXJlh1CbmlavVzWf+OoniH2qurvnZabYZfl7G2XW1GXJex9uA6o5wS1bQpM1vtBzKbSXKyW19zZpkTd1Svk5Pd+poTy2y6o3qdnOzW15xv5uF6dc35Zo7etL1iHj+9TsWryPTT5wgi+dEMQDxUKHwQ1+yF+HNFzKO/i7uVTh1b2Fb8mq00+j8hWq6FIhJvJr36QpKDm/ah3jk0/jxqvj5bvZ6eKz8rvyi/KbrSVV4r75VLZahg5S/lb+Wr8s/ev42njf3Gi1z6+NEq50elNBo//QfNyMbT</latexit>

z1 = 1 z2 = 1 z3 = 1
<latexit sha1_base64="pBODOlY2hdjYcgHbSTq2ulLh9K0=">AAAKXnicfZZdb9s2FIbVdmsbb1nT7mbAboQZA4bBCCQrsZ2LAG3stL1YmyyIkwKRG1D0sS2Y+gBFWVYI/aP9mt0V3cV+yijJH5Iojzc+4POeI/LwBU3LJ3bANO3Lo8dPvvn26bPne43vvt//4cXBy1c3gRdSDEPsEY9+slAAxHZhyGxG4JNPATkWgVtr3k/57QJoYHvuNYt9GDlo6toTGyMmpu4P3j58NueY64l6quqqaTYePre3obENj7bh8TbsrML7g6Z2qGVDlQN9FTSV1bi8f7mnmGMPhw64DBMUBHe65rMRR5TZmEDSMMMAfITnaAp3IZv0Rtx2/ZCBixP1V8EmIVGZp6Z7Usc2BcxILAKEqS0qqHiGKMJM7LxRLhWAixwIWuOF7Qd5GCymecCQaNuIL7O2JvulTD6lyJ/ZeFlaGkdO4CA2kyaD2LHKkxASoAunPJkuUyyyolwCxXaQNuFSdObCT48quPYuV3wW+zNwg4SHlCTFRAGAUpiIxCwMgIU+z3Yj/DEPThkNoZWG2dzpANH5FYxbok5porycCfEQK09ZlW24NoaJ6HfSED1zIcKe4yB3zE0/4SaDJeNm6zDJOlqkVwnnZto+y1KvUlyiHwv0Y5KU4XkBngtYpsMNnajDaupNAd5IX70t0NtqqhUWaCjRRYEupMpWVMCRhJcFupRoXKCxRB8K9EHuMxJmuWuPeH4W2VHzC2Iv4B0FcBPebCfVvVDhgju9nJI6gzf1JGv3GCbiysmBE6dy/v76wx8J7/fax1onqSosEsJaohmd474mSab5alYarddrn0kajyJ3uik0OO+80eVCfkh9shF1u8bbE7lSDIR40aZS/2zQNioi0ZDymvSurmnV3UdTvBZ0ut2BVl1PRLaCN/2B0a1+JqIbbukGtKvNi8hWMDZ6R3IBa8MNrdM5kTjZCroG6o4lwXzDT7JR5d6GQ+fkOO9ByS5YcsvKFGpTVyV3Tevkq1bXJlh1CbmlavVzWf+OoniH2qurvnZabYZfl7G2XW1GXJex9uA6o5wS1bQpM1vtBzKbSXKyW19zZpkTd1Svk5Pd+poTy2y6o3qdnOzW15xv5uF6dc35Zo7etL1iHj+9TsWryPTT5wgi+dEMQDxUKHwQ1+yF+HNFzKO/i7uVTh1b2Fb8mq00+j8hWq6FIhJvJr36QpKDm/ah3jk0/jxqvj5bvZ6eKz8rvyi/KbrSVV4r75VLZahg5S/lb+Wr8s/ev42njf3Gi1z6+NEq50elNBo//QfNyMbT</latexit>

z2 = 1 z3 = 1 z4 = 1
z3 = 1 z4 = 1 z5 = 1
z4 = 1 z5 = 1 z6 = 1
z5 = 1 z6 = 1
For each, of course, the real value 1 is a solution, but for the higher powers, there are additional solutions on the unit circle. For ²√1, for instance: multiplying -1 by itself rotates it by 180 degrees to coincide with 1. For ³√1, we get three roots, two of which are non-real. The solution with angle 120°, when raised to the power of 3, gives us an angle of 360° = 0°. The solution with angle 240° puts the angle after cubing at 720° = 0°.
In short, every multiple of 360: 0, 360, 720, 1080, . . ., can
be divided by n to give us a solution. Once we get to 360n,
dividing by n gets us back to a solution we’ve already seen, so
we get n unique solutions in total.
To translate this to roots of any complex number m\a, we simply scale the circle so that its radius is ⁿ√m and then rotate it so that the first solution points in the direction of a/n.

[Figure: the three cube roots of z³ = 1\30°: equally spaced points on the circle, the first at angle 30°/3 = 10°.]
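The recipe is easy to check numerically. Here is a small sketch in Python (with numpy; the helper name nth_roots is ours, not a library function) that constructs all n roots of m\a exactly as described: the real n-th root of the magnitude, a first angle of a/n, and steps of 360°/n around the circle.

    import numpy as np

    def nth_roots(m, a_deg, n):
        # magnitude: the real-valued n-th root of m
        root_m = m ** (1 / n)
        # angles: a/n, (a + 360)/n, ..., (a + 360(n-1))/n
        angles = (a_deg + 360 * np.arange(n)) / n
        return root_m * np.exp(1j * np.deg2rad(angles))

    roots = nth_roots(2, 30, 3)  # the three cube roots of 2\30
    print(roots ** 3)            # each is (approximately) 2\30 = 1.732... + 1j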

3.5 The fundamental theorem of algebra


The reason we are bringing in complex numbers is that we are interested in talking, in general terms, about the roots of the characteristic polynomial. When all we have access to are real-valued numbers, this becomes a messy and unpredictable business. A polynomial of order n can have anywhere between 0 and n roots. When we add complex numbers to our toolbelt, the whole picture becomes a lot simpler. And that is down to a result called the fundamental theorem of algebra.

The theorem has many equivalent statements, but this is the one most directly relevant to our purposes.

The fundamental theorem of algebra Any non-constant polynomial of order n has exactly n complex roots, counting multiplicities.

For now, don’t worry about what is meant by multiplicities. We’ll


dig into that later. A constant polynomial is a function like
f(x) = 3, which will not have any roots.

To prove this theorem, the first thing we need is to show that each such polynomial has one root. After that, the rest is straightforward. So straightforward (to some) that this is often seen as an alternative statement of the fundamental theorem:

The fundamental theorem of algebra (variant) Any non-constant polynomial of order n has at least one complex root.

Let

p(z) = cₙzⁿ + . . . + c₁z + c₀

be our polynomial. For our purposes, we can think of the coefficients as real-valued, but the theorem holds for complex coefficients as well. The argument z and the result p(z) can always be complex.
To find a root of p, we will consider the function |p(z)|. That
is, the magnitude of the complex number that we get out of p.
This provides the following benefits:

• The magnitude is always non-negative. That means that the lowest possible value that |p(z)| can take on is 0, at which point we must have p(z) = 0. In short, for a root of p(z), |p(z)| is both 0 and at a minimum.

• Since the magnitude of a complex number is a single real value, |p(z)| is a function from two dimensions (the complex plane) to one dimension (the reals) and we can easily visualize it in three dimensions. This is not so easy for p(z) itself, since we have a two-dimensional input and a two-dimensional output.

Our big shortcut in this proof will be to look at what the mag-
nitude does in extreme cases: for very large inputs, and for in-
puts very close to the minimum. We will see that in both cases,
the function can be approximated well by the magnitude of a
simple polynomial.
To see this, let’s start with a simple real-valued example. The
polynomial p(x) = x3 + x2 + x in the positive range. In this
112 CHAPTER 3—PROVING THE SPECTRAL THEOREM

area, p(x) is equal to its magnitude, so we don’t need to worry


about the distinction yet.

What we see here is that as x gets bigger, the term x³ dominates. Almost all the contribution to the magnitude comes from this term, and pretty soon, the simpler polynomial x³ becomes a pretty good approximation of the polynomial x³ + x² + x. This is not surprising, since the cube grows much faster than the square, which grows much faster than the identity.

Toward x = 0, where p has a minimum, the opposite happens. Just as x³ grows the fastest when x > 1, it shrinks the fastest when 0 < x < 1. In this regime the term that shrinks the slowest, x, begins to dominate, and x becomes a good approximation of the function p.
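We can watch both regimes in a few lines (a numerical sketch in Python):

    import numpy as np

    for x in [10.0, 0.1]:
        terms = np.array([x**3, x**2, x])
        print(x, terms / terms.sum())  # the share of p(x) contributed by each term
    # at x = 10,  the x^3 term contributes roughly 90% of the total
    # at x = 0.1, the x   term contributes roughly 90% of the total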
Of course, this is just one polynomial. If we move to com-
plex polynomials, and we allow for any order and all possible
coefficients, does this pattern still hold? Let’s imagine a generic
complex polynomial. In this case, all terms in the polynomial are
complex numbers, and the value of the polynomial is their sum.

p(z) = c₄z⁴ + c₃z³ + c₂z² + c₁z + c₀

[Figure: the terms c₄z⁴, c₃z³, c₂z², c₁z and c₀ drawn as arrows in the complex plane; chained together, they end at the point p(z).]

The magnitude |p(z)| of the polynomial at z is the distance from the end result to the origin. Each term contributes to this magnitude in a different direction. If we want to show that a particular term dominates, we can look at the worst case: that term points in one direction, and all other terms point in the exact opposite direction.

[Figure: the worst case: the highest-order term points in one direction, and all remaining terms point in the exact opposite direction.]

In this case, we can ignore the angles of the terms and focus only
on their magnitudes. If we assume the highest-order term points
in the opposite direction of the rest, the total magnitude is

|p(z)| = |cₙzⁿ| - |cₙ₋₁zⁿ⁻¹| - . . . - |c₁z| - |c₀|
       = |cₙ||z|ⁿ - |cₙ₋₁||z|ⁿ⁻¹ - . . . - |c₁||z| - |c₀|

Note that the terms we are subtracting are all magnitudes, so they are all positive.

We will first use this to show that |p(z)| has some definite minimum. One alternative situation would be if |p(z)| were a function that is positive everywhere and monotonically increasing in some direction, like eˣ is on the real number line. We'll need to exclude such possibilities first.
Assume that |z| > 1 for some z. If so, we make the total value of the sum smaller if we replace all lower-order powers by zⁿ⁻¹. This means that

|p(z)| > |cₙ||z|ⁿ - |c₀| - Σ_{i ∈ 1..n-1} |cᵢ||z|ⁿ⁻¹
       = |cₙ||z|ⁿ - |c₀| - |z|ⁿ⁻¹ Σ_{i ∈ 1..n-1} |cᵢ|
       = |z|ⁿ⁻¹ (|cₙ||z| - Σ_{i ∈ 1..n-1} |cᵢ|) - |c₀|
If we choose z so that its magnitude is larger than (1/|cₙ|) Σᵢ |cᵢ|, the factor in brackets becomes positive. Beyond that, we know that there is some value of |z| large enough that the first term is bigger than the second. In short, for a sufficiently large B, we can always choose a value of z such that |p(z)| is larger than B.
This means we can draw some large circle with radius R, find the smallest value of |p(z)| inside the circle, and then draw a second, larger circle so that all values of |p(z)| outside the second circle are larger than this minimum inside the first circle. This means |p(z)| has a definite minimum inside the second circle.
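This part of the argument is easy to probe numerically. The sketch below (Python with numpy; numpy.roots serves only as an independent reference) evaluates |p(z)| on a grid over a disk and confirms that the smallest value found is nearly 0, at a point near one of the roots.

    import numpy as np

    coeffs = [1, 0, -1, 2]             # p(z) = z^3 - z + 2
    x = np.linspace(-3, 3, 601)
    zs = x[None, :] + 1j * x[:, None]  # a grid covering a disk in the complex plane
    mags = np.abs(np.polyval(coeffs, zs))
    print(zs.flat[np.argmin(mags)], mags.min())  # a near-zero minimum ...
    print(np.roots(coeffs))                      # ... close to one of the roots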
Now, all we need to do is show that this minimum is actually 0, so that the point where it is attained is a root. To do that, we'll follow the same sort of argument, but with the magnitude going to 0, so that the lower-order terms dominate.

First, let z₀ be the point at which the minimum we've just shown to exist is attained. Translate p so that this minimum coincides with the origin, and call the result q(z). Specifically, q(z) = p(z + z₀).

This is another n-th order polynomial. We'll call its coefficients dᵢ. Note that q(0) has the same value as p(z₀) by construction. What we want to show is that p(z₀) = 0.
In many polynomials the lowest-order term is the first-order term d₁z. However, we need to account for cases where this term is zero. To be general, we write q as

q(z) = d₀ + dₖzᵏ + dₖ₊₁zᵏ⁺¹ + . . . + dₙzⁿ

where k is the order of the lowest-order, non-constant term.

We’ll show that the proportion of the sum contributed by the


higher order terms vanishes as we get near zero, so we can take
the simpler function q 0 (z) = c0 + ck zk as a good approximation,
that becomes perfect at the origin.

Note that q(0) = q′(0).

More formally, let's look at the ratio between the higher-order terms and the k-th-order term:

r = (|dₖ₊₁||z|ᵏ⁺¹ + . . . + |dₙ||z|ⁿ) / (|dₖ||z|ᵏ) .

Assuming that |z| < 1, the numerator is made bigger by reducing all exponents to k + 1, so

r < (|z|ᵏ⁺¹ Σ_{i ∈ k+1..n} |dᵢ|) / (|dₖ||z|ᵏ) = |z| Σᵢ|dᵢ| / |dₖ| .

The second factor is a constant, so if we want to make the contribution of the higher-order terms less than some given ε, we just need to choose a z with small enough magnitude. Specifically, |z| < ε|dₖ| / Σᵢ|dᵢ|. Fill this into the above, and note that the two fractions cancel out, leaving only ε.
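This, too, can be checked in a few lines (a sketch in Python; the coefficients dᵢ are made up, with d₁ = 0 so that k = 2):

    import numpy as np

    d = [0.5, 0.0, 2.0, -1.0, 3.0]  # d_0 .. d_4; lowest non-constant term: k = 2
    k = 2
    for mag in [1e-1, 1e-2, 1e-3]:
        z = mag * np.exp(0.7j)      # the direction does not matter
        high = sum(abs(d[i]) * abs(z) ** i for i in range(k + 1, len(d)))
        print(mag, high / (abs(d[k]) * abs(z) ** k))  # the ratio shrinks like |z|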

This allows us to continue our analysis with q′ instead of q. Next, we can show that because |q′(z)| has a minimum at 0, that minimum must be equal to 0, so that d₀ = 0.

At first, this may not be obvious. Why should a function q′(z) = dₖzᵏ + d₀ necessarily have d₀ = 0 if its magnitude has a minimum at 0? It becomes clearer if we take the image of the roots of zⁿ - 1 that we showed earlier, and create a 3d plot of the corresponding magnitude functions |zⁿ - 1|.

This kind of picture applies to the more general function |dₖzᵏ + d₀| as well: the two constants rotate the image and change how close the roots are to the origin, but unless d₀ = 0, we get a ring of roots some distance from the origin and no root at the origin. This contradicts what we know: that |q′(z)| has a minimum at 0, so we know that d₀ must be 0.
To prove this formally, let z = ᵏ√(-d₀/dₖ), one of the roots of q′, and let ε be some real value near 0. Then we have:

|q′(εz)| = |d₀ + dₖ · (-εᵏd₀/dₖ)|
         = |d₀ - εᵏd₀|

Note that both terms in this last line point in the same direction, so if d₀ ≠ 0 the resulting magnitude is smaller than |d₀|, which contradicts what we already know: that |d₀| is the minimum of |q′|. Therefore d₀ = 0, so 0 is a root of q and z₀ is a root of p.

It’s instructive to look over this proof, and try to figure out why
the same argument wouldn’t work for real-valued polynomials.
p
The answer is in the step where we chose z = k -✏d0 /dk .
This allowed us to approach the origin from one of the roots of
ck zk + c0 , and to observe that the magnitude increases. In the
real-valued world, we cannot always make this choice, because
the root may be of a negative number.

3.5.1 From one root to n roots


Now that we know that each polynomial has at least one root,
how do we get to multiple roots? In high school, we learned that
when we were faced with a (real-valued) second-degree poly-
nomial to solve, sometimes, if we were lucky, we could find its
factors. For instance, the function:

f(x) = x² - 3x + 2
can be rewritten as

f(x) = (x - 1)(x - 2) .
Now, the function is expressed as a multiplication of two factors,
and we can deduce that if x is equal to 1 or to 2, then one of the
factors is 0, so the whole multiplication is zero. Put simply, if we
can factorize our polynomial into factors of the form x - r, called
linear factors, then we know that the r’s are its roots.
This is how we’ll show that any p(z) of degree n has n com-
plex roots: we’ll factorize it into n factors of the form z-r, where
we will allow r to be complex.
To allow us to factor any polynomial into linear factors, we'll use a technique called Euclidean division, which allows us to break up polynomials into factors. The general method works for any polynomial factor, but we can keep things simple by sticking to one specific setting.

Euclidean division (simplified) Given a polynomial p(z) of degree n and a linear factor z - r, there is a polynomial q(z) of degree n - 1 and a constant d, called the remainder, such that

p(z) = (z - r)q(z) + d .

The proof is short, but a bit dense and it doesn’t add much to the
intuition we need. It’s in the appendix if you’re curious.
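numpy ships this division as numpy.polydiv, so we can watch the statement at work (a sketch; coefficients are listed from the highest order down):

    import numpy as np

    p = [1, -3, 2]                 # p(z) = z^2 - 3z + 2
    q, d = np.polydiv(p, [1, -1])  # divide by z - 1
    print(q, d)                    # quotient z - 2, remainder 0: 1 is a root

    q, d = np.polydiv(p, [1, -5])  # divide by z - 5, which is not a factor
    print(q, d)                    # quotient z + 2, remainder 12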

Now, let r be a root of p(z)—we know there must be one—and apply Euclidean division so that we get

p(z) = (z - r)q(z) + d .

Since r is a root, p(r) must be zero. At z = r the first term is zero because of the factor (z - r), so d must be zero as well. In short, if we apply Euclidean division with a root r, we get

p(z) = (z - r)q(z) .

And with that, we can just keep applying Euclidean division. First to q(z), then to the degree n - 2 polynomial resulting from that, and so on. Each time we do this, we get one more factor, and the degree of the quotient is reduced by one.
This tells us what we were looking for: every polynomial p(z) of degree n can be decomposed into a product of n linear terms (times the leading coefficient cₙ)

p(z) = cₙ(z - r₁)(z - r₂) . . . (z - rₙ)

so it must have n roots.
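The same factorization can be carried out numerically: find one root, peel off its linear factor with numpy.polydiv, and repeat on the quotient (a sketch; numpy.roots only supplies the single root at each step):

    import numpy as np

    p = [1.0, -6.0, 11.0, -6.0]          # p(z) = (z - 1)(z - 2)(z - 3)
    while len(p) > 1:
        r = np.roots(p)[0]               # one root of the current polynomial
        p, rem = np.polydiv(p, [1, -r])  # peel off the linear factor z - r
        print(r, abs(rem))               # the remainder is (numerically) zero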
What we haven’t proved yet, is whether all of these roots are
distinct. And indeed, it turns out they need not be. We can factor
any p(z) into n linear factors, but it may be the case that some
of them are the same. For instance,

p(z) = z2 - 6z + 9 = (z - 3)(z - 3)
We call these multiplicities. If we count every root by the num-
ber of factors it occurs in, then the total comes to n.

We also haven’t shown yet that an n-th degree polynomial can’t


have more than n roots. This follows from the fact that any
roots r 0 of p we don’t use, must be roots of q, since p(r 0 ) = 0 =
(r 0 - r)q(r 0 ). If we start with n + 1 distinct roots, we therefore
end up with a 0-order polynomial, a constant function, with a
root, giving us a contradiction. If we start with n + 1 roots with
3.5. THE FUNDAMENTAL THEOREM OF ALGEBRA 119

multiplicities, we can add some small noise to them make the


roots distinct, deriving a contradiction that way.

We’ll call a root real if it is a real number and complex other-


wise. Given the n roots of a particular polynomial, what can
we say about how many of them are real and how many of
them are complex?
We can have a polynomial with all roots complex and one with
all roots real, or something in between, but there is a constraint:
if our polynomial has only real-valued coefficients, then complex
roots always come in pairs. This is because if a complex number
r + ci is a root of p(z), then that same number with the complex
part subtracted instead of added, r - ci, is also a root of p(z).
The second number is called the conjugate of the first. We denote this with a horizontal bar: z̄ is the conjugate of z. Visually, the conjugate is just the reflection image in the real number line.

Why is it the case that if z is a root, then z̄ is a root too? Well, it turns out that taking the conjugate distributes over many operations. For our purposes, we can easily show that it distributes over addition and multiplication and that it commutes over integer powers.

\overline{z + w} = z̄ + w̄, since
\overline{(a + bi) + (c + di)} = (a + c) - (b + d)i = (a - bi) + (c - di) = \overline{a + bi} + \overline{c + di}

\overline{zw} = z̄ w̄, since
\overline{(m∠a)(n∠b)} = \overline{mn∠(a + b)} = mn∠(-a - b) = (m∠-a)(n∠-b) = \overline{m∠a} · \overline{n∠b}

\overline{z^n} = z̄^n, since
\overline{(m∠a)^n} = \overline{m^n∠(na)} = m^n∠(-na) = (m∠-a)^n = (\overline{m∠a})^n

If p(z) = 0, then \overline{p(z)} = 0 too, since 0 is a real value, so it's equal to its own conjugate. For the roots of a polynomial p with real coefficients c_i, this gives us:

0 = \overline{p(z)}
  = \overline{c_n z^n + ... + c_1 z + c_0}
  = c_n z̄^n + ... + c_1 z̄ + c_0
  = p(z̄) .

In short, if a complex number is a root of p(z), then its conju-


gate is too. This means that if we have a 2nd degree polynomial
(with real coefficients), we can have two real roots, or one pair
of complex roots. What we can’t have is only one real root, since
then the other would be complex by itself, and complex roots
have to come in pairs.
Similarly, if we have a 3rd degree polynomial, we must have
at least one real root, since all complex roots together must make
an even number.
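We can check this conjugate-pair behavior numerically. A minimal sketch in Python with numpy (our choice of tool, not something the text prescribes): a cubic with real coefficients yields one real root and one conjugate pair.

```python
import numpy as np

# p(z) = z^3 - 1 has real coefficients
print(np.roots([1., 0., 0., -1.]))
# [-0.5+0.866j -0.5-0.866j  1. +0.j   ]
# one real root, plus a pair of conjugate complex roots
```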
This should make sense if you think about the way real valued
polynomials behave at their extremes. A 2nd degree polynomial
either moves off to positive or to negative infinity in both direc-
tions. That means it potentially never crosses the x axis, resulting
in two complex roots, or it does cross the x axis, resulting in two
real roots. If it touches rather than crosses the x axis, we get a
multiplicity: a single real-valued root that occurs twice.

A 3rd degree polynomial always moves off to negative infinity in


one direction, and positive infinity in the other. Somewhere in

between, it has to cross the x axis, so we get one real-valued root


at least. The rest of the curve can take on a single bowl shape, so
the remaining two roots can be both real, when this bowl crosses
the horizontal axis, or both complex when it doesn’t.

Remember, this only holds if the coefficients are real-valued. If


the coefficients of the polynomial are complex, then we get

\overline{c_n z^n + ... + c_1 z + c_0} = c̄_n z̄^n + ... + c̄_1 z̄ + c̄_0

where the conjugation of the coefficients cannot be removed.

3.5.2 Back to eigenvalues


This was a long detour, so let’s restate what brought us here.
We were interested in learning more about the eigenvalues of
some square matrix A. These eigenvalues, as we saw, could be
expressed as the solutions to the following equation

|A - λI| = 0 .
We found that the determinant on the left is a polynomial in λ, so
we can use what we’ve learned about polynomials on this prob-
lem: if we allow for complex roots, then we know that the char-
acteristic polynomial of A has exactly n complex roots, counting
multiplicities. The entries of A are real values, so the polynomial
has real coefficients, and the complex roots must come in pairs.
In the last part, we said that a given square matrix had be-
tween 0 and n eigenvalues. Now, we can refine that by allowing
complex eigenvalues. An n × n matrix always has n eigenvalues,
counting multiplicities, some of which may be complex, in which
case, they come in pairs. Let’s see what all these concepts mean
in the domain of matrices and vectors.
First, let's look at a simple rotation matrix:
R = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} .
This matrix rotates points by 90 degrees counter-clockwise around
the origin. All non-zero vectors change direction under this

transformation, so previously, we would have said that R has


no eigenvalues. Now, let’s look at the roots of its characteris-
tic polynomial:

|R - λI| = \begin{vmatrix} -λ & -1 \\ 1 & -λ \end{vmatrix} = (-λ)(-λ) - (-1)(1) = λ^2 + 1 = 0

As expected, a polynomial with complex roots. In fact, a classic.


Its roots are i, and its conjugate -i.
What does this mean for the eigenvectors? Remember that
an eigenvector is the vector that doesn’t change direction when
multiplied by the matrix. There are no such vectors containing
real values, but if we allow vectors filled with complex num-
bers, there are.
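If you want to see such complex eigenvectors concretely, here is a minimal sketch in Python with numpy (a tooling assumption on our part): numpy happily returns the complex eigenvalues i and -i for R, together with complex eigenvectors.

```python
import numpy as np

R = np.array([[0., -1.],
              [1.,  0.]])            # rotation by 90 degrees

evals, evecs = np.linalg.eig(R)
print(evals)                          # [0.+1.j 0.-1.j]: the pair i, -i

# each column of evecs is a complex eigenvector: R v = lambda v
for lam, v in zip(evals, evecs.T):
    assert np.allclose(R @ v, lam * v)
```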
This can get a little confusing: a vector in R^2 is a list of two real numbers. A vector in C^2 is a list of two complex numbers. For instance:

x = \begin{pmatrix} 2 + 3i \\ 1 - 2i \end{pmatrix} .
The confusion usually stems from the fact that we’ve been imag-
ining complex numbers as 2-vectors, so now we are in danger of
confusing the two. Just remember, a complex number is a single
value. It just so happens there are ways to represent it by two
real values, which can help with our intuition. When we start
thinking about complex matrices and vectors, however, it may
hurt our intuition, so it’s best to think of complex numbers as
just that: single numbers. Complex matrices and vectors are just
the same thing we know already but with their elements taken
from C instead of R.
Linear algebra with complex matrices and vectors is a very
useful field with many applications, but here, we will only need
the basics. Addition and multiplication are well-defined for com-
plex numbers, and all basic operations of linear algebra are sim-
ply repeated multiplication and/or addition. If we write things

down symbolically, they usually look exactly the same as in the


real-valued case.
For instance, if x and y are two complex vectors, then
\begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} + \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_n \end{pmatrix} = \begin{pmatrix} x_1 + y_1 \\ x_2 + y_2 \\ \vdots \\ x_n + y_n \end{pmatrix}

where x_i + y_i represents complex addition as we defined it before.


Similarly, for a complex number z and a complex vector x:
zx = \begin{pmatrix} zx_1 \\ zx_2 \\ \vdots \\ zx_n \end{pmatrix} .
Matrix multiplication also works the same as it does in the real-valued case: the result of multiplying a complex matrix A with complex matrix B is the matrix C where C_{ij} is the sum-product of the elements of row i in A and column j in B: Σ_k A_{ik} B_{kj}.
In the real-valued case we would describe such a sum-and-
product operation as the dot product or inner product of two
vectors. But this is where we have to be careful in the complex
world. The definition of the inner product takes a little bit of care.
The problem is that if we define the inner product of vectors x and y as x^T y, as we do in the real-valued case, it doesn't behave quite as we want it to. Specifically, when we take the inner product of a vector with itself, it doesn't give us a well-behaved norm.
A norm is (roughly) an indication of the length of the vector,
and one important property is that there is only one vector that
should have norm 0, which is the zero vector.
However, a complex vector like:

x = \begin{pmatrix} i \\ 1 \end{pmatrix}

will also lead to x^T x = 0. The problem is in the transpose. When we move from the real-valued to the complex-valued world, it turns out that simply transposing a matrix doesn't always behave

analogously to how it did before. For things to keep behaving


as we expect them to, we need to replace the transpose with
the conjugate transpose.
The conjugate transpose is a very simple operation: to take the
conjugate transpose of a complex matrix, we simply replace all its
elements by their conjugates, and then transpose it. If we write
the conjugation of a matrix with an overline as we do in the scalar
case, and the conjugate transpose with a *, then we can define:

A^* = \overline{A}^T .
This may seem like a fairly arbitrary thing to do. Why should this
particular operation be so fundamental in the complex world? To
get some motivation for this, we can look at one more represen-
tation of complex numbers. We’ve seen the cartesian representa-
tion, and the polar representation, and here is one more: we can
also represent a single complex number as a 2 × 2 matrix.
Let r + ci be a complex number. We can then arrange the two
components in a 2 × 2 matrix as follows:

\begin{pmatrix} r & -c \\ c & r \end{pmatrix}
This is a single complex number represented as a real-valued
matrix. The benefit of this representation is that if we matrix-multiply the complex numbers x and y in their matrix representations, it is equivalent to multiplying the two complex numbers together: the result is another 2 × 2 matrix representing the result of the multiplication xy as a matrix.

\begin{pmatrix} a & -b \\ b & a \end{pmatrix} \begin{pmatrix} c & -d \\ d & c \end{pmatrix} = \begin{pmatrix} ac - bd & -(ad + bc) \\ ad + bc & ac - bd \end{pmatrix}
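Here is this equivalence as a small numerical sketch in Python (numpy assumed; the helper as_matrix is ours, introduced just for illustration): multiplying the 2 × 2 representations gives exactly the representation of the complex product.

```python
import numpy as np

def as_matrix(z):
    """Represent the complex number z = r + ci as a real 2x2 matrix."""
    return np.array([[z.real, -z.imag],
                     [z.imag,  z.real]])

x, y = 2 + 3j, 1 - 2j
prod = as_matrix(x) @ as_matrix(y)          # multiply in matrix form
assert np.allclose(prod, as_matrix(x * y))  # same as complex multiplication
```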

Another way to see this is to write the Cartesian coordinates in terms of the angle and magnitude: m cos(a) + m sin(a)i. If you arrange these values into a matrix, you see that the result is a rotation matrix for angle a, multiplied by a scalar m. Rotation together with uniform scaling is exactly the operation of complex multiplication.

With this perspective in hand, we can also rewrite complex ma-


trix multiplication. Start with a normal multiplication of a complex matrix A by a complex matrix B. Now replace each element A_{ij} in both with a 2 × 2 matrix of real values, representing the complex number A_{ij} as described above. Then, concatenate these back into a matrix A_R, which is twice as tall and twice as wide as A, and filled with only real values. Do the same for B.
Multiplying A_R and B_R together performs exactly the same operation as multiplying A and B together, except that the result is also in this 2 × 2 representation. This way, we can transform a complex matrix multiplication into a real-valued matrix multiplication.
This shows us the motivation for the complex conjugate. Compare the number r + ci in its 2 × 2 representation to its conjugate r - ci:

\begin{pmatrix} r & -c \\ c & r \end{pmatrix} \qquad \begin{pmatrix} r & c \\ -c & r \end{pmatrix} .

They are transposes of each other. That means that if we take a complex matrix like A, transform it to the 2 × 2 representation A_R and then transpose it, the result A_R^T, interpreted as a 2 × 2 representation of a complex matrix, is not the transpose of A but the conjugate transpose.
The conjugate transpose will be important for what’s coming
up, so let’s look at a few of its properties.
First, note that if A contains only real values, the conjugate
transpose reduces to the regular transpose: real values are un-
changed by conjugation, so the conjugation step doesn’t change
A and only the transpose remains.
Second, note that the conjugate transpose distributes over multiplication the same way the transpose does: (AB)^* = B^* A^*. This is because the conjugation distributes over the sums and multiplications inside the matrix multiplication, so that we get

\overline{AB} = \overline{A}\,\overline{B} .
With the conjugate transpose, we can also define a dot product
that will give us a proper norm. By analogy with the real-valued

dot product written as x^T y, we define the dot product of complex vectors x and y as

x · y = y^* x = Σ_i x_i ȳ_i .

Note that this is not a symmetric function anymore: it matters


for which of the two vectors we take the conjugate transpose. By
convention, it’s the second argument of the dot product.
This suggests a natural norm for complex vectors. In the real-valued case, the norm is the square root of the vector's dot product with itself: ‖x‖ = √(x · x). The same holds here.
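A quick sketch of why the conjugate matters, in Python with numpy (assumed for illustration). Note that numpy's vdot conjugates its first argument rather than the second; for the dot product of a vector with itself the result is the same.

```python
import numpy as np

x = np.array([1j, 1.0])

print(x @ x)              # x^T x = i*i + 1*1 = 0: no good as a norm
print(np.vdot(x, x))      # with conjugation: |i|^2 + |1|^2 = 2
print(np.linalg.norm(x))  # sqrt(2), the norm this dot product induces
```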
This is a little more abstract and hard to visualize than the dot
product in the real-valued case. We’ll just have to accept for now
that the math works out. We’ll need to carry over the following
properties of norms and dot products from the real-valued case:

1. A vector with norm 1 is called a unit vector.

2. Two vectors whose dot product is 0 are called orthogonal.


In this case it doesn’t matter in which order we take the dot
product: if it’s zero one way around, it’s also zero the other
way around. This is easiest to see in the 2 × 2 representation
of the dot product.

3. A matrix U whose column vectors are all unit vectors, and


all mutually orthogonal, is called unitary. This is the complex analogue of the orthogonal matrix we introduced in Chapter 2. Just like we had U^T U = I and U^{-1} = U^T for orthogonal matrices, we have U^* U = I and U^{-1} = U^* for unitary matrices.

4. The standard basis for R^n, the columns of I, serves as a basis for C^n as well. For I to be a basis, we should be able to construct any complex vector z as a linear combination of the basis vectors. Here we can simply say z = z_1 e_1 + ... + z_n e_n, where the e_i are the columns of I.

With these properties in place, we can return to the question of


eigenvalues and eigenvectors.

Let’s go back to our example. Here is the operation of our rotation


matrix:
\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} z_1 \\ z_2 \end{pmatrix} = \begin{pmatrix} -z_2 \\ z_1 \end{pmatrix} .
An eigenvector of this matrix is one for which this operation is
the same as multiplying by the eigenvalue i (or -i):

z_1 i = -z_2
z_2 i = z_1 .

Remember that z1 and z2 are both complex numbers. We know


already that on individual complex numbers, multiplying by i has
the effect of rotating in the complex plane by 90 degrees counter-
clockwise. That means that we're looking for a pair of complex numbers such that rotating them this way turns the first into the negative of the second, and the second into the first. This is true for any pair of complex numbers of equal magnitude with a 90-degree angle between them. For instance,

\begin{pmatrix} 1 \\ i \end{pmatrix}

is an eigenvector (with eigenvalue -i).
In the real-valued case, a given eigenvector could be multi-
plied by a scalar and it would still be an eigenvector. The same
is true here as well. If we multiply the eigenvector above by a
complex scalar s = m\a, this multiplication rotates both com-
plex numbers in the vector above by the same angle a, so the
angle between them stays 90 degrees.
This allows us to scale the eigenvector so that its norm be-
comes 1, giving us a unit eigenvector.

3.6 The spectral theorem


We are finally ready to begin our attack on the spectral theo-
rem. The structure of the proof is as follows. We will first de-

fine a slightly different decomposition of a matrix, called the


Schur decomposition.
We first show that any square matrix, complex or real, can
be Schur-decomposed. Then, we show that the Schur decom-
position of a symmetric real-valued matrix coincides with the
eigendecomposition.

3.6.1 The Schur decomposition


Let A be any complex-valued, n × n matrix. The Schur decomposition rewrites A as the following product: A = U T U^*, where U is a unitary matrix, and T is an upper triangular matrix (i.e. a matrix with non-zero values only on or above the diagonal). Compare this to the eigendecomposition A = P D P^T, where P is orthogonal, and D is diagonal.
Unlike the eigendecomposition, however, we can show that
the Schur decomposition exists for any square matrix.

This is a proof by induction. If you’ve never seen that before,


it may look a little confusing.
The idea is that we state our problem in terms of some value n, for instance the size of the matrix (n × n) we're dealing with. We prove the specific case n = 1 first, and then we show that if the result holds for n - 1, it also holds for n. Combining the two shows that the result must hold for all n. If you're struggling with this, try following the inductive step with n = 2 first, and then again with n = 3.

Schur decomposition. Any n × n complex matrix A has a Schur decomposition A = U T U^*, where U is unitary, and T is upper triangular.

Proof. We will prove this by induction on n.

Base case. First, assume n = 1. That is, let A be a 1 × 1 matrix with value a. Then, the Schur decomposition reduces to simple scalar multiplication, A = u a u^*, which is true for u = 1 and a = A_{11}.

Induction step. Now we assume that the theorem holds for n-


1, from which we will prove that it also holds for n.
We know that A has n eigenvalues, counting multiplicities and allowing complex values. Let λ be one of these, and let u be a corresponding unit eigenvector.
Let W be a matrix with u as its first column, and as its remaining columns unit vectors orthogonal to u and to each other. This makes W a unitary matrix.

In R^n we know that sufficient orthogonal vectors are always available. In the appendix, we prove that this property carries over to C^n.

Now consider the matrix W^* A W. As illustrated below, the first column of AW is Au, which is equal to λu since u is an eigenvector. This means that (W^* A W)_{11} is equal to u^* λu = λ. The other elements in the first column are equal to the dot product of a scaled u and another column of W. Since the columns of W are mutually orthogonal, these are all 0.

[Figure: constructing W^* A W. The first column of AW is Au = λu, so the first column of W^* A W is (λ, 0, ..., 0)^T; the rest of its first row is arbitrary, and its bottom-right (n - 1) × (n - 1) block is labeled A′.]

Call this matrix R (note that it is not triangular yet). So far we have shown that W^* A W = R or, multiplying on the left by W and on the right by W^*, that A = W R W^*.
Call R's bottom-right block A′, as indicated in the image. A′ is an (n - 1) × (n - 1) matrix, so by the assumption made above, it can be factorized: A′ = V Q V^*, with V unitary and Q upper triangular.
Now note that if we extend the matrix V to an n × n matrix as follows

V′ = \begin{pmatrix} 1 & 0 \cdots 0 \\ 0 & \\ \vdots & V \\ 0 & \end{pmatrix}

we can move it out of the submatrix A′, so that

A = W V′ \begin{pmatrix} λ & * \cdots * \\ 0 & \\ \vdots & Q \\ 0 & \end{pmatrix} V′^* W^*

with the *'s representing arbitrary values.
We call the matrix in the middle T. Note that this is now upper triangular. Note also that V′ is unitary, since the column we added is a unit vector, and it's orthogonal to all other columns.
Let U = W V′. Since W and V′ are unitary, their product is as well, and with that we have

A = U T U^*

proving the theorem.

The important thing about the Schur decomposition is that it al-


ways works. No matter what kind of square matrix we feed it,
real or complex valued, with or without real eigenvalues, sym-
metric or asymmetric, singular or invertible, we always get a
Schur decomposition out of it.
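This universality is easy to probe numerically. A minimal sketch using scipy (an assumption of ours; scipy.linalg.schur returns T and Z such that A = Z T Z^*): feed it an arbitrary, asymmetric real matrix and it still succeeds.

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))          # arbitrary: not symmetric

T, Z = schur(A, output='complex')    # A = Z T Z*, Z unitary, T triangular
assert np.allclose(A, Z @ T @ Z.conj().T)
assert np.allclose(Z.conj().T @ Z, np.eye(4))  # Z is unitary
assert np.allclose(T, np.triu(T))              # T is upper triangular
```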
With this result in hand, the main difficulty of proving the

spectral theorem is solved. We simply need to look at how the


Schur decomposition behaves if we feed it a real-valued symmetric matrix.

3.6.2 Proof of the spectral theorem

The spectral theorem A matrix is orthogonally diagonalizable


if and only if it is symmetric.

Proof. We will prove the two directions separately.

(1) If a real-valued matrix A is orth. diagonalizable, it must


be symmetric. Note that in an orthogonal diagonalization we
have D = D^T because D is diagonal. Thus, if A is orthogonally diagonalizable, we know that

A = P D P^T = P D^T P^T = (P D P^T)^T = A^T
which implies that A is symmetric.

(2) If a real-valued matrix A is symmetric, it must be orth.


diagonalizable. For this direction, we need the Schur decom-
position. By assumption A is symmetric and real-valued, so that A^* = A^T = A. Let A = U T U^* be its Schur decomposition.

Note that we have assumed that A is real-valued, but U and T


could still contain complex values.

From the symmetry of A, we have U T U^* = (U T U^*)^* = U T^* U^*, so T = T^*. This tells us two things. First, that all values off the diagonal are zero (remember that T is upper triangular), so T is actually diagonal. Second, that the values on the diagonal are equal to their own conjugate, so they must be real values.
This gives us a real-valued diagonalization A = U T U^*. But
what about the matrix U? Could that still contain complex val-
ues? It could, but knowing that A is real-valued, and so are the
diagonal values of T , we can perform the Schur decomposition
specifically so that U ends up real valued as well.

Follow the construction of the Schur decomposition. In the base


case, U is real-valued by definition. In the inductive step, assume that we can choose V real-valued for the case n - 1. When we choose the eigenvector for λ, we choose a real eigenvector. These always exist for real eigenvalues (proof in the appendix). When we choose the other columns of W, we choose real-valued vectors as well.
This means that W and V′ are real, so their product U is real
as well.
That completes the proof: if we perform the Schur decompo-
sition in such a way that we choose real vectors for W where
possible, the decomposition of a symmetric matrix gives us a di-
agonal real-valued matrix T and an orthogonal matrix U.
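In code, this is exactly what a symmetric eigensolver gives back. A minimal sketch in Python with numpy (assumed for illustration): for a real symmetric matrix, np.linalg.eigh returns real eigenvalues and an orthogonal matrix of eigenvectors.

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(4, 4))
A = B + B.T                                      # real and symmetric

evals, P = np.linalg.eigh(A)                     # all eigenvalues come out real
assert np.allclose(P @ P.T, np.eye(4))           # P is orthogonal
assert np.allclose(A, P @ np.diag(evals) @ P.T)  # A = P D P^T
```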

It’s been a long road, but we have finally reached the end. It’s
worth looking back at all the preliminaries we discussed, and
briefly seeing why exactly they were necessary to show this re-
sult. Let’s retrace our steps in reverse order.
The last thing we discussed, before the proof of the spectral
theorem was the Schur decomposition. Its usefulness was clear:
the Schur decomposition is the eigendecomposition, if we’re care-
ful about its construction. The main benefit of the Schur decom-
position is that it always works. With the real-valued eigende-
composition, we knew that it sometimes exists and sometimes
doesn’t. From that perspective it’s very difficult to characterize
the set of matrices for which it exists. The Schur decomposi-
tion allowed us to zoom out to the set of all matrices, so that
we could ask what the decomposition looks like for real-valued,
symmetric matrices.
The complex numbers make this possible. Filling matrices
and vectors with complex numbers gives us a Schur decomposi-
tion that always works. The key to this is that the construction of
the Schur decomposition requires us to pick one eigenvalue and
corresponding eigenvector for various matrices. If we allow for
complex eigenvalues, we ensure that this is always possible.
This result, that every n × n matrix has n eigenvalues if complex values are allowed, follows from two ideas. The first is the characteristic polynomial. This is an n-th order polynomial,
constructed from an n ⇥ n matrix A, that is zero, exactly when

the determinant of A - λI is zero. This means that the roots


of the characteristic polynomial are the eigenvalues. The second
idea is the fundamental theorem of algebra which tells us that
every n-th order polynomial has exactly n roots in the complex
plane, counting multiplicities.
That gives us the spectral theorem and, as we saw in the last
part, the spectral theorem gives us principal component analysis.
Now that we know how PCA works, why it works, and we
have a thorough understanding of the spectral theorem, there is
only one question left: how do we implement it? There are a
couple of ways to go about this, but the best option by far is to
make use of the singular value decomposition (SVD). This is a
very versatile operation, which has many uses beyond the imple-
mentation of PCA. We will dig into the SVD in the next chapter.
Chapter 4 · The singular value decomposition

In the previous chapters, we learned that principal components


are eigenvectors. Specifically, they are the eigenvectors of the
covariance matrix S of our data X.
In this chapter, we’ll develop a slightly different perspective:
that the principal components are singular vectors. Not of the
covariance matrix S, but of the data matrix X itself. Singular
vectors, which we will define below, are closely related to eigen-
vectors, but unlike eigenvectors they are defined for all matrices,
even non-square ones.
The two perspectives are complementary: they are simply dif-
ferent ways of looking at the same thing. The benefit we get
from the singular vectors is that we can develop the singular
value decomposition (SVD).
The SVD is, firstly, the most popular and robust way of computing a principal component analysis. But it is also something
of a Swiss army knife of linear algebra. It allows us to compute
linear projections, linear regressions, and many other things.
Put simply, if you want to use linear algebra effectively in data
science or machine learning, the singular value decomposition is
the beginning and the end.
So, a topic well worth a chapter. We’ll focus on the PCA use
case first, since that’s what brought us here, but like the eigende-
composition, the rabbit hole goes much deeper, and we’ll finish
by looking at some of the other uses of the SVD. We will look
in particular detail at the concepts of matrix rank, the pseudo-
inverse and the Eckart-Young-Mirsky theorem.

4.1 Eigenvalues and singular values


In the previous two chapters we explained eigenvalues and eigen-
vectors in detail. As we saw, these are well-defined and pre-

dictable for square, symmetric matrices. What happens if our


matrix is not so well-behaved? What if it’s not symmetric, not
invertible, or not even square? Do some of the ideas that under-
lie eigenvalues and eigenvectors still carry over?
The best place to start is the intuition we built at the end
of Chapter 2, when we discussed normalization. Let’s review
what we said there.
Let’s say that we have a dataset consisting of a set of instances
x. We assume that there is some unobserved data z, which is
standard-normally distributed, and that a linear transformation
x = Az + t has transformed it into the data we observed.

[Figure: the sampling process x = Az + t turns imagined standard-normal data z into the observed data x; normalization inverts it: z = A^{-1}(x - t).]

Summary of normalization. We imagine that some linear process has transformed standard-normally distributed data into the data we observed, and we invert that transformation. The sphere of all unit vectors in the “original” space is transformed into an ellipse in the space of the data we observed.

What we saw in Chapter 2, is that if we can find A from the


observed data, then the way in which A stretches space gives us
the principal components of the data.
More specifically, imagine taking the (hyper)sphere of all unit
vectors, and transforming them by A. The result is an ellipsoid,
and the major axis of this ellipsoid—the direction in which it
bulges out the most—is the principal component of the data.

We'll start with that intuition, that the direction in which space is stretched the most by a transformation is important, and see what it gets us when we apply it to general matrices.
Note that this is not an eigenvector of A (it's an eigenvector of the covariance S = A^T A, but we'll get to that later). An eigenvector of A is a vector whose direction doesn't change when multiplied by A. This is different, in general, from the direction in which the transformation has the greatest effect.

[Figure: the unit circle mapped through y = Ax, showing the most-stretched vector alongside the eigenvectors of A.]

The vector that is stretched the most by the transformation is


different from the eigenvectors, which are vectors that don’t
change direction under the transformation.

If you are wondering why the eigenvectors in the image above


aren't orthogonal to each other, it's because the transformation
matrix isn’t symmetric, so the spectral theorem doesn’t apply. It
has two real eigenvalues, but not with orthogonal eigenvectors.

We can now define the rest of the axes of our resulting ellipsoid
in the same way we did with the principal components: for the
second axis, we constrain it to be orthogonal to the first, and
then see in which direction A causes the greatest stretch. For
the third, we constrain it to be orthogonal to the first and sec-
ond, and so on.

In this fashion, we get a series of vectors of “greatest stretch”


that are all orthogonal to each other.
The reason we are revisiting these intuitions is that they still
hold if A isn’t square. If we allow any matrix, including non-
square ones, we can still carry this particular intuition over from
the principal components. Let M be any matrix, with dimensions
n × m. The multiplication

y = Mx
maps a vector x ∈ R^m to a vector y ∈ R^n. If we constrain x to be a unit vector, then the possible inputs form a (hyper)sphere in R^m. This sphere is mapped to an ellipsoid in R^n. We don't need
to know all the details about what ellipsoids are, just the basics:
they are linear transformations of spheres, and the axes of the
ellipsoid are the directions where it bulges the most (with the
second axis the biggest bulge orthogonal to the first, and so on).
For some matrices, for instance singular square ones, the re-
sulting ellipsoid is not fully n-dimensional, but of some lower
dimensionality. For instance, if m = n = 3, there are matrices
that produce a 2d ellipse, or even a 1d ellipsoid (a line segment).
The matrix could even compress everything into a single point,
which we’ll call a 0d ellipsoid.

[Figure: a 3D sphere mapped to a 3D ellipsoid (invertible), and down to 2D, 1D and 0D ellipsoids (non-invertible).]

The same is true for matrices where n is bigger than m, for instance a 3 × 2 matrix. No linear transformation will turn a 2d sphere into a 3d ellipsoid, so the ellipsoid we get from a 3 × 2 matrix must be at most 2 dimensional.

We can put this more precisely when we’ve discussed matrix


rank.

Nevertheless, the output is always an ellipsoid of some dimen-


sion, so our intuition carries over. We can always ask in which
direction the resulting ellipsoid bulges out the most. It’s just that
after a few directions, the remainder may all be compressed to 0.
For now, let’s see what we can say about the directions that aren’t.
So, the question is: which unit vector is stretched the most by our matrix M? We can state this as an optimization problem:

argmax_x ‖Mx‖
such that ‖x‖ = 1 .
This problem simply asks for the input x for which the result-
ing vector Mx has maximal magnitude, subject to the constraint
that x is a unit vector.
To find a solution to this problem, we can rewrite both norms
as dot products. In the constraint, we know that setting the norm equal to 1 is the same as setting the dot product x^T x equal to 1. In the optimization objective, the norm and the dot product aren't the same value (one is the square root of the other), but they are maximized at the same x. So, we get the equivalent problem:

argmax_x x^T M^T M x
such that x^T x = 1 .
Something quite exciting has happened here. By one simple
rewriting step, the maximum of the linear function Mx has become the maximum of the quadratic function x^T S x, with S = M^T M.
This is similar to what we derived in Chapter 2: there, we
started with a (square) transformation matrix A, which took
our imagined standard-normally distributed data into the form
we observed. We showed that A could be derived from the co-
variance matrix n1 XT X by the relation AAT = n1 XT X, and that
optimizing for the direction of maximum stretch under A corre-
sponds to the direction in which the quadratic form xT AAT x
is maximized.
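The connection between maximal stretch and the quadratic form can be checked empirically. A sketch in Python with numpy (our tooling assumption): the top eigenvector of M^T M, which maximizes the quadratic form, is stretched at least as much as any random unit vector.

```python
import numpy as np

M = np.array([[2., 1.],
              [1., 3.],
              [0., 1.]])                       # a non-square matrix

# candidate direction of greatest stretch: top eigenvector of M^T M
evals, evecs = np.linalg.eigh(M.T @ M)
v = evecs[:, -1]                               # eigenvector of largest eigenvalue

# no random unit vector is stretched more
rng = np.random.default_rng(1)
xs = rng.normal(size=(1000, 2))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
stretches = np.linalg.norm(M @ xs.T, axis=0)
assert np.linalg.norm(M @ v) >= stretches.max() - 1e-12
```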

Most importantly, we saw there that maximizing the value x^T S x gives us the first eigenvector of S. What we are seeing here is that the same thing holds for any matrix M. Optimizing for maximum stretch gives us a direction that is equal to the first eigenvector v of M^T M. Using this we can work out the relation between the amount that v is stretched in the multiplication Mv, and the corresponding eigenvalue λ of M^T M:

‖Mv‖^2 = v^T M^T M v = v^T λv = λ v^T v = λ .

That is, if v is an eigenvector of M^T M with eigenvalue λ, then multiplying v by M stretches v by √λ. Moreover, even though M may be non-square or singular, we know that M^T M is always symmetric, which tells us that it has exactly m real eigenvalues, counting multiplicities. The square roots of these eigenvalues indicate how much M stretches space along the various axes of the ellipsoid we get if we transform the unit vectors by it.
We call these values the singular values of M. For each singular value σ, we have used two vectors in its definition: the unit vector v ∈ R^m which we multiplied by M, and the vector that resulted from the multiplication. The latter has length σ, so we can represent it as σu, where u ∈ R^n is also a unit vector. With this, we can make our definition of the singular vectors similar to that of the eigenvectors: if σ is a singular value of M, then for its two singular vectors v and u

Mv = σu .

We call v a right singular vector of σ, and u a left singular vector.

We currently have the right singular vectors on the left and vice
versa, which I admit is confusing, but they will change place as
we develop a more complete definition.
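To check the relation Mv = σu numerically, here is a short sketch in Python with numpy (assumed): the singular values of M match the square roots of the eigenvalues of M^T M, and each singular pair satisfies the defining equation.

```python
import numpy as np

M = np.array([[2., 1., 0.],
              [1., 3., 1.]])                   # a 2x3 matrix

U, s, Vt = np.linalg.svd(M)

# singular values = square roots of the (top) eigenvalues of M^T M
evals = np.linalg.eigvalsh(M.T @ M)            # ascending order
assert np.allclose(np.sqrt(evals[::-1][:2]), s)

# the defining relation M v = sigma * u for each singular pair
for i in range(len(s)):
    assert np.allclose(M @ Vt[i], s[i] * U[:, i])
```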

You may ask what happens if an eigenvalue of M^T M is negative. We know that all eigenvalues are real, but we don't know

that they are all positive. And we’ve defined the corresponding
singular values as their square roots.
So, does the corresponding singular value of M become undefined, or complex? The answer is that we can prove that this doesn't happen. Note that if we pass a unit eigenvector v of A into its quadratic v^T A v, the result is its eigenvalue: v^T A v = v^T λv = λ v^T v = λ. Now, let y = Mx. Then if x is a unit eigenvector of M^T M, the quadratic x^T M^T M x = y^T y is the corresponding eigenvalue. This is simply the sum of the squared elements of y. Whether these are positive or negative, the result is always nonnegative, which shows that an eigenvalue of M^T M will always be nonnegative.

In technical terms, we have just shown that M^T M is positive semidefinite.

So, we can be sure that there are always m nonnegative singular


values for M, even though some of them may be zero.

4.1.1 Singular vectors and principal components


Before we develop the singular values and vectors further, let’s
see what their relevance is in the context of principal component
analysis. In PCA, we start with a data matrix X, of size n × m, which arranges the n instances in our data along the rows, and the m features of each along the columns.
We can ask what the singular values of X are. You may be able
to predict the answer, but first let's look at how this squares with our intuition of singular values. We said that they represent the maximal amount by which multiplying a vector by X stretches that vector. What does it mean to multiply a vector by the data matrix?
Let's take a vector p ∈ R^m (where m is the number of features
in our data), and let’s assume p is a unit vector. That way, we
are limited in how big we can make each entry. Now multiply
y = Xp. We get a vector of length n: one value for each instance
x in our data, which is the dot product of that instance and our
vector p. This value is high if the features of the instance match
the corresponding values of p in magnitude and sign: they should
be big where p is big and negative where p is negative.

That is, the dot product expresses a kind of similarity between p


and the instances in our data. The elements of y tell us which in-
stances in the data “match” p the best. This means that the vector
p that is stretched the most by X, is the vector that is most similar
to all instances. This is a tradeoff: making it more similar to one
instance may make it less similar to other instances. Balancing
this over all instances, we get the direction that maximizes kyk:
the first singular vector of X.
If we assume that the data is mean-centered, we can imagine
this direction pivoting around the origin, averaging the angles
to all instances, proportional to how far from the origin each
instance is. This should remind you of how we plotted the first
principal component earlier.

From Chapter 1: the first PC for a simple dataset.

It’s not hard to prove that this direction of maximal stretch is in-
deed the first principal component. As we’ve seen the maximal di-
rection is an eigenvector of the matrix X^T X. What we also saw, in Chapter 2, is that the covariance matrix is estimated with (1/n) X^T X.
The constant factor 1/n doesn't really matter much for our purposes. If we diagonalize X^T X as P^T D P, then we can write P^T (1/n)D P for the properly normalized covariance. In other words, X^T X and (1/n) X^T X have the same eigenvectors, and their eigenvalues only differ by a multiplicative constant 1/n.
Thus, the directions corresponding to the singular values of X
are the same as the eigenvectors of the covariance matrix. We’ve
already established that the singular values of X are the square roots of the corresponding eigenvalues of X^T X, so we can say

that their squares are proportional to the eigenvalues of S.
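The agreement between the two routes is easy to verify. A minimal sketch in Python with numpy (an assumption on our part; the data is synthetic): the top right singular vector of the centered data matrix matches the top eigenvector of the covariance matrix, up to sign.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ np.array([[2., 0., 0.],
                                          [1., 1., 0.],
                                          [0., 0., .2]])
X -= X.mean(axis=0)                            # mean-center the data

# route 1: eigenvectors of the covariance matrix
evals, P = np.linalg.eigh(X.T @ X / len(X))    # ascending eigenvalues
# route 2: singular vectors of the data matrix itself
U, s, Vt = np.linalg.svd(X, full_matrices=False)

assert np.allclose(np.abs(Vt[0]), np.abs(P[:, -1]))  # same direction, up to sign
assert np.allclose(s**2 / len(X), evals[::-1])       # eigenvalues = s^2 / n
```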


We can now show that the singular values and eigenvalues are
in some sense generalizations of standard deviation and variance.
To illustrate, assume that our data X is one-dimensional and mean-centered. In this case, we could estimate the variance with the formula

var X = (1/n) Σ_i X_{i1}^2

which we can also write as

(1/n) X^T X = X′^T X′

with X′ = (1/√n) X.
In this case X′^T X′ has one eigenvalue, which is simply the scalar value X′^T X′. As shown above, this corresponds to the estimated variance of the data. The corresponding singular value of X′ is its square root: the standard deviation.
In short, if we ignore a scaling factor of 1/√n, the singular
values of X are analogous to the standard deviation, and the
eigenvalues of the covariance matrix X^T X are analogous to the
variance.
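Here is that analogy in a few lines of numpy (a sketch; the scale
of 3.0 is arbitrary): the single singular value of X′ = (1/√n) X is
the standard deviation of the data.

    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.normal(scale=3.0, size=(1000, 1))    # one-dimensional data
    x -= x.mean()                                # mean-center

    sigma = np.linalg.svd(x / np.sqrt(len(x)), compute_uv=False)[0]
    print(sigma, x.std())                        # equal, both close to 3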

4.1.2 The singular value decomposition


When we developed eigenvalues and eigenvectors, we saw that
they allowed us to decompose square matrices as the product of
three simpler matrices: A = PDP^T. We can do the same thing
with singular values and vectors.
We’ve seen that any matrix has singular values, which correspond
to two kinds of singular vectors, defined in a way that is
similar to the way eigenvectors are defined. Here, we have the
eigenvalue definition on the left and the singular value
definition on the right:

Aw = λw        Mv = σu.

From the definition of the eigenvalue and vector, we managed to


construct a decomposition of A.
A straightforward way to do this is the following recipe.


1. Assume that the n × n matrix A has n real eigenvalues λ_i,
   with unit eigenvectors w_i.
2. Arrange the eigenvectors as columns of a matrix so that we
   get A[w_1 ... w_n] = [w_1 λ_1 ... w_n λ_n].
3. Let P = [w_1 ... w_n] and D = diag(λ_1, ..., λ_n) so that
   AP = PD.
4. Note that P is orthogonal, so that A = PDP^T.
The fact that the eigenvectors are orthogonal comes from the
iterative way in which we chose them: each was chosen to be
orthogonal to all the previous choices. We did the same for the
singular vectors. The assumption that we have n real eigenvalues
comes from the spectral theorem.
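In code, the whole recipe takes a few lines. A minimal numpy sketch
for a small symmetric matrix (eigh returns the eigenvalues in
ascending order, with orthonormal eigenvectors as columns):

    import numpy as np

    rng = np.random.default_rng(3)
    B = rng.normal(size=(4, 4))
    A = B + B.T                          # any symmetric matrix

    lams, P = np.linalg.eigh(A)          # real eigenvalues, unit eigenvectors
    D = np.diag(lams)                    # D = diag(lambda_1, ..., lambda_n)

    print(np.allclose(A @ P, P @ D))     # AP = PD
    print(np.allclose(A, P @ D @ P.T))   # A = PDP^T, since P is orthogonal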
One thing we haven’t examined yet in any detail, is what happens
if some of our eigenvalues are zero. This can happen if we have a
square, symmetric matrix which is non-invertible, like

(1 0)
(0 0).

In such cases, we get one or more 0 eigenvalues. What are the
corresponding eigenvectors? In many ways it doesn’t matter: with
λ = 0, the defining relation becomes Aw = 0, which holds for any
w in the null space of A, and for a symmetric matrix the null
space is exactly the space orthogonal to the eigenvectors we have
already chosen. To make the above recipe work, all we need to do
is to choose w so that it is a unit vector, and orthogonal to any
w we already chose.
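The matrix above makes a nice minimal test of this (a sketch, not
from the book): numpy returns an orthonormal basis even though one
eigenvalue is zero, and the recipe goes through unchanged.

    import numpy as np

    A = np.diag([1.0, 0.0])              # symmetric and non-invertible
    lams, P = np.linalg.eigh(A)
    print(lams)                          # [0. 1.]: one eigenvalue is zero
    print(np.allclose(P.T @ P, np.eye(2)))           # P is still orthogonal
    print(np.allclose(A, P @ np.diag(lams) @ P.T))   # A = PDP^T still holds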
If we try to follow the same recipe with the singular vectors,
we might get something like this.

1. Assume that the n × m matrix M has k singular values σ_i,
   with unit left and right singular vectors u_i ∈ R^n and
   v_i ∈ R^m.
2. Arrange the singular vectors as columns of matrices so that
   we get M[v_1 ... v_k] = [u_1 σ_1 ... u_k σ_k].
3. Let V = [v_1 ... v_k], U = [u_1 ... u_k] and
   Σ = diag(σ_1, ..., σ_k) so that MV = UΣ.
This is as far as we get. There aren’t necessarily sufficient
singular values to ensure that V is square. That means V is not
invertible, so we can’t take it to the other side of the equation.

We have V^T V = I, but to take V to the other side, we’d need
VV^T = I, which we only get if V is square.

A simple solution is to extend V to a complete basis. We know
that the k singular vectors making up V are mutually orthogonal,
and we know that we can always choose m − k additional unit
vectors orthogonal to these and to each other. If we add them to
V as columns, V becomes an m × m orthonormal matrix.
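One practical way to find such extra vectors (my sketch, not the
book’s construction) is a QR decomposition: pad the k orthonormal
columns with arbitrary vectors and orthogonalize. Because the first
k columns are already orthonormal, QR keeps them, up to sign, and
fills in the rest.

    import numpy as np

    rng = np.random.default_rng(5)
    Vk = np.linalg.qr(rng.normal(size=(5, 3)))[0]  # 3 orthonormal columns in R^5

    # Append arbitrary columns, then orthogonalize the whole matrix.
    Q, _ = np.linalg.qr(np.hstack([Vk, rng.normal(size=(5, 2))]))
    print(np.allclose(np.abs(Q[:, :3]), np.abs(Vk)))  # originals kept (up to sign)
    print(np.allclose(Q.T @ Q, np.eye(5)))            # a full orthonormal basis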
If we add one such vector v′ to V, what do we need to change
to keep our equation M[v_1 ... v_k] = [u_1 σ_1 ... u_k σ_k] intact?
We need to extend the right-hand side with the vector Mv′. What
can we say about this vector? First, we can always write any vector
as a unit vector times a scalar length, so let’s call it u′σ′.

M[v_1 ... v_k v′] = [u_1 σ_1 ... u_k σ_k u′σ′]

Note that we assumed that we have all k singular values of M
already. This means that there is no vector v orthogonal to all v_i
such that ‖Mv‖ > 0, or it would be a candidate for the (k+1)th
singular vector. Therefore, ‖Mv′‖ = 0.

v′ is in the null space of M.

This means that we can set σ′ = 0 and choose u′ however we
like. We will choose some vector that is orthogonal to all u_i
already chosen.

Note the similarity to the eigenvector case: once we’ve run out
of nonzero singular values we extend the rest of the diagonal
with zeros.

We could just keep adding vectors like this until V is square,
and it would be orthogonal, so we would be allowed to move
it to the other side. But we’d be missing a trick if we stopped
there. The matrix M represents a function from one space, R^m,
to another, R^n. If we make V square and orthogonal, we have
constructed an orthonormal basis for R^m, just like the analogous
P is a basis in the eigendecomposition. It would be quite nice if
we could ensure that U is also an orthonormal basis for R^n. That
way, we would have two orthonormal bases U and V for the
two spaces that M transforms between, and a diagonal matrix Σ
of singular values in the middle.
To do this, we need to add different numbers of orthogonal
vectors to V and U. V needs m − k extra vectors to become
m × m, and U needs n − k extra vectors to become n × n. The
trick to accomplish this is to make Σ non-square. We make Σ
an n × m matrix, with all zeros except for the top-left k × k
block, which contains the singular values on its diagonal. Here
is an illustration.
[Figure: diagram of the equation MV = UΣ, with the block sizes
k, m and n marked. White cells represent 0s. The light columns
of V and U are the vectors we add to make these matrices
square.]

We see that the effect of extending V with orthogonal vectors is
to add zero columns to the product MV. This is not surprising,
since we noted already that the added vectors must be in the null
space of M.
On the right, the product UΣ needs to be equal to MV. The extra
zero columns can easily be added by adding zero columns to Σ.
How about the extra basis vectors we would like to add to
U? We need to add rows to Σ in order to make the matrix
dimensions match. If the rows we add are zero rows, then we
can be sure that when we multiply the two matrices, any entry
in any column added to U will be multiplied by 0, so it won’t
affect the product UΣ.
Here is the process in three steps:

[Figure: the equation MV = UΣ built up in three steps. First, set
up the singular values and vectors in a matrix multiplication.
Then, add orthogonal vectors to V to make it square, and add zero
columns to Σ to make the sizes match. Finally, add orthogonal
vectors to U, and add zero rows to Σ.]
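To see these shapes concretely, here is a short numpy sketch (not
from the book): svd with full_matrices=True returns the square
bases U and V directly, and we build the n × m matrix Σ ourselves
by placing the singular values in its top-left block.

    import numpy as np

    rng = np.random.default_rng(4)
    M = rng.normal(size=(6, 4))                   # n = 6, m = 4

    U, s, Vt = np.linalg.svd(M, full_matrices=True)
    Sigma = np.zeros((6, 4))                      # n x m, all zeros...
    Sigma[:4, :4] = np.diag(s)                    # ...except the top-left diagonal

    V = Vt.T
    print(U.shape, Sigma.shape, V.shape)          # (6, 6) (6, 4) (4, 4)
    print(np.allclose(M @ V, U @ Sigma))          # MV = U Sigma
    print(np.allclose(M, U @ Sigma @ V.T))        # hence M = U Sigma V^T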