Orthogonal Projections
1. Orthogonal Projections

Definition: An orthogonal projection of a vector x onto a subspace U is the closest vector in U to x.

• If u is a unit vector, the projection of x onto u is proj_u(x) = (uᵀx) u, i.e. the component of x along u, as sketched below.
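A minimal NumPy sketch of this formula (the vectors x and u below are made-up examples, not from the notes):

    import numpy as np

    x = np.array([3.0, 4.0])
    u = np.array([1.0, 0.0])   # already unit length
    proj = (u @ x) * u         # proj_u(x) = (u^T x) u
    print(proj)                # [3. 0.], the component of x along u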

In ML:

• Dimensionality reduction: Principal Component Analysis (PCA) finds orthogonal directions (principal components) and projects the data onto them.
• Least squares regression: The predicted values are the orthogonal projection of the target vector onto the column space of the feature matrix (see the sketch after this list).
• Data denoising: Projecting onto a lower-dimensional subspace removes components considered "noise."
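A hedged sketch of the least-squares point, with made-up data X and y: the fitted values X @ beta are the orthogonal projection of y onto col(X), so the residual y - y_hat is orthogonal to every column of X.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(10, 3))      # feature matrix (full column rank)
    y = rng.normal(size=10)           # target vector

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    y_hat = X @ beta                  # orthogonal projection of y onto col(X)

    # Residual is orthogonal to every column of X (up to rounding):
    print(np.allclose(X.T @ (y - y_hat), 0, atol=1e-10))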
2. Rotations

Definition: A rotation is a linear transformation represented by an orthogonal matrix R with det(R) = 1. It preserves lengths and angles: ||Rx|| = ||x|| and (Rx)ᵀ(Ry) = xᵀy for all x, y.
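A small sketch of these properties, using an assumed 2-D rotation by an angle theta:

    import numpy as np

    theta = np.pi / 6
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    x = np.array([1.0, 2.0])
    y = np.array([-3.0, 0.5])

    print(np.isclose(np.linalg.det(R), 1.0))                     # det(R) = 1
    print(np.isclose(np.linalg.norm(R @ x), np.linalg.norm(x)))  # ||Rx|| = ||x||
    print(np.isclose((R @ x) @ (R @ y), x @ y))                  # inner products (angles) preserved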
In ML:

• Data preprocessing: Rotations of feature space don’t affect algorithms that depend only
on distances/angles (like k-NN, SVM with linear kernels, PCA).
• Weight matrices: Orthogonal/rotational weight initialization (e.g., in deep networks) helps prevent vanishing/exploding gradients (see the sketch after this list).

• Augmentation: For images, rotations are used as a data augmentation strategy.
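One common recipe for orthogonal weight initialization is the QR decomposition of a Gaussian matrix; this is only a sketch (the helper name orthogonal_init and the 64×64 shape are illustrative, and deep-learning libraries ship built-ins such as torch.nn.init.orthogonal_):

    import numpy as np

    def orthogonal_init(rows, cols, rng=np.random.default_rng()):
        # QR of a Gaussian matrix gives a matrix with orthonormal columns.
        a = rng.normal(size=(rows, cols))
        q, r = np.linalg.qr(a)
        # Sign correction so the result is well distributed over orthogonal matrices.
        q *= np.sign(np.diag(r))
        return q

    W = orthogonal_init(64, 64)
    print(np.allclose(W.T @ W, np.eye(64)))   # columns are orthonormal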
