Singular Value Decomposition for Dimensionality Reduction

Singular Value Decomposition, usually referred to as SVD, is a matrix decomposition technique that is popularly used to reduce the dimensionality of data.

SVD decomposes an m x n real matrix A into a product of three matrices of the form

A=U \Sigma V^{\top}

where U is an m x m orthogonal matrix, V is an n x n orthogonal matrix, and Σ is an m x n diagonal matrix with nonnegative diagonal entries, i.e., the values are all positive or zero.

The columns of U and V are called the left and right singular vectors, and the values on the diagonal of Σ are called the singular values. The singular values are arranged in decreasing order, Σ = diag(σ1, …, σn), where σ1 ≥ ··· ≥ σn ≥ 0.
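To get a feel for these shapes and the ordering convention, here is a small NumPy sketch (the matrix below is arbitrary and chosen only for illustration):

```python
import numpy as np

# An arbitrary 4 x 2 real matrix (m = 4, n = 2).
A = np.arange(8, dtype=float).reshape(4, 2)

# full_matrices=True returns the full m x m U and the full n x n V^T.
U, S, Vt = np.linalg.svd(A, full_matrices=True)

print(U.shape)   # (4, 4) -- columns are the left singular vectors
print(S)         # singular values, already sorted in decreasing order
print(Vt.shape)  # (2, 2) -- rows are the right singular vectors
```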

Let’s see how to determine U, Σ, and V.

The columns of U are the eigenvectors of A A^{\top}

The columns of V are the eigenvectors of A^{\top}A

The values of Σ come from the eigenvalues of A A^{\top} or, equivalently, A^{\top}A (the two products share the same nonzero eigenvalues). Taking the square roots of these eigenvalues gives us the singular values.

The eigendecomposition of A A^{\top} is

\begin{aligned}&A A^{\top}=\left(U \Sigma V^{\top}\right)\left(U \Sigma V^{\top}\right)^{\top}\\&A A^{\top}=U \Sigma V^{\top} V \Sigma^{\top} U^{\top}\\&A A^{\top}=U \Sigma \Sigma^{\top} U^{\top} \quad\left[\text{since } V \text{ is orthogonal, } V^{\top} V=I\right]\\&A A^{\top}=U \Sigma^{2} U^{\top}\end{aligned}

where the columns of U are the eigenvectors of A A^{\top} and the diagonal entries of Σ² are the eigenvalues of A A^{\top}.

Similarly, the eigendecomposition of A^{\top}A is

\begin{aligned}&A^{\top} A=\left(U \Sigma V^{\top}\right)^{\top}\left(U \Sigma V^{\top}\right)\\&A^{\top} A=V \Sigma^{\top} U^{\top} U \Sigma V^{\top}\\&A^{\top} A=V \Sigma^{\top} \Sigma V^{\top} \quad\left[\text{since } U \text{ is orthogonal, } U^{\top} U=I\right]\\&A^{\top} A=V \Sigma^{2} V^{\top}\end{aligned}

where the columns of V are the eigenvectors of A^{\top}A and the diagonal entries of Σ² are the eigenvalues of A^{\top}A.
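To make this relationship concrete, here is a quick NumPy check (with an arbitrary example matrix, not one from this article) that the square roots of the eigenvalues of A^{\top}A match the singular values of A:

```python
import numpy as np

# An arbitrary 3 x 2 example matrix.
A = np.array([[2.0, 0.0],
              [1.0, 3.0],
              [0.0, 1.0]])

# A^T A is symmetric, so eigh is the appropriate eigensolver.
eigvals, eigvecs = np.linalg.eigh(A.T @ A)

# eigh returns eigenvalues in ascending order; reverse them to match
# the decreasing-order convention used for singular values.
eigvals = eigvals[::-1]

print(np.sqrt(eigvals))                    # square roots of the eigenvalues
print(np.linalg.svd(A, compute_uv=False))  # singular values from the SVD
```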

Let’s see an example.

Find the SVD of A=\left[\begin{array}{ll}{3} & {0} \\ {0} & {1}\end{array}\right]

We start by computing A A^{\top} and A^{\top}A

A A^{\top}=A^{\top} A=\left[\begin{array}{ll}{9} & {0} \\ {0} & {1}\end{array}\right]

The eigenvalues of A A^{\top} are \lambda_{1}=9 \text{ and } \lambda_{2}=1

Since the singular values are the square roots of the eigenvalues, these eigenvalues give the singular values \sigma_{1}=3 \text{ and } \sigma_{2}=1

\Sigma=\left[\begin{array}{ll}{3} & {0} \\ {0} & {1}\end{array}\right]

The eigenvectors of A A^{\top} and A^{\top}A are the columns of \left[\begin{array}{ll}{1} & {0} \\{0} & {1}\end{array}\right]

Since both A A^{\top} and A^{\top}A have the same eigenvectors, we can write

U=V=\left[\begin{array}{ll}{1} & {0}\\{0} & {1}\end{array}\right]

Thus, the SVD of matrix A is,

A=\left[\begin{array}{ll}{1} & {0} \\{0} & {1}\end{array}\right]\left[\begin{array}{ll}{3} & {0} \\{0} & {1}\end{array}\right]\left[\begin{array}{ll}{1} & {0} \\{0} & {1}\end{array}\right]
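We can sanity-check this result with NumPy, which (up to the sign of the singular vectors) returns the same factors:

```python
import numpy as np

# The matrix from the worked example.
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])

U, S, Vt = np.linalg.svd(A)

print(S)    # [3. 1.]
print(U)    # the identity matrix (up to sign)
print(Vt)   # the identity matrix (up to sign)
```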

Now let’s see how to compute the SVD using Python. We’ll use NumPy for this.
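A minimal sketch with numpy.linalg.svd (the example matrix here is just illustrative) looks like this:

```python
import numpy as np

# An example data matrix with more rows than columns.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# full_matrices=False gives the "economy" SVD, which is all we need
# to reconstruct A and is what dimensionality reduction uses.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

print("U:\n", U)
print("Singular values:", S)
print("V^T:\n", Vt)

# Verify the decomposition: A = U * diag(S) * V^T.
print(np.allclose(A, U @ np.diag(S) @ Vt))  # True
```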

Now let’s apply SVD to the iris dataset. We’ll reduce the dimensionality of the data from (150, 4) to (150, 2).
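One way to do this, sketched below assuming scikit-learn is available to load the dataset (the original code may have differed, for example in whether the data is centered first), is to project the data onto the two right singular vectors with the largest singular values:

```python
import numpy as np
from sklearn.datasets import load_iris

# Load the iris feature matrix: 150 samples, 4 features.
X = load_iris().data
print(X.shape)          # (150, 4)

# Economy SVD of the data matrix.
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Keep the two largest singular values/vectors and project the data.
# U[:, :2] * S[:2] is equivalent to X @ Vt[:2].T.
X_reduced = U[:, :2] * S[:2]
print(X_reduced.shape)  # (150, 2)
```

Each row of X_reduced is a two-dimensional representation of the corresponding iris sample, and plotting these coordinates gives a 2-D view of the dataset.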


CONCLUSION:

In this article, we discussed one of the most popular matrix factorization techniques, Singular Value Decomposition, and how it can be used for dimensionality reduction.

SVD decomposes a real matrix in the form

A=U \Sigma V^{\top}

where the columns of U and V are the left and right singular vectors and the diagonal entries of Σ are the singular values.

