SVD Decomposition

We begin this section with an important definition.

Singular Value Decomposition (SVD) can be thought of as a generalization of orthogonal diagonalization of a symmetric matrix to an arbitrary matrix. This decomposition is the focus of this section.

The following is a useful result that will help when computing the SVD of matrices.

Lemma
Let $A$ be an $m \times n$ matrix. Then the nonzero eigenvalues of $A^TA$ and $AA^T$ are the same.

Proof
Suppose $A$ is an $m \times n$ matrix, and suppose that $\lambda$ is a nonzero eigenvalue of $A^TA$. Then there exists a nonzero vector $\vec{x} \in \mathbb{R}^n$ such that
$$(A^TA)\vec{x} = \lambda \vec{x}. \quad \text{(nonzero)}$$
Multiplying both sides of this equation by $A$ yields:
$$A(A^TA)\vec{x} = A\lambda\vec{x}$$
$$(AA^T)(A\vec{x}) = \lambda (A\vec{x}).$$
Since $\lambda \neq 0$ and $\vec{x} \neq \vec{0}_n$, $\lambda\vec{x} \neq \vec{0}_n$, and thus by equation (nonzero), $(A^TA)\vec{x} \neq \vec{0}_n$; thus $A^T(A\vec{x}) \neq \vec{0}_n$, implying that $A\vec{x} \neq \vec{0}_m$.

Therefore $A\vec{x}$ is an eigenvector of $AA^T$ corresponding to the eigenvalue $\lambda$. An analogous argument can be used to show that every nonzero eigenvalue of $AA^T$ is an eigenvalue of $A^TA$, thus completing the proof.
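The lemma is easy to check numerically. The following sketch (the matrix $A$ is our own illustrative choice, not from the text) compares the eigenvalues of $A^TA$ and $AA^T$ for a $2 \times 3$ matrix:

```python
import numpy as np

# Numerical illustration of the lemma: the nonzero eigenvalues of
# A^T A and A A^T coincide; the larger product's extra eigenvalues
# are all zero. The matrix A is an illustrative choice.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])          # 2 x 3

eig_AtA = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]   # 3 eigenvalues
eig_AAt = np.sort(np.linalg.eigvalsh(A @ A.T))[::-1]   # 2 eigenvalues

print(eig_AtA)   # nonzero eigenvalues 6 and 1, plus a zero
print(eig_AAt)   # the same nonzero eigenvalues, 6 and 1
```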

Given an $m \times n$ matrix $A$, we will see how to express $A$ as a product $A = U \Sigma V^T$ where

  • $U$ is an $m \times m$ orthogonal matrix whose columns are eigenvectors of $AA^T$.
  • $V$ is an $n \times n$ orthogonal matrix whose columns are eigenvectors of $A^TA$.
  • $\Sigma$ is an $m \times n$ matrix whose only nonzero values lie on its main diagonal, and are the singular values of $A$.

How can we find such a decomposition? We are aiming to decompose $A$ in the following form:
$$A = U \Sigma V^T,$$
where $\Sigma$ is a block matrix of the form
$$\Sigma = \begin{bmatrix} \sigma & 0 \\ 0 & 0 \end{bmatrix}, \qquad \sigma = \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_k \end{bmatrix}.$$
Thus $A^T = V \Sigma^T U^T$ and it follows that
$$A^TA = V \Sigma^T U^T U \Sigma V^T = V \Sigma^T \Sigma V^T$$
and so
$$A^TA\, V = V \Sigma^T \Sigma.$$
Similarly,
$$AA^T\, U = U \Sigma \Sigma^T.$$
Therefore, you would find an orthonormal basis of eigenvectors for $A^TA$ and make them the columns of a matrix $V$ such that the corresponding eigenvalues are decreasing. This gives $V$. You could then do the same for $AA^T$ to get $U$.
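The procedure just described can be sketched in code. The function below (the name `svd_via_eigen` and the example matrix are our own) diagonalizes $A^TA$ to obtain $V$ and the singular values, then recovers the columns of $U$ as $\vec{u}_i = \sigma_i^{-1} A \vec{v}_i$:

```python
import numpy as np

# Sketch of the procedure above: diagonalize A^T A to get V and the
# singular values, then recover U. For simplicity this sketch assumes
# rank(A) = m, so that U comes out square; otherwise the u_i must be
# extended to an orthonormal basis as in the proof of the theorem.
def svd_via_eigen(A):
    m, n = A.shape
    # Orthonormal eigenvectors of A^T A, sorted by decreasing eigenvalue.
    evals, V = np.linalg.eigh(A.T @ A)
    order = np.argsort(evals)[::-1]
    evals, V = evals[order], V[:, order]
    sigma = np.sqrt(np.clip(evals, 0.0, None))   # singular values
    k = int(np.sum(sigma > 1e-12))               # number of nonzero ones
    # Columns of U for nonzero singular values: u_i = A v_i / sigma_i.
    U = A @ V[:, :k] / sigma[:k]
    Sigma = np.zeros((m, n))
    Sigma[:k, :k] = np.diag(sigma[:k])
    return U, Sigma, V

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
U, Sigma, V = svd_via_eigen(A)
print(np.allclose(A, U @ Sigma @ V.T))   # True: A = U Sigma V^T
```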

We formalize this discussion in the following theorem.

Theorem (Singular Value Decomposition)
Let $A$ be a real $m \times n$ matrix. Then there exist an $m \times m$ orthogonal matrix $U$ and an $n \times n$ orthogonal matrix $V$ such that
$$U^T A V = \Sigma = \begin{bmatrix} \sigma & 0 \\ 0 & 0 \end{bmatrix},$$
where $\sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_k)$ and $\sigma_1 \geq \sigma_2 \geq \cdots \geq \sigma_k > 0$ are the singular values of $A$.

Proof
There exists an orthonormal basis $\{\vec{v}_i\}_{i=1}^{n}$ of $\mathbb{R}^n$ such that
$$A^TA\vec{v}_i = \sigma_i^2 \vec{v}_i,$$
where $\sigma_i > 0$ for $i = 1, \dots, k$ and $\sigma_i$ equals zero if $i > k$. Thus for $i > k$, $A\vec{v}_i = \vec{0}$ because
$$\|A\vec{v}_i\|^2 = \langle A\vec{v}_i, A\vec{v}_i \rangle = \langle A^TA\vec{v}_i, \vec{v}_i \rangle = 0.$$
For $i = 1, \dots, k$, define $\vec{u}_i \in \mathbb{R}^m$ by
$$\vec{u}_i = \sigma_i^{-1} A \vec{v}_i.$$
Thus $A\vec{v}_i = \sigma_i \vec{u}_i$. Now
$$\langle \vec{u}_i, \vec{u}_j \rangle = \langle \sigma_i^{-1} A\vec{v}_i, \sigma_j^{-1} A\vec{v}_j \rangle = \sigma_i^{-1}\sigma_j^{-1} \langle A^TA\vec{v}_i, \vec{v}_j \rangle = \sigma_i^{-1}\sigma_j^{-1} \sigma_i^2 \langle \vec{v}_i, \vec{v}_j \rangle = \frac{\sigma_i}{\sigma_j} \langle \vec{v}_i, \vec{v}_j \rangle.$$
This means that $\langle \vec{u}_i, \vec{u}_j \rangle = 1$ when $i = j$ and $\langle \vec{u}_i, \vec{u}_j \rangle = 0$ when $i \neq j$. Thus $\{\vec{u}_1, \dots, \vec{u}_k\}$ is an orthonormal set of vectors in $\mathbb{R}^m$. Also,
$$AA^T \vec{u}_i = AA^T \sigma_i^{-1} A \vec{v}_i = \sigma_i^{-1} A (A^TA) \vec{v}_i = \sigma_i^{-1} A \sigma_i^2 \vec{v}_i = \sigma_i^2 \vec{u}_i.$$
Now, using Gram-Schmidt, extend $\{\vec{u}_1, \dots, \vec{u}_k\}$ to an orthonormal basis $\{\vec{u}_1, \dots, \vec{u}_m\}$ for all of $\mathbb{R}^m$, and let
$$U = \begin{bmatrix} \vec{u}_1 & \cdots & \vec{u}_m \end{bmatrix} \quad \text{while} \quad V = \begin{bmatrix} \vec{v}_1 & \cdots & \vec{v}_n \end{bmatrix}.$$
Thus $U$ is the matrix which has the $\vec{u}_i$ as columns and $V$ is defined as the matrix which has the $\vec{v}_i$ as columns. Then, since $A\vec{v}_j = \sigma_j \vec{u}_j$ for $j \leq k$ and $A\vec{v}_j = \vec{0}$ for $j > k$, the $(i,j)$ entry of $U^TAV$ is $\vec{u}_i^{\,T} A \vec{v}_j = \sigma_j \langle \vec{u}_i, \vec{u}_j \rangle$, so
$$U^T A V = \Sigma,$$
where $\Sigma$ is given in the statement of the theorem.
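The construction in the proof can be traced numerically. The sketch below (the rank-deficient matrix $A$ is our own example) builds $\vec{u}_1 = \sigma_1^{-1}A\vec{v}_1$ and then completes it to an orthonormal basis of $\mathbb{R}^m$, using a QR factorization as a numerically stable stand-in for Gram-Schmidt:

```python
import numpy as np

# Tracing the proof's construction on a rank-deficient matrix:
# here k = 1 < m = 2, so {u_1} must be extended to a basis of R^m.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])                  # rank 1

evals, V = np.linalg.eigh(A.T @ A)          # A^T A v_i = sigma_i^2 v_i
order = np.argsort(evals)[::-1]
evals, V = evals[order], V[:, order]
sigma = np.sqrt(np.clip(evals, 0.0, None))  # singular values: 2 and 0

k = int(np.sum(sigma > 1e-12))
U_k = A @ V[:, :k] / sigma[:k]              # u_i = sigma_i^{-1} A v_i

# QR on [U_k | I] orthonormalizes the columns; its trailing columns
# complete {u_1, ..., u_k}. We keep U_k itself to preserve signs.
Q, _ = np.linalg.qr(np.hstack([U_k, np.eye(2)]))
U = np.hstack([U_k, Q[:, k:]])

Sigma = np.zeros_like(A)
Sigma[:k, :k] = np.diag(sigma[:k])
print(np.round(U.T @ A @ V, 10))            # recovers Sigma
```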

The SVD has an immediate corollary, which is given in the following result.

Let’s compute the SVD of a simple matrix.
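As a quick numerical illustration (the matrix here is our own choice, not the text's example), numpy's built-in routine `np.linalg.svd` returns $U$, the singular values in decreasing order, and $V^T$ directly:

```python
import numpy as np

# Compute an SVD with numpy's built-in routine, which returns
# U, the singular values (in decreasing order), and V^T.
A = np.array([[4.0, 0.0],
              [3.0, -5.0]])
U, s, Vt = np.linalg.svd(A)

print(s)                                    # sqrt(40) and sqrt(10)
print(np.allclose(A, U @ np.diag(s) @ Vt))  # True: A = U Sigma V^T
```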

Here is another example.

Consider another example.

This illustrates that if you have a good way to find the eigenvectors and eigenvalues for a Hermitian matrix which has nonnegative eigenvalues, then you also have a good way to find the SVD of an arbitrary matrix.

Text Source

This section was adapted from Section 8.11 of Keith Nicholson’s Linear Algebra with Applications. (CC-BY-NC-SA)

W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition.