In this module we discuss algebraic multiplicity, geometric multiplicity, and their relationship to diagonalizability.

EIG-0050: Diagonalizable Matrices and Multiplicity

Recall that a diagonal matrix is a matrix containing a zero in every entry except those on the main diagonal. More precisely, if $d_{ij}$ is the $ij$ entry of a diagonal matrix $D$, then $d_{ij} = 0$ unless $i = j$. Such matrices look like the following.
$$D = \begin{bmatrix} * & 0 & \cdots & 0 \\ 0 & * & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & * \end{bmatrix}$$

where $*$ is a number which might not be zero.

Diagonal matrices have some nice properties, as we demonstrate below.

Let $D = \begin{bmatrix} d_{1} & 0 \\ 0 & d_{2} \end{bmatrix}$ and let $A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}$. Compute $DA$ and $AD$.

Notice the patterns present in the product matrices. Each row of $DA$ is the same as the corresponding row of $A$ multiplied by the scalar which is the corresponding diagonal element of $D$. In the product $AD$, it is the columns of $A$ that have been multiplied by the diagonal elements. These patterns hold in general for any diagonal matrix, and they are fundamental to understanding diagonalization.
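These scaling patterns are easy to confirm numerically. The following Python sketch (using NumPy, with arbitrarily chosen matrices for illustration) checks both patterns:

```python
import numpy as np

# Arbitrary illustrative matrices: D diagonal, A a general 3x3 matrix.
D = np.diag([2.0, -3.0, 5.0])
A = np.array([[1.0, 4.0, 7.0],
              [2.0, 5.0, 8.0],
              [3.0, 6.0, 9.0]])

d = np.diag(D)  # the diagonal entries of D as a vector

# DA scales the rows of A: row i of DA equals d[i] times row i of A.
assert np.allclose(D @ A, d[:, None] * A)

# AD scales the columns of A: column j of AD equals d[j] times column j of A.
assert np.allclose(A @ D, A * d[None, :])

print("Row and column scaling patterns confirmed.")
```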

If we are given a matrix $A$ that is diagonalizable, then we can write $P^{-1}AP = D$ for some invertible matrix $P$ and diagonal matrix $D$, or, equivalently,
$$AP = PD. \qquad \text{(eq:understand_diag)}$$

If we pause to examine Equation (eq:understand_diag), the work that we did in Exploration init:multiplydiag can help us to understand how to find $P$ that will diagonalize $A$. The product $PD$ is formed by multiplying each column of $P$ by a scalar which is the corresponding element on the diagonal of $D$. To restate this, if $\mathbf{x}_i$ is column $i$ in our matrix $P$, then Equation (eq:understand_diag) tells us that
$$A\mathbf{x}_i = \lambda_i \mathbf{x}_i, \qquad \text{(eq:ev_ew_diag)}$$
where $\lambda_i$ is the $i$th diagonal element of $D$.

Of course, Equation (eq:ev_ew_diag) is very familiar! We see that if we are able to diagonalize a matrix $A$, the columns of matrix $P$ will be the eigenvectors of $A$, and the corresponding diagonal entries of $D$ will be the corresponding eigenvalues of $A$. This is summed up in the following theorem.

Theorem (th:eigenvectorsanddiagonalizable). An $n \times n$ matrix $A$ is diagonalizable if and only if there is an invertible matrix $P$ whose columns are eigenvectors of $A$. In this case, the diagonal entries of $D = P^{-1}AP$ are the eigenvalues of $A$ corresponding to the respective columns of $P$.

Proof

Suppose $P = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix}$ is given as above as an invertible matrix whose columns are eigenvectors of $A$. To show that $A$ is diagonalizable, we will show $AP = PD$, which is equivalent to $P^{-1}AP = D$. We have
$$AP = \begin{bmatrix} A\mathbf{x}_1 & A\mathbf{x}_2 & \cdots & A\mathbf{x}_n \end{bmatrix},$$
while
$$PD = \begin{bmatrix} \lambda_1 \mathbf{x}_1 & \lambda_2 \mathbf{x}_2 & \cdots & \lambda_n \mathbf{x}_n \end{bmatrix}.$$

We can complete this half of the proof by comparing columns, and noting that
$$A\mathbf{x}_i = \lambda_i \mathbf{x}_i$$
for $i = 1, \dots, n$, since the $\mathbf{x}_i$ are eigenvectors of $A$ and the $\lambda_i$ are corresponding eigenvalues of $A$.

Conversely, suppose $A$ is diagonalizable, so that $P^{-1}AP = D$. Let
$$P = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix},$$
where the columns are the vectors $\mathbf{x}_1, \mathbf{x}_2, \dots, \mathbf{x}_n$, and
$$D = \begin{bmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{bmatrix}.$$
Then
$$AP = PD = \begin{bmatrix} \lambda_1 \mathbf{x}_1 & \lambda_2 \mathbf{x}_2 & \cdots & \lambda_n \mathbf{x}_n \end{bmatrix},$$
and so
$$\begin{bmatrix} A\mathbf{x}_1 & A\mathbf{x}_2 & \cdots & A\mathbf{x}_n \end{bmatrix} = \begin{bmatrix} \lambda_1 \mathbf{x}_1 & \lambda_2 \mathbf{x}_2 & \cdots & \lambda_n \mathbf{x}_n \end{bmatrix},$$
showing the $\mathbf{x}_i$ are eigenvectors of $A$ and the $\lambda_i$ are eigenvalues.

Notice that because the matrix $P$ defined above is invertible, it follows that the set of eigenvectors of $A$, $\{\mathbf{x}_1, \mathbf{x}_2, \dots, \mathbf{x}_n\}$, is a basis of $\mathbb{R}^n$.

We demonstrate the concept given in the above theorem in the next example. Note that not only are the columns of the matrix $P$ formed by eigenvectors, but $P$ must be invertible, and therefore must consist of a linearly independent set of eigenvectors.
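The example itself works with a specific matrix; the following Python sketch carries out the same steps with an arbitrary illustrative matrix (an assumption on our part, not the matrix of Example ex:diagonalizematrix):

```python
import numpy as np

# An arbitrary illustrative matrix with three distinct eigenvalues.
A = np.array([[2.0, 0.0,  0.0],
              [1.0, 2.0, -1.0],
              [1.0, 3.0, -2.0]])

# The columns of P are eigenvectors of A; eigvals holds the eigenvalues.
eigvals, P = np.linalg.eig(A)

# Since the eigenvectors are linearly independent, P is invertible and
# P^{-1} A P is the diagonal matrix of the corresponding eigenvalues.
D = np.linalg.inv(P) @ A @ P
assert np.allclose(D, np.diag(eigvals))
print(np.round(eigvals, 10))  # [2. 1. -1.], up to ordering
```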

Consider the next important theorem.

Theorem (th:linindepeigenvectors). Eigenvectors of a matrix $A$ that correspond to distinct eigenvalues are linearly independent.

The corollary that follows from this theorem gives a useful tool in determining if $A$ is diagonalizable.

Corollary (th:distincteigenvalues). If $A$ is an $n \times n$ matrix with $n$ distinct eigenvalues, then $A$ is diagonalizable.

The corollary tells us that many matrices can be diagonalized in this way.

Not every matrix has an eigenvalue decomposition. Sometimes we cannot find an invertible matrix $P$ such that $P^{-1}AP = D$. We saw earlier in Corollary th:distincteigenvalues that an $n \times n$ matrix with $n$ distinct eigenvalues is diagonalizable. It turns out that there are other useful diagonalizability tests. Consider the following example.
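A minimal numerical illustration of such a failure, assuming the standard $2 \times 2$ shear matrix (an illustrative choice; the module's own example may differ):

```python
import numpy as np

# A classic defective matrix: its only eigenvalue is 1, repeated twice.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, P = np.linalg.eig(A)
print(eigvals)  # [1. 1.]

# The two computed eigenvectors are (numerically) parallel, so P is
# singular: there is no invertible P with P^{-1} A P diagonal.
print(np.linalg.matrix_rank(P))  # 1
```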

Recall that the algebraic multiplicity of an eigenvalue is the number of times that it occurs as a root of the characteristic polynomial.

Consider now the following lemma.

Lemma (lemma:dimeigenspace). Let $A$ be an $n \times n$ matrix, and suppose that the eigenvalue $\lambda$ of $A$ has algebraic multiplicity $k$. Then the eigenspace $\mathcal{E}_\lambda$ of $\lambda$ satisfies $\dim(\mathcal{E}_\lambda) \le k$.

In other words, the geometric multiplicity of an eigenvalue is less than or equal to the algebraic multiplicity of that same eigenvalue.

Proof
Let $m$ be the geometric multiplicity of $\lambda$, i.e., $m = \dim(\mathcal{E}_\lambda)$. Suppose $\{\mathbf{x}_1, \mathbf{x}_2, \dots, \mathbf{x}_m\}$ is a basis for the eigenspace $\mathcal{E}_\lambda$. Let $P$ be any invertible matrix having $\mathbf{x}_1, \mathbf{x}_2, \dots, \mathbf{x}_m$ as its first $m$ columns, say
$$P = \begin{bmatrix} \mathbf{x}_1 & \cdots & \mathbf{x}_m & \mathbf{x}_{m+1} & \cdots & \mathbf{x}_n \end{bmatrix}.$$
In block form we may write
$$P = \begin{bmatrix} B & C \end{bmatrix} \quad \text{and} \quad P^{-1} = \begin{bmatrix} E \\ F \end{bmatrix},$$
where $B$ is $n \times m$, $C$ is $n \times (n-m)$, $E$ is $m \times n$, and $F$ is $(n-m) \times n$. We observe $AB = \lambda B$. This implies
$$P^{-1}AP = \begin{bmatrix} E \\ F \end{bmatrix} A \begin{bmatrix} B & C \end{bmatrix} = \begin{bmatrix} \lambda EB & EAC \\ \lambda FB & FAC \end{bmatrix}.$$
Since $P^{-1}P = I_n$, we have $EB = I_m$ and $FB = 0$. Therefore,
$$P^{-1}AP = \begin{bmatrix} \lambda I_m & EAC \\ 0 & FAC \end{bmatrix}.$$
We finish the proof by comparing the characteristic polynomials on both sides of this equation, and making use of the fact that similar matrices have the same characteristic polynomial. Since the matrix above is block upper triangular,
$$\det(xI_n - P^{-1}AP) = \det(xI_m - \lambda I_m)\det(xI_{n-m} - FAC) = (x - \lambda)^m \det(xI_{n-m} - FAC).$$
We see that the characteristic polynomial of $A$ has $(x - \lambda)^m$ as a factor. This tells us that the algebraic multiplicity of $\lambda$ is at least $m$, proving the desired inequality.

This result tells us that if $\lambda$ is an eigenvalue of $A$, then the number of linearly independent $\lambda$-eigenvectors is never more than the multiplicity of $\lambda$. We now use this fact to provide a useful diagonalizability condition.

Theorem. Let $A$ be an $n \times n$ matrix. Then $A$ is diagonalizable if and only if, for each eigenvalue $\lambda$ of $A$, the algebraic multiplicity of $\lambda$ is equal to the geometric multiplicity of $\lambda$.

Proof
Suppose $A$ is diagonalizable and let $\lambda_1, \dots, \lambda_k$ be the distinct eigenvalues of $A$, with algebraic multiplicities $m_1, \dots, m_k$, respectively, and geometric multiplicities $n_1, \dots, n_k$, respectively. Since $A$ is diagonalizable, Theorem th:eigenvectorsanddiagonalizable implies that $n_1 + n_2 + \cdots + n_k = n$. By applying Lemma lemma:dimeigenspace $k$ times, we have
$$n = n_1 + n_2 + \cdots + n_k \le m_1 + m_2 + \cdots + m_k = n$$
(the algebraic multiplicities sum to $n$ because the characteristic polynomial has degree $n$), which is only possible if $n_i = m_i$ for $i = 1, \dots, k$.

Conversely, if the geometric multiplicity equals the algebraic multiplicity of each eigenvalue, then obtaining a basis for each eigenspace yields $n$ eigenvectors. Applying Theorem th:linindepeigenvectors, we know that these $n$ eigenvectors are linearly independent, so Theorem th:eigenvectorsanddiagonalizable implies that $A$ is diagonalizable.
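This multiplicity test is easy to apply by computer. The following Python sketch (using SymPy, with an arbitrary illustrative matrix) compares the two multiplicities for each eigenvalue:

```python
from sympy import Matrix

# An arbitrary illustrative matrix: the eigenvalue 2 has algebraic
# multiplicity 2 but a one-dimensional eigenspace, so the test fails.
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])

diagonalizable = True
for eigenvalue, alg_mult, basis in A.eigenvects():
    geo_mult = len(basis)  # dimension of the eigenspace
    print(f"lambda = {eigenvalue}: algebraic {alg_mult}, geometric {geo_mult}")
    diagonalizable = diagonalizable and (alg_mult == geo_mult)

print(diagonalizable)         # False
print(A.is_diagonalizable())  # agrees: False
```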

Practice Problems

In this exercise you will “fill in the details” of Example ex:diagonalizematrix.
Find the eigenvalues of the matrix $A$ of Example ex:diagonalizematrix.
Find a basis for each eigenspace of the matrix $A$.
Compute the inverse of the matrix $P$ whose columns are the basis eigenvectors.
Compute $D = P^{-1}AP$.
Show that computing the inverse of $P$ is not really necessary by comparing the products $AP$ and $PD$.

Text Source

A large portion of the text in this module is an adaptation of Section 7.2.1 of Ken Kuttler’s A First Course in Linear Algebra. (CC-BY)

Ken Kuttler, A First Course in Linear Algebra, Lyryx 2017, Open Edition, pp. 363–368.