Diagonalizable Matrices and Multiplicity

Recall that a diagonal matrix is a matrix containing a zero in every entry except those on the main diagonal. More precisely, if $d_{ij}$ is the $ij$th entry of a diagonal matrix $D$, then $d_{ij} = 0$ unless $i = j$. Such matrices look like the following.
$$D = \begin{bmatrix} * & & 0 \\ & \ddots & \\ 0 & & * \end{bmatrix}$$
where $*$ is a number which might not be zero.

Diagonal matrices have some nice properties, as we demonstrate below.

Let $M = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}$ and let $D = \begin{bmatrix} 2 & 0 & 0 \\ 0 & -5 & 0 \\ 0 & 0 & 10 \end{bmatrix}$. Compute $MD$ and $DM$ using Octave.

To use Octave, go to the Sage Math Cell Webpage, copy the code below into the cell, select OCTAVE as the language, and press EVALUATE.

% Understanding multiplication by a diagonal matrix
 
M=[1 2 3; 4 5 6; 7 8 9]  
D=diag([2 -5 10])  
%The function diag(vector) creates a square matrix with the entries of vector on the main diagonal and zeros elsewhere. The function diag() can also be used to return the diagonal of a matrix. For example, diag(M) would return [1;5;9].
M*D  
D*M

Notice the patterns present in the product matrices. Each row of $DM$ is the same as its corresponding row of $M$ multiplied by the scalar which is the corresponding diagonal element of $D$. In the product $MD$, it is the columns of $M$ that have been multiplied by the diagonal elements. These patterns hold in general for any diagonal matrix, and they are fundamental to understanding diagonalization, the process we discuss below.
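
For the matrices in this Exploration, the two products work out as follows, a worked instance of the column and row patterns just described:
$$MD = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{bmatrix}\begin{bmatrix} 2 & 0 & 0 \\ 0 & -5 & 0 \\ 0 & 0 & 10 \end{bmatrix} = \begin{bmatrix} 2 & -10 & 30 \\ 8 & -25 & 60 \\ 14 & -40 & 90 \end{bmatrix}, \qquad DM = \begin{bmatrix} 2 & 4 & 6 \\ -20 & -25 & -30 \\ 70 & 80 & 90 \end{bmatrix}.$$
The columns of $MD$ are $2$, $-5$, and $10$ times the columns of $M$, while the rows of $DM$ are $2$, $-5$, and $10$ times the rows of $M$.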

If we are given a matrix $A$ that is diagonalizable, then we can write $P^{-1}AP = D$ for some invertible matrix $P$ and diagonal matrix $D$, or, equivalently,
$$AP = PD. \qquad \text{(eq:understand_diag)}$$

If we pause to examine Equation (eq:understand_diag), the work that we did in Exploration init:multiplydiag can help us to understand how to find a matrix $P$ that will diagonalize $A$. The product $PD$ is formed by multiplying each column of $P$ by a scalar which is the corresponding element on the diagonal of $D$. To restate this, if $\mathbf{x}_i$ is column $i$ in our matrix $P$, then Equation (eq:understand_diag) tells us that
$$A\mathbf{x}_i = \lambda_i \mathbf{x}_i, \qquad \text{(eq:ev_ew_diag)}$$
where $\lambda_i$ is the $i$th diagonal element of $D$.

Of course, Equation (eq:ev_ew_diag) is very familiar! We see that if we are able to diagonalize a matrix $A$, the columns of the matrix $P$ will be the eigenvectors of $A$, and the corresponding diagonal entries of $D$ will be the corresponding eigenvalues of $A$. This is summed up in the following theorem.

Theorem th:eigenvectorsanddiagonalizable. An $n \times n$ matrix $A$ is diagonalizable if and only if there is an invertible matrix $P$ given by
$$P = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix},$$
where the columns $\mathbf{x}_i$ are eigenvectors of $A$. Moreover, if $A$ is diagonalizable, the corresponding eigenvalues of $A$ are the diagonal entries of the diagonal matrix $D = P^{-1}AP$.

Proof
Suppose $P$ is given as above as an invertible matrix whose columns are eigenvectors of $A$. To show that $A$ is diagonalizable, we will show that $P^{-1}AP = D$, which is equivalent to $AP = PD$. We have
$$AP = \begin{bmatrix} A\mathbf{x}_1 & A\mathbf{x}_2 & \cdots & A\mathbf{x}_n \end{bmatrix},$$
while
$$PD = \begin{bmatrix} \lambda_1 \mathbf{x}_1 & \lambda_2 \mathbf{x}_2 & \cdots & \lambda_n \mathbf{x}_n \end{bmatrix}.$$
We can complete this half of the proof by comparing columns, and noting that
$$A\mathbf{x}_i = \lambda_i \mathbf{x}_i$$
for $i = 1, \ldots, n$ since the $\mathbf{x}_i$ are eigenvectors of $A$ and the $\lambda_i$ are corresponding eigenvalues of $A$.

Conversely, suppose $A$ is diagonalizable so that $P^{-1}AP = D$. Let
$$P = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix},$$

where the columns are the vectors $\mathbf{x}_i$, and
$$D = \begin{bmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{bmatrix}.$$
Then
$$AP = PD = \begin{bmatrix} \lambda_1 \mathbf{x}_1 & \lambda_2 \mathbf{x}_2 & \cdots & \lambda_n \mathbf{x}_n \end{bmatrix},$$
and so
$$A\mathbf{x}_i = \lambda_i \mathbf{x}_i,$$
showing the $\mathbf{x}_i$ are eigenvectors of $A$ and the $\lambda_i$ are eigenvalues.

Notice that because the matrix $P$ defined above is invertible, it follows that the set of eigenvectors of $A$, $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n\}$, is a basis of $\mathbb{R}^n$.

We demonstrate the concept given in the above theorem in the next example. Note that not only are the columns of the matrix $P$ formed by eigenvectors, but $P$ must be invertible, and therefore its columns must consist of a linearly independent set of eigenvectors.
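
As a concrete check of the theorem, the short Octave sketch below (using the matrices $A$ and $P$ that appear in the code cells later in this section) verifies that a matrix $P$ whose columns are linearly independent eigenvectors of $A$ does diagonalize $A$.

% Columns of P are eigenvectors of A, so inv(P)*A*P should be diagonal.
A = [2 0 0; 1 4 -1; -2 -4 4]
P = [-2 1 0; 1 0 1; 0 1 -2]
inv(P)*A*P   % returns diag(2, 2, 6): the eigenvalues in matching order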

It is often easier to work with matrices that are diagonalizable, as the next Exploration demonstrates.

Let $A = \begin{bmatrix} 2 & 0 & 0 \\ 1 & 4 & -1 \\ -2 & -4 & 4 \end{bmatrix}$ and let $D = \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 6 \end{bmatrix}$. Would it be easier to compute $A^5$ or $D^5$ if you had to do so by hand, without a computer? Certainly $D^5$ is easier, due to the number of zero entries! Let's use Octave to compute both.

To use Octave, go to the Sage Math Cell Webpage, copy the code below into the cell, select OCTAVE as the language, and press EVALUATE.

% Powers of a diagonal matrix are easy to compute
 
A=[2 0 0; 1 4 -1; -2 -4 4]  
D=diag([2 2 6])  
HARD = A^5  
EZ = D^5

We see that raising a diagonal matrix to a power amounts to simply raising each diagonal entry to that same power, whereas computing $A^5$ directly requires many more calculations. However, we learned in Example ex:diagonalizematrix that $A$ is similar to $D$, and we can use this to make our computation easier. This is because
$$A^5 = \left(PDP^{-1}\right)^5 = PDP^{-1} \, PDP^{-1} \, PDP^{-1} \, PDP^{-1} \, PDP^{-1} = PD^5P^{-1},$$
since each interior factor $P^{-1}P$ cancels.

With this in mind, it is not as daunting to calculate $A^5$ by hand. We can compute the product $PD^5$ quite easily since $D^5$ is diagonal, as we learned in Exploration init:multiplydiag. That leaves just one product of matrices to compute by hand to obtain $A^5$. And the savings in work would certainly be more pronounced for larger matrices or for powers larger than 5.
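
Indeed, $D^5$ can be written down immediately by hand:
$$D^5 = \begin{bmatrix} 2^5 & 0 & 0 \\ 0 & 2^5 & 0 \\ 0 & 0 & 6^5 \end{bmatrix} = \begin{bmatrix} 32 & 0 & 0 \\ 0 & 32 & 0 \\ 0 & 0 & 7776 \end{bmatrix}.$$
After scaling the columns of $P$ to form $PD^5$, the only remaining hand computation is the single product $\left(PD^5\right)P^{-1}$.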

To use Octave, go to the Sage Math Cell Webpage, copy the code below into the cell, select OCTAVE as the language, and press EVALUATE.

A=[2 0 0; 1 4 -1; -2 -4 4]
 
D=diag([2 2 6])  
P=[-2 1 0; 1 0 1; 0 1 -2]  
EZ = D^5  
P*EZ*inv(P)  
HARD = A^5

In Exploration exp:motivate_diagonalization, because matrix $A$ was diagonalizable, we were able to cut down on computations. When we chose to work with $D$ and $P$ instead of $A$, we worked with the eigenvalues and eigenvectors of $A$. Each column of $P$ is an eigenvector of $A$, and so we repeatedly made use of the following theorem (with $k = 5$).

Theorem. Let $A$ be a square matrix and suppose that $A\mathbf{x} = \lambda\mathbf{x}$ for some vector $\mathbf{x}$ and scalar $\lambda$. Then $A^k\mathbf{x} = \lambda^k\mathbf{x}$ for every positive integer $k$.

Proof
We prove this theorem by induction on $k$. Clearly $A^k\mathbf{x} = \lambda^k\mathbf{x}$ holds when $k = 1$, as that was given. For the inductive step, suppose that we know $A^k\mathbf{x} = \lambda^k\mathbf{x}$. Then
$$A^{k+1}\mathbf{x} = A\left(A^k\mathbf{x}\right) = A\left(\lambda^k\mathbf{x}\right) = \lambda^k\left(A\mathbf{x}\right) = \lambda^k\left(\lambda\mathbf{x}\right) = \lambda^{k+1}\mathbf{x},$$
as desired.
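
As a quick numerical illustration of this theorem (with values taken from Exploration exp:motivate_diagonalization), the Octave snippet below checks that $A^5\mathbf{x} = 6^5\mathbf{x}$ for an eigenvector $\mathbf{x}$ of $A$ with eigenvalue $6$.

% x is the third column of P, an eigenvector of A with lambda = 6.
A = [2 0 0; 1 4 -1; -2 -4 4];
x = [0; 1; -2];
A^5*x    % both commands print [0; 7776; -15552], since 6^5 = 7776
6^5*x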

Matrix $A$ from Example ex:diagonalizematrix and Exploration exp:motivate_diagonalization had a repeated eigenvalue of 2. The next theorem and corollary show that matrices which have distinct eigenvalues (where none are repeated) have desirable properties.

Theorem th:linindepeigenvectors. Let $A$ be an $n \times n$ matrix, and suppose that $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_m$ are eigenvectors of $A$ corresponding to distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_m$. Then the set $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_m\}$ is linearly independent.

Proof
We prove this by induction on $m$, the number of vectors in the set. If $m = 1$, then $\{\mathbf{x}_1\}$ is a linearly independent set because $\mathbf{x}_1 \neq \mathbf{0}$. In general, suppose we have established that the theorem is true for some $m \geq 1$. Given eigenvectors $\mathbf{x}_1, \ldots, \mathbf{x}_{m+1}$, suppose
$$c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_{m+1}\mathbf{x}_{m+1} = \mathbf{0}. \qquad \text{(eq:thm_proof_5_5_4_1)}$$
We must show that each $c_i = 0$. Multiply both sides of (eq:thm_proof_5_5_4_1) on the left by $A$ and use the fact that $A\mathbf{x}_i = \lambda_i\mathbf{x}_i$ to get
$$c_1\lambda_1\mathbf{x}_1 + c_2\lambda_2\mathbf{x}_2 + \cdots + c_{m+1}\lambda_{m+1}\mathbf{x}_{m+1} = \mathbf{0}. \qquad \text{(eq:thm_proof_5_5_4_2)}$$
If we multiply (eq:thm_proof_5_5_4_1) by $\lambda_1$ and subtract the result from (eq:thm_proof_5_5_4_2), the first terms cancel and we obtain
$$c_2(\lambda_2 - \lambda_1)\mathbf{x}_2 + c_3(\lambda_3 - \lambda_1)\mathbf{x}_3 + \cdots + c_{m+1}(\lambda_{m+1} - \lambda_1)\mathbf{x}_{m+1} = \mathbf{0}.$$
Since $\mathbf{x}_2, \ldots, \mathbf{x}_{m+1}$ correspond to distinct eigenvalues $\lambda_2, \ldots, \lambda_{m+1}$, the set $\{\mathbf{x}_2, \ldots, \mathbf{x}_{m+1}\}$ is linearly independent by the induction hypothesis. Hence,
$$c_2(\lambda_2 - \lambda_1) = 0, \quad c_3(\lambda_3 - \lambda_1) = 0, \quad \ldots, \quad c_{m+1}(\lambda_{m+1} - \lambda_1) = 0,$$
and so $c_2 = c_3 = \cdots = c_{m+1} = 0$ because the $\lambda_i$ are distinct. It follows that (eq:thm_proof_5_5_4_1) becomes $c_1\mathbf{x}_1 = \mathbf{0}$, which implies that $c_1 = 0$ because $\mathbf{x}_1 \neq \mathbf{0}$, and the proof is complete.
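
The theorem is also easy to test numerically. In the Octave sketch below, $B$ is a hypothetical matrix (not taken from the text) with three distinct eigenvalues $1$, $2$, and $3$; the theorem predicts that the matrix of eigenvectors returned by eig has full rank.

% B is a made-up example whose eigenvalues 1, 2, 3 are distinct.
B = [1 2 0; 0 3 0; 2 -4 2]
[V, L] = eig(B)   % columns of V are eigenvectors, L holds the eigenvalues
rank(V)           % returns 3: the eigenvectors are linearly independent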

The corollary that follows from this theorem gives a useful tool in determining if $A$ is diagonalizable.

Corollary th:distincteigenvalues. If $A$ is an $n \times n$ matrix with $n$ distinct eigenvalues, then $A$ is diagonalizable.

Not every matrix has an eigenvalue decomposition. Sometimes we cannot find an invertible matrix $P$ such that $P^{-1}AP$ is diagonal. Consider the following example. We saw earlier in Corollary th:distincteigenvalues that an $n \times n$ matrix with $n$ distinct eigenvalues is diagonalizable. It turns out that there are other useful diagonalizability tests.

Recall that the algebraic multiplicity of an eigenvalue $\lambda$ is the number of times that $\lambda$ occurs as a root of the characteristic polynomial.
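
For example, the matrix $A$ of Exploration exp:motivate_diagonalization has characteristic polynomial
$$\det(xI - A) = (x - 2)\left((x - 4)^2 - 4\right) = (x - 2)^2(x - 6),$$
so the eigenvalue $\lambda = 2$ has algebraic multiplicity $2$, while $\lambda = 6$ has algebraic multiplicity $1$.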

Consider now the following lemma.

Lemma lemma:dimeigenspace. Let $A$ be an $n \times n$ matrix, and let $\lambda$ be an eigenvalue of $A$ whose eigenspace has dimension $m$. Then the algebraic multiplicity of $\lambda$ is at least $m$.

In other words, the geometric multiplicity of an eigenvalue is less than or equal to the algebraic multiplicity of that same eigenvalue.

Proof
Let $m$ be the geometric multiplicity of $\lambda$, i.e., $m$ is the dimension of the eigenspace of $\lambda$. Suppose $\{\mathbf{x}_1, \ldots, \mathbf{x}_m\}$ is a basis for the eigenspace of $\lambda$. Let $P$ be any invertible matrix having $\mathbf{x}_1, \ldots, \mathbf{x}_m$ as its first $m$ columns, say
$$P = \begin{bmatrix} \mathbf{x}_1 & \cdots & \mathbf{x}_m & \mathbf{x}_{m+1} & \cdots & \mathbf{x}_n \end{bmatrix}.$$
In block form we may write
$$P = \begin{bmatrix} B & C \end{bmatrix}, \qquad P^{-1} = \begin{bmatrix} B_1 \\ C_1 \end{bmatrix},$$
where $B$ is $n \times m$, $C$ is $n \times (n - m)$, $B_1$ is $m \times n$, and $C_1$ is $(n - m) \times n$. We observe
$$I_n = P^{-1}P = \begin{bmatrix} B_1B & B_1C \\ C_1B & C_1C \end{bmatrix}.$$
This implies
$$B_1B = I_m, \quad B_1C = 0, \quad C_1B = 0, \quad C_1C = I_{n-m}.$$
Therefore,
$$P^{-1}AP = \begin{bmatrix} B_1 \\ C_1 \end{bmatrix} A \begin{bmatrix} B & C \end{bmatrix} = \begin{bmatrix} B_1AB & B_1AC \\ C_1AB & C_1AC \end{bmatrix} = \begin{bmatrix} \lambda I_m & B_1AC \\ 0 & C_1AC \end{bmatrix},$$
where the last equality uses $AB = \lambda B$ (the columns of $B$ are $\lambda$-eigenvectors) together with $B_1B = I_m$ and $C_1B = 0$. We finish the proof by comparing the characteristic polynomials on both sides of this equation, and making use of the fact that similar matrices have the same characteristic polynomials. We see that the characteristic polynomial of $A$ has $(x - \lambda)^m$ as a factor. This tells us that the algebraic multiplicity of $\lambda$ is at least $m$, proving the desired inequality.
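
In Octave, the geometric multiplicity of an eigenvalue $\lambda$ can be computed as the dimension of the null space of $A - \lambda I$. A minimal sketch, using the matrix $A$ from Exploration exp:motivate_diagonalization and its repeated eigenvalue $\lambda = 2$:

% Geometric multiplicity of lambda = 2 is the dimension of null(A - 2I).
A = [2 0 0; 1 4 -1; -2 -4 4];
N = null(A - 2*eye(3))   % columns of N form a basis of the eigenspace
size(N, 2)               % returns 2, which equals the algebraic multiplicity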

This result tells us that if $\lambda$ is an eigenvalue of $A$, then the number of linearly independent $\lambda$-eigenvectors is never more than the algebraic multiplicity of $\lambda$. We now use this fact to provide a useful diagonalizability condition.

Theorem. Let $A$ be an $n \times n$ matrix. Then $A$ is diagonalizable if and only if the geometric multiplicity of each eigenvalue of $A$ is equal to its algebraic multiplicity.

Proof
Suppose $A$ is diagonalizable and let $\lambda_1, \ldots, \lambda_k$ be the distinct eigenvalues of $A$, with algebraic multiplicities $m_1, \ldots, m_k$, respectively, and geometric multiplicities $d_1, \ldots, d_k$, respectively. Since $A$ is diagonalizable, Theorem th:eigenvectorsanddiagonalizable implies that $A$ has $n$ linearly independent eigenvectors, so that $d_1 + \cdots + d_k = n$. By applying Lemma lemma:dimeigenspace $k$ times, we have
$$n = d_1 + \cdots + d_k \leq m_1 + \cdots + m_k \leq n,$$
which is only possible if $d_i = m_i$ for $i = 1, \ldots, k$.

Conversely, if the geometric multiplicity equals the algebraic multiplicity of each eigenvalue, then obtaining a basis for each eigenspace yields $n$ eigenvectors. Applying Theorem th:linindepeigenvectors, we know that these $n$ eigenvectors are linearly independent, so Theorem th:eigenvectorsanddiagonalizable implies that $A$ is diagonalizable.
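
To see the test fail, consider the following Octave sketch with a hypothetical $2 \times 2$ matrix (not taken from the text) whose single eigenvalue has algebraic multiplicity $2$ but geometric multiplicity $1$.

% J is a made-up example: its characteristic polynomial is (x-2)^2,
% so lambda = 2 has algebraic multiplicity 2.
J = [2 1; 0 2];
size(null(J - 2*eye(2)), 2)   % returns 1: geometric multiplicity 1 < 2,
                              % so J is not diagonalizable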

Practice Problems

In this exercise you will "fill in the details" of Example ex:diagonalizematrix.
Find the eigenvalues of the matrix $A = \begin{bmatrix} 2 & 0 & 0 \\ 1 & 4 & -1 \\ -2 & -4 & 4 \end{bmatrix}$.
Find a basis for each eigenspace of the matrix $A$.
Compute the inverse of $P = \begin{bmatrix} -2 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & -2 \end{bmatrix}$.
Compute $P^{-1}AP$.
Show that computing the inverse of $P$ is not really necessary by comparing the products $AP$ and $PD$.
In each case, decide whether the matrix is diagonalizable. If so, find $P$ such that $P^{-1}AP$ is diagonal.
(a)
Diagonalizable / Not Diagonalizable   (correct answer: Diagonalizable)
(b)
Diagonalizable / Not Diagonalizable   (correct answer: Diagonalizable)
(c)
Diagonalizable / Not Diagonalizable   (correct answer: Diagonalizable)
(d)
Diagonalizable / Not Diagonalizable
Let $A$ denote an upper triangular matrix.
(a)
If all the main diagonal entries of $A$ are distinct, show that $A$ is diagonalizable.
(b)
If all the main diagonal entries of $A$ are equal, show that $A$ is diagonalizable only if it is already diagonal.

Answer: The eigenvalues of $A$ are all equal (they are the diagonal elements), so if $P^{-1}AP = D$ is diagonal, then $D = \lambda I$. Hence $A = PDP^{-1} = P(\lambda I)P^{-1} = \lambda I$, so $A$ is already diagonal.

(c)
Show that is diagonalizable but that is not diagonalizable.
Let $A$ be a diagonalizable $n \times n$ matrix with eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ (including multiplicities). Show that:
(a)
(b)
(a)
Show that two diagonalizable matrices are similar if and only if they have the same eigenvalues with the same multiplicities.
(b)
If $A$ is diagonalizable, show that .
(c)
Show that if

Text Source

The text in this section is a compilation of material from Section 7.2.1 of Ken Kuttler’s A First Course in Linear Algebra (CC-BY) and Section 5.5 of Keith Nicholson’s Linear Algebra with Applications (CC-BY-NC-SA).

Ken Kuttler, A First Course in Linear Algebra, Lyryx 2017, Open Edition, pp. 362-364.

Many of the Practice Problems are Exercises from W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition, pp. 298-310.