In this module we discuss algebraic multiplicity, geometric multiplicity, and their
relationship to diagonalizability.
EIG-0050: Diagonalizable Matrices and Multiplicity
Recall that a diagonal matrix is a matrix containing a zero in every entry except those on the main diagonal. More precisely, if $d_{ij}$ is the $ij$th entry of a diagonal matrix $D$, then $d_{ij} = 0$ unless $i = j$. Such matrices look like the following.
\[
D = \begin{bmatrix} * & 0 & \cdots & 0 \\ 0 & * & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & * \end{bmatrix},
\]
where $*$ is a number which might not be zero.
Let $A$ be an $n \times n$ matrix. Then $A$ is said to be diagonalizable if there exists an invertible matrix $P$ such that
\[
P^{-1}AP = D,
\]
where $D$ is a diagonal matrix.
Diagonal matrices have some nice properties, as we demonstrate below.
Let $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$ and let $D = \begin{bmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{bmatrix}$. Compute $DA$ and $AD$.
Notice the patterns present in the product matrices. Each row of $DA$ is the same as its corresponding row of $A$ multiplied by the scalar which is the corresponding diagonal element of $D$. In the product $AD$, it is the columns of $A$ that have been multiplied by the diagonal elements. These patterns hold in general for any diagonal matrix, and they are fundamental to understanding diagonalization.
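These patterns are easy to check numerically. The following is a minimal sketch using NumPy, with matrices chosen purely for illustration; it prints $DA$ and $AD$ so the row and column scaling can be seen directly.

```python
import numpy as np

# Matrices chosen only to illustrate the pattern; any A and diagonal D work.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
D = np.diag([5.0, 7.0])

print(D @ A)  # row i of A is scaled by the i-th diagonal entry of D
print(A @ D)  # column j of A is scaled by the j-th diagonal entry of D
```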
If we are given a matrix $A$ that is diagonalizable, then we can write $P^{-1}AP = D$ for some invertible matrix $P$, or, equivalently,
\[
AP = PD. \tag{eq:understand\_diag}
\]
If we pause to examine Equation (eq:understand_diag), the work that we did in Exploration init:multiplydiag can help us to understand how to find a matrix $P$ that will diagonalize $A$. The product $PD$ is formed by multiplying each column of $P$ by a scalar which is the corresponding element on the diagonal of $D$. To restate this, if $\mathbf{x}_i$ is column $i$ in our matrix $P$, then Equation (eq:understand_diag) tells us that
\[
A\mathbf{x}_i = \lambda_i \mathbf{x}_i, \tag{eq:ev\_ew\_diag}
\]
where $\lambda_i$ is the $i$th diagonal element of $D$.
Of course, Equation (eq:ev_ew_diag) is very familiar! We see that if we are able to diagonalize a matrix $A$, the columns of the matrix $P$ will be the eigenvectors of $A$, and the corresponding diagonal entries of $D$ will be the corresponding eigenvalues of $A$. This is summed up in the following theorem.
An $n \times n$ matrix $A$ is diagonalizable if and only if there is an invertible matrix $P$ given by
\[
P = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix},
\]
where the columns $\mathbf{x}_i$ are eigenvectors of $A$.
Moreover, if $A$ is diagonalizable, the corresponding eigenvalues of $A$ are the diagonal entries of the diagonal matrix $D$.
Proof
Suppose $P$ is given as above as an invertible matrix whose columns are eigenvectors of $A$. To show that $A$ is diagonalizable, we will show
\[
AP = PD,
\]
which is equivalent to $P^{-1}AP = D$. We have
\[
AP = \begin{bmatrix} A\mathbf{x}_1 & A\mathbf{x}_2 & \cdots & A\mathbf{x}_n \end{bmatrix},
\]
while
\[
PD = \begin{bmatrix} \lambda_1\mathbf{x}_1 & \lambda_2\mathbf{x}_2 & \cdots & \lambda_n\mathbf{x}_n \end{bmatrix}.
\]
We can complete this half of the proof by comparing columns, and noting that
\[
A\mathbf{x}_i = \lambda_i\mathbf{x}_i
\]
for $i = 1, \ldots, n$, since the $\mathbf{x}_i$ are eigenvectors of $A$ and the $\lambda_i$ are corresponding eigenvalues of $A$.
Conversely, suppose $A$ is diagonalizable so that $P^{-1}AP = D$. Let
\[
P = \begin{bmatrix} \mathbf{x}_1 & \mathbf{x}_2 & \cdots & \mathbf{x}_n \end{bmatrix},
\]
where the columns are the vectors $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n$, and
\[
D = \begin{bmatrix} \lambda_1 & & 0 \\ & \ddots & \\ 0 & & \lambda_n \end{bmatrix}.
\]
Then
\[
AP = PD = \begin{bmatrix} \lambda_1\mathbf{x}_1 & \lambda_2\mathbf{x}_2 & \cdots & \lambda_n\mathbf{x}_n \end{bmatrix},
\]
and so
\[
A\mathbf{x}_i = \lambda_i\mathbf{x}_i,
\]
showing the $\mathbf{x}_i$ are eigenvectors of $A$ and the $\lambda_i$ are eigenvalues.
Notice that because the matrix $P$ defined above is invertible, it follows that the set of eigenvectors of $A$, $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_n\}$, is a basis of $\mathbb{R}^n$.
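In computational practice this theorem is exactly what an eigenvalue routine produces. As a minimal sketch (the matrix is chosen for illustration), NumPy's `np.linalg.eig` returns the eigenvalues together with a matrix whose columns are corresponding eigenvectors, which plays the role of $P$:

```python
import numpy as np

A = np.array([[4.0, -2.0],
              [1.0,  1.0]])   # a sample diagonalizable matrix

eigenvalues, P = np.linalg.eig(A)   # columns of P are eigenvectors of A

# Conjugating by P recovers the diagonal matrix of eigenvalues.
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))   # diagonal entries are 2 and 3, in the order eig returned them
print(eigenvalues)
```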
We demonstrate the concept given in the above theorem in the next example. Note that not only are the columns of the matrix $P$ formed by eigenvectors, but $P$ must be invertible, and therefore must consist of a linearly independent set of eigenvectors.
Let
\[
A = \begin{bmatrix} 2 & 0 & 0 \\ 1 & 4 & -1 \\ -2 & -4 & 4 \end{bmatrix}.
\]
Find an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1}AP = D$.
We will use the eigenvectors of $A$ as the columns of $P$, and the corresponding eigenvalues of $A$ as the diagonal entries of $D$. The eigenvalues of $A$ are $\lambda_1 = 2$, $\lambda_2 = 2$, and $\lambda_3 = 6$. We leave these computations as exercises, as well as the computations to find a basis for each eigenspace.
One possible basis for $\mathcal{E}_2$, the eigenspace corresponding to $\lambda = 2$, is
\[
\left\{ \begin{bmatrix} -2 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} \right\},
\]
while a basis for $\mathcal{E}_6$ is given by
\[
\left\{ \begin{bmatrix} 0 \\ 1 \\ -2 \end{bmatrix} \right\}.
\]
We construct the matrix $P$ by using these basis elements as columns:
\[
P = \begin{bmatrix} -2 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & -2 \end{bmatrix}.
\]
You can verify (see the Practice Problems below) that
\[
P^{-1} = \frac{1}{4}\begin{bmatrix} -1 & 2 & 1 \\ 2 & 4 & 2 \\ 1 & 2 & -1 \end{bmatrix}.
\]
Thus,
\[
P^{-1}AP = \frac{1}{4}\begin{bmatrix} -1 & 2 & 1 \\ 2 & 4 & 2 \\ 1 & 2 & -1 \end{bmatrix}
\begin{bmatrix} 2 & 0 & 0 \\ 1 & 4 & -1 \\ -2 & -4 & 4 \end{bmatrix}
\begin{bmatrix} -2 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & -2 \end{bmatrix}
= \begin{bmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 6 \end{bmatrix}.
\]
You can see that the result here is a diagonal matrix where the entries on the main diagonal are the eigenvalues of $A$. Notice that the eigenvalues on the main diagonal must be in the same order as the corresponding eigenvectors in $P$.
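You can also confirm this computation numerically. This short NumPy sketch builds $P$ from the eigenspace bases found above and checks that $P^{-1}AP$ is the expected diagonal matrix:

```python
import numpy as np

A = np.array([[ 2,  0,  0],
              [ 1,  4, -1],
              [-2, -4,  4]], dtype=float)

# Columns are the eigenspace basis vectors found above, ordered lambda = 2, 2, 6.
P = np.array([[-2, 1,  0],
              [ 1, 0,  1],
              [ 0, 1, -2]], dtype=float)

print(np.linalg.inv(P) @ A @ P)  # diag(2, 2, 6)
```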
Consider the next important theorem.
Let $A$ be an $n \times n$ matrix, and suppose that $A$ has $k$ distinct eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_k$. For each $i$, let $\mathbf{x}_i$ be a $\lambda_i$-eigenvector of $A$. Then $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_k\}$ is linearly independent.
The corollary that follows from this theorem gives a useful tool in determining if $A$ is diagonalizable.
Let $A$ be an $n \times n$ matrix and suppose it has $n$ distinct eigenvalues. Then it follows that $A$ is diagonalizable.
The corollary tells us that many matrices can be diagonalized in this way. If we are able to diagonalize $A$, say $A = PDP^{-1}$, we say that $PDP^{-1}$ is an eigenvalue decomposition of $A$.
Not every matrix has an eigenvalue decomposition. Sometimes we cannot find an invertible matrix $P$ such that $P^{-1}AP = D$. Consider the following example.
Let
\[
A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}.
\]
If possible, find an invertible matrix $P$ and a diagonal matrix $D$ so that $P^{-1}AP = D$.
We see immediately (how?) that the eigenvalues of $A$ are $\lambda_1 = 1$ and $\lambda_2 = 1$. To find $P$, the next step would be to find a basis for the corresponding eigenspace $\mathcal{E}_1$. We solve the equation $(A - I)\mathbf{x} = \mathbf{0}$. Writing this equation as an augmented matrix, we already have a matrix in row echelon form:
\[
\left[\begin{array}{cc|c} 0 & 1 & 0 \\ 0 & 0 & 0 \end{array}\right].
\]
We see that the eigenvectors in $\mathcal{E}_1$ are of the form
\[
\begin{bmatrix} t \\ 0 \end{bmatrix} = t\begin{bmatrix} 1 \\ 0 \end{bmatrix},
\]
so a basis for the eigenspace $\mathcal{E}_1$ is given by $\left\{\begin{bmatrix} 1 \\ 0 \end{bmatrix}\right\}$. It is easy to see that we cannot form an invertible matrix $P$, because any two eigenvectors will be of the form $\begin{bmatrix} t \\ 0 \end{bmatrix}$, and so the second row of $P$ would be a row of zeros, and $P$ could not be invertible. Hence $A$ cannot be diagonalized.
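A computer algebra system reaches the same conclusion. As a brief sketch using SymPy (a tooling choice of ours, not part of the original text), `is_diagonalizable` reports the failure, and `eigenvects` shows that the eigenvalue $1$ contributes only a single independent eigenvector:

```python
from sympy import Matrix

A = Matrix([[1, 1],
            [0, 1]])

print(A.is_diagonalizable())  # False
# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenspace basis):
print(A.eigenvects())         # [(1, 2, [Matrix([[1], [0]])])]
```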
We saw earlier in Corollary th:distincteigenvalues that an $n \times n$ matrix with $n$ distinct eigenvalues is diagonalizable. It turns out that there are other useful diagonalizability tests.
Recall that the algebraic multiplicity of an eigenvalue $\lambda$ is the number of times that it occurs as a root of the characteristic polynomial.
The geometric multiplicity of an eigenvalue $\lambda$ is the dimension of the corresponding eigenspace $\mathcal{E}_\lambda$.
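Both multiplicities can be read off mechanically. In SymPy, for instance, `eigenvects` returns triples of the form (eigenvalue, algebraic multiplicity, eigenspace basis), so the geometric multiplicity is the length of the basis list. Here is a sketch applied to the matrix of Example ex:diagonalizematrix:

```python
from sympy import Matrix

A = Matrix([[ 2,  0,  0],
            [ 1,  4, -1],
            [-2, -4,  4]])

for eigenvalue, alg_mult, basis in A.eigenvects():
    geo_mult = len(basis)  # dimension of the eigenspace
    print(eigenvalue, alg_mult, geo_mult)  # prints 2 2 2, then 6 1 1
```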
Consider now the following lemma.
Let $A$ be an $n \times n$ matrix, and let $\mathcal{E}_{\lambda_1}$ be the eigenspace corresponding to the eigenvalue $\lambda_1$, which has algebraic multiplicity $m$. Then
\[
\dim\left(\mathcal{E}_{\lambda_1}\right) \le m.
\]
In other words, the geometric multiplicity of an eigenvalue is less than or equal to the algebraic multiplicity of that same eigenvalue.
Proof
Let $k$ be the geometric multiplicity of $\lambda_1$, i.e., $k = \dim\left(\mathcal{E}_{\lambda_1}\right)$. Suppose $\{\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_k\}$ is a basis for the eigenspace $\mathcal{E}_{\lambda_1}$. Let $P$ be any invertible matrix having $\mathbf{x}_1, \mathbf{x}_2, \ldots, \mathbf{x}_k$ as its first $k$ columns, say
\[
P = \begin{bmatrix} \mathbf{x}_1 & \cdots & \mathbf{x}_k & \mathbf{x}_{k+1} & \cdots & \mathbf{x}_n \end{bmatrix}.
\]
In block form we may write
\[
P = \begin{bmatrix} B & C \end{bmatrix} \quad \text{and} \quad P^{-1} = \begin{bmatrix} X \\ Y \end{bmatrix},
\]
where $B$ is $n \times k$, $C$ is $n \times (n-k)$, $X$ is $k \times n$, and $Y$ is $(n-k) \times n$. We observe $AB = \lambda_1 B$. This implies
\[
P^{-1}AP = \begin{bmatrix} X \\ Y \end{bmatrix} A \begin{bmatrix} B & C \end{bmatrix}
= \begin{bmatrix} XAB & XAC \\ YAB & YAC \end{bmatrix}
= \begin{bmatrix} \lambda_1 XB & XAC \\ \lambda_1 YB & YAC \end{bmatrix}.
\]
Since $P^{-1}P = I$ forces $XB = I_k$ and $YB = 0$, we therefore have
\[
P^{-1}AP = \begin{bmatrix} \lambda_1 I_k & XAC \\ 0 & YAC \end{bmatrix}.
\]
We finish the proof by comparing the characteristic polynomials on both sides of this equation, and making use of the fact that similar matrices have the same characteristic polynomials. We compute
\[
\det\left(zI - P^{-1}AP\right) = (z - \lambda_1)^k \det\left(zI_{n-k} - YAC\right),
\]
so the characteristic polynomial of $A$ has $(z - \lambda_1)^k$ as a factor. This tells us that the algebraic multiplicity of $\lambda_1$ is at least $k$, proving the desired inequality.
This result tells us that if $\lambda$ is an eigenvalue of $A$, then the number of linearly independent $\lambda$-eigenvectors is never more than the algebraic multiplicity of $\lambda$. We now use this fact to provide a useful diagonalizability condition.
Let $A$ be an $n \times n$ matrix. Then $A$ is diagonalizable if and only if for each eigenvalue $\lambda$ of $A$, the algebraic multiplicity of $\lambda$ is equal to the geometric multiplicity of $\lambda$.
Proof
Suppose $A$ is diagonalizable and let $\lambda_1, \lambda_2, \ldots, \lambda_k$ be the distinct eigenvalues of $A$, with algebraic multiplicities $m_1, m_2, \ldots, m_k$, respectively, and geometric multiplicities $g_1, g_2, \ldots, g_k$, respectively. Since $A$ is diagonalizable, Theorem th:eigenvectorsanddiagonalizable implies that $A$ has $n$ linearly independent eigenvectors, so that $g_1 + g_2 + \cdots + g_k = n$. By applying Lemma lemma:dimeigenspace $k$ times, and noting that the algebraic multiplicities can sum to at most $n$ (the degree of the characteristic polynomial), we have
\[
n = g_1 + g_2 + \cdots + g_k \le m_1 + m_2 + \cdots + m_k \le n,
\]
which is only possible if $g_i = m_i$ for $i = 1, 2, \ldots, k$.
Conversely, if the geometric multiplicity equals the algebraic multiplicity of each eigenvalue, then obtaining a basis for each eigenspace yields $n$ eigenvectors in total. Applying Theorem th:linindepeigenvectors, we know that these $n$ eigenvectors are linearly independent, so Theorem th:eigenvectorsanddiagonalizable implies that $A$ is diagonalizable.
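The theorem translates directly into a computational test. The following sketch (the function name is our own, chosen for illustration) compares the two multiplicities for every eigenvalue using SymPy:

```python
from sympy import Matrix

def diagonalizable_by_multiplicity(A: Matrix) -> bool:
    """Return True iff every eigenvalue of A has equal algebraic
    and geometric multiplicity (the criterion of the theorem above)."""
    return all(alg_mult == len(basis)
               for _, alg_mult, basis in A.eigenvects())

print(diagonalizable_by_multiplicity(Matrix([[1, 1], [0, 1]])))  # False
print(diagonalizable_by_multiplicity(Matrix([[2, 0], [0, 3]])))  # True
```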
Practice Problems
In this exercise you will "fill in the details" of Example ex:diagonalizematrix.
Find the eigenvalues of the matrix $A$.
Find a basis for each eigenspace of the matrix $A$.
Compute the inverse of $P$.
Compute $P^{-1}AP$.
Show that computing the inverse of $P$ is not really necessary by comparing the products $AP$ and $PD$.
Text Source
A large portion of the text in this module is an adaptation of Section 7.2.1 of Ken
Kuttler’s A First Course in Linear Algebra. (CC-BY)
Ken Kuttler, A First Course in Linear Algebra, Lyryx 2017, Open Edition, pp. 363-368.