We listed a few reasons why we are interested in finding eigenvalues and eigenvectors,
but we did not give any process for finding them. In this section we will focus on a
process which can be used for small matrices. For larger matrices, the best methods
we have are iterative methods, and we will explore some of these in The Power
Method and the Dominant Eigenvalue.
For an $n \times n$ matrix, we will see that the eigenvalues are the roots of a polynomial called
the characteristic polynomial. So finding eigenvalues is equivalent to solving a
polynomial equation of degree $n$. Finding the corresponding eigenvectors turns out to
be a matter of computing the null space of a matrix, as the following exploration
demonstrates.
If a vector $\mathbf{x}$ is an eigenvector satisfying Equation (def:eigen), then clearly it also satisfies $A\mathbf{x} - \lambda\mathbf{x} = \mathbf{0}$.
It seems natural at this point to try to factor. We would love to “factor out” $\mathbf{x}$. Here is
the procedure:
$$A\mathbf{x} - \lambda\mathbf{x} = \mathbf{0} \quad\Longrightarrow\quad A\mathbf{x} - \lambda I\mathbf{x} = \mathbf{0} \quad\Longrightarrow\quad (A - \lambda I)\mathbf{x} = \mathbf{0}.$$
The middle step was necessary before factoring because we cannot subtract the
scalar $\lambda$ from the $n \times n$ matrix $A$.
This shows that any eigenvector $\mathbf{x}$ of $A$ is in the null
space of the related matrix, $A - \lambda I$.
Since eigenvectors are non-zero vectors, this means that $A$ will have eigenvectors if and
only if the null space of $A - \lambda I$ is nontrivial. The only way that $\operatorname{null}(A - \lambda I)$ can be nontrivial is if $\operatorname{rank}(A - \lambda I) < n$.
If the rank of an $n \times n$ matrix is less than $n$, then the matrix is singular. Since $A - \lambda I$ must be
singular for any eigenvalue $\lambda$, we see that $\lambda$ is an eigenvalue of $A$ if and only if
$$\det(A - \lambda I) = 0.$$
In theory, Exploration exp:slowdown offers us a way to find eigenvalues. To find the eigenvalues of
$A$, one can solve Equation (eqn:chareqn) for $\lambda$.
Eigenvalues
The equation
$$\det(A - \lambda I) = 0$$
is called the characteristic equation of $A$.
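For a $2 \times 2$ matrix, the characteristic equation is the quadratic $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A) = 0$, which the quadratic formula solves directly. Here is a minimal Python sketch; the matrix entries are hypothetical, chosen only for illustration:

```python
import math

def char_poly_2x2(a, b, c, d):
    """Coefficients of det(A - lambda*I) = lambda^2 - (a+d)*lambda + (ad - bc)
    for the 2x2 matrix A = [[a, b], [c, d]]."""
    trace = a + d
    det = a * d - b * c
    return trace, det  # characteristic equation: lambda^2 - trace*lambda + det = 0

def eigenvalues_2x2(a, b, c, d):
    """Solve the characteristic equation with the quadratic formula
    (real eigenvalues only, for simplicity)."""
    trace, det = char_poly_2x2(a, b, c, d)
    r = math.sqrt(trace * trace - 4 * det)
    return sorted([(trace - r) / 2, (trace + r) / 2])

# Hypothetical matrix A = [[2, 1], [1, 2]]:
# characteristic equation lambda^2 - 4*lambda + 3 = 0
print(eigenvalues_2x2(2, 1, 1, 2))  # [1.0, 3.0]
```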
Let . Compute the eigenvalues of this matrix using the characteristic equation.
The characteristic equation has solutions and . These are the eigenvalues of
.
Let . Compute the eigenvalues of using the characteristic equation. (List your
answers in an increasing order.)
and
Let . Compute the eigenvalues of using the characteristic equation.
Matrix has eigenvalues and .
In Example ex:3x3eig, the factor appears twice. This repeated factor gives rise to the
eigenvalue . We say that the eigenvalue has algebraic multiplicity .
The three examples above are a bit contrived. It is not always possible to completely
factor the characteristic polynomial using only real numbers. However, a fundamental
fact from algebra is that every degree $n$ polynomial has $n$ roots (counting multiplicity)
provided that we allow complex numbers. This is why sometimes eigenvalues and
their corresponding eigenvectors involve complex numbers. The next example
illustrates this point.
Let . Compute the eigenvalues of this matrix.
So one of the eigenvalues of is . To get the other eigenvalues we must solve .
Using the quadratic formula, we compute that and are also eigenvalues of
.
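When the discriminant of the characteristic polynomial is negative, the quadratic formula still applies but produces complex eigenvalues. A sketch using Python's `cmath`; the rotation-style matrix below is a hypothetical illustration, not the matrix from the example above:

```python
import cmath

def eigenvalues_2x2_complex(a, b, c, d):
    """Roots of lambda^2 - (a+d)*lambda + (ad - bc) = 0 for the 2x2 matrix
    A = [[a, b], [c, d]], allowing complex values."""
    trace = a + d
    det = a * d - b * c
    r = cmath.sqrt(trace * trace - 4 * det)  # complex square root
    return (trace - r) / 2, (trace + r) / 2

# Hypothetical 90-degree rotation matrix [[0, -1], [1, 0]]:
# characteristic equation lambda^2 + 1 = 0, so the eigenvalues are -i and i.
print(eigenvalues_2x2_complex(0, -1, 1, 0))
```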
Let . Compute the eigenvalues of this matrix. (List your answers in an increasing
order.)
What do you observe about the eigenvalues?
The eigenvalues are the diagonal entries
of the matrix.
What property of the matrix makes this “coincidence” possible?
is a triangular matrix.
The matrix in Exploration Problem init:3x3tri is a triangular matrix, and the property you
observed holds in general.
Let $A$ be a triangular matrix. Then the eigenvalues of $A$ are the entries on the main
diagonal.
Let $A$ be a diagonal matrix. Then the eigenvalues of $A$ are the entries on the main
diagonal.
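This can be sanity-checked numerically: for a triangular matrix, substituting any diagonal entry for $\lambda$ makes $A - \lambda I$ triangular with a zero on its diagonal, so $\det(A - \lambda I) = 0$. A small sketch with a hypothetical upper-triangular matrix:

```python
def det(M):
    """Determinant by cofactor expansion along the first row
    (fine for small matrices)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

def shift(M, lam):
    """Return M - lam*I."""
    return [[M[i][j] - (lam if i == j else 0) for j in range(len(M))]
            for i in range(len(M))]

# Hypothetical upper-triangular matrix: each diagonal entry 2, 5, 7
# should satisfy the characteristic equation det(A - lambda*I) = 0.
A = [[2, 1, 4],
     [0, 5, 3],
     [0, 0, 7]]
for lam in (2, 5, 7):
    print(lam, det(shift(A, lam)))  # each determinant is 0
```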
One final note about eigenvalues. We began this section with the observation that, in theory,
one can find the eigenvalues of $A$ by solving Equation (eqn:chareqn) for $\lambda$. In general, one does
not attempt to compute eigenvalues by solving the characteristic equation of a
matrix, as there is no simple way to solve this polynomial equation for $\lambda$. Instead,
one can often approximate the eigenvalues using iterative methods. We will
explore some of these techniques in The Power Method and the Dominant
Eigenvalue.
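As a preview of those iterative techniques, here is a minimal power-iteration sketch in plain Python. The matrix and starting vector are hypothetical, and the method assumes a unique largest-magnitude eigenvalue and a starting vector not orthogonal to its eigenvector:

```python
def power_method(A, x, steps=50):
    """Power iteration sketch: repeatedly apply A and normalize, then use the
    Rayleigh quotient to estimate the dominant eigenvalue."""
    for _ in range(steps):
        y = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]
        norm = max(abs(v) for v in y)
        x = [v / norm for v in y]
    # Rayleigh quotient x.(Ax) / x.x approximates the dominant eigenvalue
    Ax = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]
    return sum(a * b for a, b in zip(x, Ax)) / sum(v * v for v in x)

# Hypothetical matrix with eigenvalues 1 and 3; the iteration approaches 3.
print(power_method([[2, 1], [1, 2]], [1.0, 0.0]))  # approximately 3.0
```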
Eigenvectors
Once we have computed an eigenvalue $\lambda$ of an $n \times n$ matrix $A$, the next step is to compute
the associated eigenvectors. In other words, we seek vectors $\mathbf{x}$ such that $A\mathbf{x} = \lambda\mathbf{x}$, or
equivalently,
$$(A - \lambda I)\mathbf{x} = \mathbf{0}.$$
For any given eigenvalue $\lambda$ there are infinitely many eigenvectors associated with it. In
fact, the eigenvectors associated with $\lambda$ form a subspace of $\mathbb{R}^n$.
Let $A$ be an $n \times n$ matrix and let $\lambda$ be an eigenvalue of $A$. Then the set of all eigenvectors
associated with $\lambda$ is a subspace of $\mathbb{R}^n$.
The set of all eigenvectors associated with a given eigenvalue of a matrix is known as
the eigenspace associated with that eigenvalue.
So given an eigenvalue $\lambda$, there is an associated eigenspace, and our goal is to find a
basis of that eigenspace, for then any eigenvector will be a linear combination of the vectors in that
basis. Moreover, we are trying to find a basis for the set of vectors that
satisfy Equation eqn:nullspace, which means we seek a basis for $\operatorname{null}(A - \lambda I)$. We have already learned
how to compute a basis of a null space; see Subspaces Associated with
Matrices.
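Once row reduction has produced a candidate basis vector for the null space of $A - \lambda I$, it is easy to verify that it really is an eigenvector. A sketch using the hypothetical matrix $A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$, which has eigenvalues 1 and 3:

```python
def mat_vec(A, v):
    """Matrix-vector product."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def is_eigenvector(A, lam, v, tol=1e-9):
    """Check that nonzero v lies in the null space of A - lam*I,
    i.e. that A v = lam v."""
    Av = mat_vec(A, v)
    return all(abs(Av[i] - lam * v[i]) < tol for i in range(len(v))) and any(v)

# Hypothetical A = [[2, 1], [1, 2]] with eigenvalue 3:
# A - 3I = [[-1, 1], [1, -1]] row-reduces to [[1, -1], [0, 0]],
# so its null space is spanned by (1, 1).
A = [[2, 1], [1, 2]]
print(is_eigenvector(A, 3, [1, 1]))   # True
print(is_eigenvector(A, 1, [1, -1]))  # True
```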
Let’s return to the examples we did in the first part of this section.
Recall that has eigenvalues and . Compute a basis for the eigenspace
associated with each of these eigenvalues.
Eigenvectors associated with the
eigenvalue are in the null space of . So we seek a basis for . We compute:
From this we see that the eigenspace associated with consists of vectors of the form .
This means that is one possible basis for .
In a similar way, we compute a basis for , the subspace of all
eigenvectors associated with the eigenvalue . Now we compute:
Vectors in the null space have the form . This means that is one possible basis for the
eigenspace .
(Finding eigenvectors for Example ex:2x2eig2) We know from Example ex:2x2eig2 that has eigenvalues and .
Compute a basis for the eigenspace associated with each of these eigenvalues.
Let’s begin
by finding a basis for the eigenspace , which is the subspace of consisting of eigenvectors
corresponding to the eigenvalue . We need to compute a basis for . We compute:
From this we see that an eigenvector in has the form . This means that is one
possible basis for the eigenspace . By letting , we obtain an arguably nicer-looking
basis: .
See if you can compute a basis for .
Click on the arrow if you need help.
To compute a basis for , the subspace of all eigenvectors associated to the eigenvalue ,
we compute:
From this we find that is one possible basis for the eigenspace .
(Finding eigenvectors for Example ex:3x3eig) We know from Example ex:3x3eig that has eigenvalues
and . Compute a basis for the eigenspace associated to each of these eigenvalues.
We
first find a basis for the eigenspace . We need to compute a basis for . We compute:
Notice that there are two free variables. The eigenvectors in have the form
So one possible basis for the eigenspace is given by .
Next we find a basis for the eigenspace . We need to compute a basis for . We compute:
This time there is one free variable. The eigenvectors in have the form , so a possible
basis for the eigenspace is given by .
(Finding eigenvectors for Example ex:3x3_complex_eig) We know from Example ex:3x3_complex_eig that has eigenvalues ,
, and . Compute a basis for the eigenspace associated with each eigenvalue.
We first
find a basis for the eigenspace . We need to compute a basis for . We compute:
From this we see that for any eigenvector in we have and , but is a free variable.
So one possible basis for the eigenspace is given by
Next we find a basis for the eigenspace . We need to compute a basis for . We compute:
There is one free variable. Setting , we get and . From this we see that
eigenvectors in have the form , so a possible basis for the eigenspace is
given by . We ask you in Practice Problem prob:3x3_complex_ev to show that is a basis for
.
We conclude this section by establishing the significance of a matrix having an
eigenvalue of zero.
A square matrix has an eigenvalue of zero if and only if it is singular.
Proof
A square matrix $A$ is singular if and only if $\det A = 0$ (see th:detofsingularmatrix). But $\det A = 0$ if and only if $\det(A - 0I) = 0$,
which is true if and only if zero is an eigenvalue of $A$.
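Numerically, the theorem says $\lambda = 0$ is a root of the characteristic polynomial exactly when $\det A = 0$. A sketch with a hypothetical singular $2 \times 2$ matrix:

```python
def char_poly_at(A, lam):
    """Evaluate det(A - lam*I) for a 2x2 matrix A."""
    a, b = A[0]
    c, d = A[1]
    return (a - lam) * (d - lam) - b * c

# Hypothetical singular matrix (second row is twice the first):
A = [[1, 2], [2, 4]]
print(char_poly_at(A, 0))  # 0: det(A) = 0, so 0 is an eigenvalue
# The other eigenvalue is 5, since the characteristic equation
# here is lambda^2 - 5*lambda = 0.
print(char_poly_at(A, 5))  # 0
```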
Practice Problems
Problems prob:eigenspace1-prob:eigenspace2 In this exercise we will prove that the eigenvectors associated with an
eigenvalue $\lambda$ of an $n \times n$ matrix $A$ form a subspace of $\mathbb{R}^n$.
Let $\mathbf{x}$ and $\mathbf{y}$ be eigenvectors of $A$ associated with $\lambda$. Show that $\mathbf{x} + \mathbf{y}$ is also an eigenvector of $A$
associated with $\lambda$. (This shows that the set of eigenvectors of $A$ associated with $\lambda$ is
closed under addition.)
Show that the set of eigenvectors of $A$ associated with $\lambda$ is closed under scalar
multiplication.
Compute a basis for each of the eigenspaces of this matrix, , , and .
Answer: A basis for is , a basis for is ,
and a basis for is .
Complete Example ex:3x3_complex_ev by showing that a basis for is given by , where is the
eigenspace associated with the eigenvalue of the matrix .
Prove Theorem th:eigtri. (HINT: Proceed by induction on the dimension $n$. For the
inductive step, compute $\det(A - \lambda I)$ by expanding along the first column (or row) if $A$ is upper
(lower) triangular.)
Recall that a vertical stretch/compression of the plane is a linear transformation
whose standard matrix is
Find the eigenvalues of . Find a basis for the eigenspace corresponding to each
eigenvalue.
Answer: A basis for is and a basis for is
Sketch several vectors in each eigenspace and use geometry to explain why the
eigenvectors you sketched make sense.
Recall that a horizontal shear of the plane is a linear transformation whose standard
matrix is
Find the eigenvalue of .
Answer:
Find a basis for the eigenspace corresponding to .
Answer: A basis for is
Sketch several vectors in the eigenspace and use geometry to explain why the
eigenvectors you sketched make sense.