We introduce the concepts of eigenvalues and eigenvectors of a matrix.
EIG-0010: Describing Eigenvalues and Eigenvectors Algebraically and Geometrically
At several places in this course it has been valuable to restrict ourselves
to square matrices, and we do so again when discussing eigenvalues and
eigenvectors.
In Theorem th:matrixtran of Module LTR-0010, we proved that any $n\times n$ matrix induces a linear
transformation from $\mathbb{R}^n$ to itself. For our first few examples, let us consider the case
$n=2$.
The following animation helps us to visualize the matrix transformation
associated with a particular $2\times 2$ matrix.
Given a vector in $\mathbb{R}^2$, its image under the transformation is also in $\mathbb{R}^2$. For many vectors, the image will not be pointing in
the same direction as the original vector. This is the case for any of the gray vectors in the animation,
as we can see that each gray vector's image points in a different direction than the vector itself. But if we look at the
red vectors (which are all parallel to one another), we notice that they appear unchanged in
magnitude and direction. Such vectors are sometimes called fixed vectors of
the matrix.
Looking next at the blue vectors (again, all parallel to one another), we observe that the
magnitudes of the vectors are changed, but the direction in which the blue vectors
point is unchanged by this linear transformation.
In Exploration init:eignintro we found that certain vectors do not change direction under the
linear transformation induced by the matrix. Such vectors are examples of eigenvectors
of the matrix.
In general, any nonzero vector whose image under a matrix transformation is parallel
to the original vector is called an eigenvector of the matrix that induced the
transformation. The following definition captures this idea algebraically.
Let $A$ be an $n\times n$ matrix. We say that a non-zero vector $\vec{x}$ is an eigenvector of $A$
if
$$A\vec{x} = \lambda\vec{x}$$
for some scalar $\lambda$. We say that $\lambda$ is an eigenvalue of $A$ associated with the eigenvector
$\vec{x}$.
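As a quick numerical sanity check (not part of the original module), the sketch below verifies the defining equation for an assumed $2\times 2$ matrix and then lets numpy compute all eigenpairs at once. The matrix and the candidate eigenpair are illustrative choices, not taken from the text.

    import numpy as np

    # Assumed example matrix, chosen for illustration (its eigenvalues are 3 and 1).
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    x = np.array([1.0, 1.0])   # candidate eigenvector
    lam = 3.0                  # candidate eigenvalue

    # The defining equation: A x should equal lambda x.
    print(np.allclose(A @ x, lam * x))      # True

    # numpy can compute all eigenvalues and eigenvectors directly.
    eigenvalues, eigenvectors = np.linalg.eig(A)
    print(eigenvalues)                      # [3. 1.] (order may vary)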
In Exploration init:eignintro we observed visually that the blue vectors were eigenvectors, as
they changed length but did not change direction under the linear transformation. To
verify this algebraically, observe that every blue vector is a non-zero scalar multiple of
one particular vector, and a direct computation of the image of such a scalar multiple shows that it is an eigenvector of the matrix with a
corresponding eigenvalue of 3.
The fixed (red) vectors of Exploration init:eignintro are also eigenvectors. A fixed vector is mapped to itself,
that is, to $1$ times itself, so each fixed vector is an eigenvector of the matrix with a corresponding
eigenvalue of $1$.
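In general, this kind of verification works for any eigenpair: if $A\vec{v} = \lambda\vec{v}$ and $k$ is a non-zero scalar, then
$$A(k\vec{v}) = k(A\vec{v}) = k(\lambda\vec{v}) = \lambda(k\vec{v}),$$
so every non-zero scalar multiple of an eigenvector is again an eigenvector associated with the same eigenvalue.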
A couple of finer points of Definition def:eigen require clarification.
The definition requires that eigenvectors be non-zero. Imagine what would
happen if we allowed the zero vector to be an eigenvector of $A$. Since $A\vec{0} = \vec{0} = \lambda\vec{0}$ for every scalar $\lambda$, this
would mean that every number would be an eigenvalue of every matrix. Because
eigenvalues are supposed to capture certain information about the matrix,
allowing every number to be an eigenvalue of every matrix would defeat
the purpose.
Up to now we have talked about eigenvectors as vectors whose images
under a matrix transformation are parallel to the original vectors. But
the algebraic definition allows non-zero vectors that map to zero to be
considered eigenvectors. (What would an eigenvalue of such an eigenvector
be?) The zero vector has no direction, so we cannot say that the image
of such an eigenvector is parallel to the original vector. Example ex:eigen will
illustrate this point.
Consider the matrix $\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$, which takes a vector in $\mathbb{R}^2$ and projects it onto the $x$-axis, as we
learned in a Practice Problem of Module LTR-0020. Which vectors in $\mathbb{R}^2$ would be
the eigenvectors, and what are the corresponding eigenvalues?
Since the image of a vector under this transformation is its
projection onto the $x$-axis, in many cases a vector and its image are not parallel. Notice,
however, that all of the red vectors located along the $x$-axis in the diagram are
fixed by the transformation. So, for any of the red vectors the image is the vector itself, that is, $1$ times the vector, which means that each of
the red vectors is an eigenvector of the matrix with the corresponding eigenvalue of
$1$.
The blue vectors along the $y$-axis are also eigenvectors. To see this, note that each of
the blue vectors is of the form $\begin{bmatrix} 0 \\ k \end{bmatrix}$ for some non-zero scalar $k$. But then
$$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} 0 \\ k \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} = 0\begin{bmatrix} 0 \\ k \end{bmatrix}.$$
So each of the blue vectors is an eigenvector of the matrix with the corresponding eigenvalue of
$0$.
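For readers who want to check this numerically, here is a brief sketch (using numpy; the code itself is not part of the module) confirming that the projection matrix has exactly the eigenvalues $1$ and $0$ found above.

    import numpy as np

    # Standard matrix of the projection of R^2 onto the x-axis.
    P = np.array([[1.0, 0.0],
                  [0.0, 0.0]])

    eigenvalues, eigenvectors = np.linalg.eig(P)
    print(eigenvalues)        # [1. 0.]
    print(eigenvectors)       # columns lie along the x-axis and the y-axis

    # A vector along the y-axis is sent to the zero vector: eigenvalue 0.
    y = np.array([0.0, 5.0])
    print(P @ y)              # [0. 0.]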
A natural question is this: does every square matrix have eigenvalues and
eigenvectors? We will see in Module EIG-0020 that the answer to this question
is “yes”, provided that we permit eigenvalues and entries of eigenvectors
to be complex numbers. The next example is one that requires complex
numbers.
Consider the matrix that takes any vector in $\mathbb{R}^2$ and rotates it through a fixed angle, as we saw in Example ex:rotate45 of
Module LTR-0070.
Since this matrix rotates every vector in $\mathbb{R}^2$, every nonzero vector changes direction, so there are
no eigenvectors in the plane. It turns out that the matrix does have eigenvectors and
eigenvalues, but in order to find them we need to work with vectors whose
entries are complex numbers. Since these vectors are not in $\mathbb{R}^2$, we cannot see
them.
A direct computation with a suitable vector whose entries are complex numbers shows that its image under the rotation is a complex scalar multiple of the vector itself. That vector is therefore an eigenvector of the matrix, and the complex scalar is its corresponding eigenvalue.
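The specific rotation used in this example is not reproduced here, but the idea can be illustrated numerically with an assumed angle; the sketch below (not from the module) uses a 90 degree rotation, whose eigenvalues come out as the complex numbers $\pm i$.

    import numpy as np

    # Illustrative rotation matrix: rotation of R^2 through 90 degrees
    # (an assumed angle; the module's example may use a different one).
    theta = np.pi / 2
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    eigenvalues, eigenvectors = np.linalg.eig(R)
    print(eigenvalues)          # approximately [0.+1.j, 0.-1.j]

    # Verify R v = lambda v for the first eigenpair (complex arithmetic).
    v = eigenvectors[:, 0]
    print(np.allclose(R @ v, eigenvalues[0] * v))   # True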
We will continue to work with complex numbers as we study eigenvalues and
eigenvectors.
Why All the Fuss About Eigenvalues and Eigenvectors?
The first in-depth study of eigenvalues can probably be attributed to Fourier, who
encountered them early in the nineteenth century while studying partial differential
equations, in particular what is known as the heat equation [Trefethen and
Embree]. By the twentieth century, mathematicians understood the connections
between differential equations and eigenvalues. Systems of differential equations are
often best represented by matrices, especially in the context of using computers to
find numerical solutions. Most algorithms to solve these systems work by iterating
some process, and eigenvalues along with their corresponding eigenvectors indicate
what will happen to such a process after many repetitions.
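To make the phrase "iterating some process" concrete, here is a minimal sketch of the power method (an illustration under assumed data, not an algorithm from the text): repeatedly multiplying a vector by a matrix and renormalizing pulls the vector toward an eigenvector for the eigenvalue of largest magnitude.

    import numpy as np

    # Assumed example matrix with eigenvalues 3 and 1.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    x = np.array([1.0, 0.0])            # arbitrary starting vector
    for _ in range(50):
        x = A @ x
        x = x / np.linalg.norm(x)       # renormalize to avoid overflow

    print(x)                            # approximately [0.707, 0.707]
    print(x @ A @ x)                    # Rayleigh quotient, approximately 3.0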
The most famous modern example of a large-scale eigenvalue problem is the Google
PageRank algorithm, which helped set Google apart from its competitors as a search
engine. Some of the relevant mathematics can be learned by working through the
paper “The $25,000,000,000 Eigenvector: The Linear Algebra Behind Google” by
Kurt Bryan and Tanya Leise.
Practice Problems
Let .
Show that is an eigenvector of . What is its corresponding eigenvalue?
Show that is an eigenvector of . What is its corresponding eigenvalue?
Show that is an eigenvector of . What is its corresponding eigenvalue?
Let . Note that takes any vector in and projects it onto the -axis, as we learned in
Practice Problem of Module LTR-0020. Which vectors in would be eigenvectors,
and what are the corresponding eigenvalues?
Returning to Example ex:eigsrotation, let . Show that is an eigenvector of . What is its
corresponding eigenvalue?
Arguing geometrically, identify the linear transformation whose standard matrix has
eigenvalues and .
Vertical Shear
Horizontal Shear
Counterclockwise Rotation through an angle
Reflection about a line
Horizontal Stretch
Vertical Stretch
Let . Can you find an eigenvector and its corresponding eigenvalue? Can you find
another “eigenpair”? Can you find all of the eigenvectors of ?
The rotation matrix in Example ex:eigsrotation has complex eigenvectors and eigenvalues. Think
geometrically to find an example of a (non-identity) rotation matrix with real
eigenvectors and eigenvalues.
Answer: Rotation through 180 degrees.
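Indeed, rotation of $\mathbb{R}^2$ through $180$ degrees has standard matrix $\begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix} = -I$, so every nonzero vector $\vec{v}$ satisfies $-I\vec{v} = (-1)\vec{v}$ and is an eigenvector with real eigenvalue $-1$.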
Can an eigenvalue have multiple eigenvectors associated with it?
Yes / No
Can an eigenvector have multiple eigenvalues associated with it?
Yes / No
Bibliography
[Trefethen and Embree] Trefethen, Lloyd N., and Embree, Mark, Spectra and
Pseudospectra: The Behavior of Nonnormal Matrices and Operators, Princeton University Press, 2005, pp. 5-6.