We explore the theory behind finding the eigenvalues and associated eigenvectors of a square matrix.
Let $A$ be an $n \times n$ matrix. In Module EIG-0010 we learned that the eigenvectors and eigenvalues of $A$ are vectors $\vec{x}$ and scalars $\lambda$ that satisfy the equation
$$A\vec{x} = \lambda\vec{x} \qquad \text{(def:eigen)}$$
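For instance, take the illustrative matrix and vector
$$A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}, \qquad \vec{x} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}.$$
Then
$$A\vec{x} = \begin{bmatrix} 1+2 \\ 2+1 \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \end{bmatrix} = 3\vec{x},$$
so $\vec{x}$ is an eigenvector of this particular $A$ with eigenvalue $\lambda = 3$. We will reuse this matrix for illustrations below.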
We listed a few reasons why we are interested in finding eigenvalues and eigenvectors, but we did not give any process for finding them. In this module we will focus on the process.
If a vector $\vec{x}$ is an eigenvector satisfying Equation (def:eigen), then it also satisfies the following equations:
$$A\vec{x} - \lambda\vec{x} = \vec{0}$$
$$(A - \lambda I)\vec{x} = \vec{0}$$
This shows that any eigenvector $\vec{x}$ of $A$ is in the null space of the related matrix, $A - \lambda I$. Since eigenvectors are non-zero vectors, this means that $A$ will have eigenvectors associated with $\lambda$ if and only if the null space of $A - \lambda I$ is nontrivial. The only way that the null space of $A - \lambda I$ can be nontrivial is if the rank of $A - \lambda I$ is less than $n$.
If the rank of an $n \times n$ matrix is less than $n$, then the matrix is singular. Since $A - \lambda I$ must be singular for any eigenvalue $\lambda$, Theorem th:detofsingularmatrix implies that $\lambda$ is an eigenvalue of $A$ if and only if
$$\det(A - \lambda I) = 0 \qquad \text{(eqn:chareqn)}$$
In theory, then, to find the eigenvalues of $A$, one can solve Equation (eqn:chareqn) for $\lambda$.
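For the illustrative matrix above, this works out as follows:
$$\det(A - \lambda I) = \det \begin{bmatrix} 1 - \lambda & 2 \\ 2 & 1 - \lambda \end{bmatrix} = (1 - \lambda)^2 - 4 = \lambda^2 - 2\lambda - 3 = (\lambda - 3)(\lambda + 1).$$
Setting this polynomial equal to zero gives the eigenvalues $\lambda = 3$ (matching the computation above) and $\lambda = -1$.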
In Example ex:3x3eig, one of the linear factors appears twice in the characteristic polynomial. This repeated factor gives rise to a repeated eigenvalue. We say that this eigenvalue has algebraic multiplicity $2$.
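For instance (a hypothetical factorization, not the one from Example ex:3x3eig), a $3 \times 3$ matrix with characteristic polynomial
$$\det(A - \lambda I) = -(\lambda - 2)^2 (\lambda - 5)$$
has eigenvalue $\lambda = 2$ with algebraic multiplicity $2$ and eigenvalue $\lambda = 5$ with algebraic multiplicity $1$.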
The three examples above are a bit contrived. It is not always possible to completely factor the characteristic polynomial into linear factors using only real numbers. However, a fundamental fact from algebra is that every degree-$n$ polynomial has $n$ roots (counting multiplicity) provided that we allow complex numbers. This is why eigenvalues and their corresponding eigenvectors sometimes involve complex numbers. The next example illustrates this point.
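For a quick illustration, the rotation matrix
$$A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$$
has characteristic equation
$$\det(A - \lambda I) = \lambda^2 + 1 = 0,$$
which has no real roots: the eigenvalues are the complex numbers $\lambda = i$ and $\lambda = -i$.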
Find the eigenvalues of the matrix given in Exploration Problem init:3x3tri. What do you observe about the eigenvalues?
What property of the matrix makes this “coincidence” possible?
The matrix in Exploration Problem init:3x3tri is a triangular matrix, and the property you observed holds in general: the eigenvalues of a triangular matrix are the entries on its main diagonal.
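For example, for the upper triangular matrix
$$A = \begin{bmatrix} 2 & 1 & 4 \\ 0 & 3 & 5 \\ 0 & 0 & 7 \end{bmatrix},$$
the matrix $A - \lambda I$ is also triangular, and the determinant of a triangular matrix is the product of its diagonal entries, so
$$\det(A - \lambda I) = (2 - \lambda)(3 - \lambda)(7 - \lambda).$$
The eigenvalues are $\lambda = 2$, $\lambda = 3$, and $\lambda = 7$, precisely the entries on the main diagonal.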
One final note about eigenvalues. We began this section with the sentence, “In theory, then, to find the eigenvalues of $A$, one can solve Equation (eqn:chareqn) for $\lambda$.” In general, one does not attempt to compute eigenvalues by solving the characteristic equation of a matrix, as there is no simple way to solve such an equation for $\lambda$. Instead, one can often approximate the eigenvalues using iterative methods.
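As a minimal sketch of what such an iterative method can look like, here is power iteration in Python with NumPy (an illustrative implementation, not the module's method); it approximates only the dominant eigenvalue, the one of largest absolute value, together with an associated eigenvector.

```python
import numpy as np

def power_iteration(A, num_iters=1000):
    """Approximate the dominant eigenvalue and an eigenvector of A.

    Repeatedly multiplying a vector by A stretches it toward the
    eigenvector whose eigenvalue has the largest absolute value.
    """
    x = np.random.default_rng(seed=0).standard_normal(A.shape[0])
    for _ in range(num_iters):
        y = A @ x
        x = y / np.linalg.norm(y)  # renormalize to prevent overflow
    # Rayleigh quotient: eigenvalue estimate for the unit vector x
    return x @ (A @ x), x

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, x = power_iteration(A)
print(lam)              # approximately 3, the dominant eigenvalue
print(A @ x - lam * x)  # approximately the zero vector
```

In practice one calls a library routine such as NumPy's np.linalg.eig, which uses more robust iterative algorithms and returns all eigenvalues at once.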
Once we have computed an eigenvalue $\lambda$ of an $n \times n$ matrix $A$, the next step is to compute the associated eigenvectors. In other words, we seek vectors $\vec{x}$ such that $A\vec{x} = \lambda\vec{x}$, or equivalently,
$$(A - \lambda I)\vec{x} = \vec{0} \qquad \text{(eqn:nullspace)}$$
For any given eigenvalue $\lambda$ there are infinitely many eigenvectors associated with it. In fact, the eigenvectors associated with $\lambda$, together with the zero vector, form a subspace of $\mathbb{R}^n$ (see Practice Problems prob:eigenspace1 and prob:eigenspace2). This motivates the following definition.
So given an eigenvalue $\lambda$, there is an associated eigenspace, and our goal is to find a basis of it, for then any eigenvector associated with $\lambda$ will be a linear combination of the vectors in that basis. Moreover, the eigenspace is exactly the set of vectors that satisfy Equation (eqn:nullspace), which means we seek a basis for the null space of $A - \lambda I$. We have already learned how to compute a basis of a null space; see Module VSP-0040.
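To see how this works, return to the illustrative matrix $A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}$ used above, whose eigenvalues we computed to be $\lambda = 3$ and $\lambda = -1$. For $\lambda = 3$ we row reduce:
$$A - 3I = \begin{bmatrix} -2 & 2 \\ 2 & -2 \end{bmatrix} \quad \rightsquigarrow \quad \begin{bmatrix} 1 & -1 \\ 0 & 0 \end{bmatrix}.$$
The equation $x_1 - x_2 = 0$ makes $x_2$ a free variable, so every vector in the null space has the form $x_2 \begin{bmatrix} 1 \\ 1 \end{bmatrix}$, and $\left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\}$ is a basis for the eigenspace associated with $\lambda = 3$.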
Let’s return to the examples we did in the first section of this module.
Recall that $A$ has eigenvalues $\lambda_1$ and $\lambda_2$. Compute a basis for the eigenspace associated with each of these eigenvalues.
From this we see that the eigenspace associated with $\lambda_1$ consists of all scalar multiples of a single vector, and that vector by itself is one possible basis for the eigenspace.
In a similar way, we compute a basis for the eigenspace associated with the eigenvalue $\lambda_2$, the subspace of all eigenvectors associated with $\lambda_2$. Now we compute:
Vectors in the null space are again the scalar multiples of a single vector. This means that this vector is one possible basis for the eigenspace associated with $\lambda_2$.
From this we see that an eigenvector in this eigenspace is a scalar multiple of a single vector, which gives one possible basis for the eigenspace. By letting the free variable take a convenient value, we obtain an arguably nicer-looking basis.
To compute a basis for the eigenspace of the next eigenvalue, the subspace of all eigenvectors associated with it, we compute:
From this we find one possible basis for the eigenspace.
Notice that there are two free variables. The eigenvectors in this eigenspace are therefore linear combinations of two linearly independent vectors, one for each free variable, and these two vectors form one possible basis for the eigenspace.
Next we find a basis for the eigenspace of the remaining eigenvalue. We need a basis for the corresponding null space, so we compute:
This time there is one free variable. The eigenvectors in this eigenspace are the scalar multiples of a single vector, so that vector gives a possible basis for the eigenspace.
From this we see that for any eigenvector in this eigenspace, two of the components are determined, while the remaining component is a free variable. So one possible basis for the eigenspace is given by the single corresponding vector.

Next we find a basis for the eigenspace of the other eigenvalue. We need a basis for the corresponding null space, so we compute:
There is one free variable. Assigning it a value determines the other two components, and we see that the eigenvectors in this eigenspace are the scalar multiples of a single vector, so that vector gives a possible basis for the eigenspace. We ask you in Practice Problem prob:3x3_complex_ev to show that a particular vector is a basis for the remaining eigenspace.
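Hand computations like the ones in this section can be cross-checked numerically. For instance, with the illustrative $2 \times 2$ matrix used earlier, NumPy's np.linalg.eig recovers both eigenvalues and a basis vector for each eigenspace:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)  # 3 and -1, in some order
for lam, v in zip(eigenvalues, eigenvectors.T):
    # Each column of `eigenvectors` spans the eigenspace of the
    # corresponding eigenvalue, so A @ v should equal lam * v.
    print(np.allclose(A @ v, lam * v))  # True for each pair
```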
Answer: A basis for the first eigenspace is __________, and a basis for the second eigenspace is __________.
Sketch several vectors in each eigenspace and use geometry to explain why the eigenvectors you sketched make sense.
Answer:
Find a basis for the eigenspace corresponding to this eigenvalue.
Answer: A basis for the eigenspace is __________.
Sketch several vectors in the eigenspace and use geometry to explain why the eigenvectors you sketched make sense.
Suppose one of them is a multiple of the other. Then the eigenspaces corresponding to the two eigenvalues are the same. Which of the following describes the eigenspace?
Answer: A basis for the first eigenspace is __________, and a basis for the second eigenspace is __________.
Choose the best description of the first eigenspace.
Choose the best description of the second eigenspace.
Use geometry to explain why the eigenspaces you found make sense.
Practice Problem prob:3x3fromKuttler1 is adapted from Problem 7.1.11 of Ken Kuttler’s A First Course in Linear Algebra. (CC-BY)
Ken Kuttler, A First Course in Linear Algebra, Lyryx 2017, Open Edition, p. 361.