We prove the Jordan normal form theorem under the assumption that the eigenvalues of $A$ are all real. The proof for matrices having both real and complex eigenvalues proceeds along similar lines.

Let $A$ be an $n \times n$ matrix, let $\lambda_1, \ldots, \lambda_s$ be the distinct eigenvalues of $A$, and let $A_j = A - \lambda_j I_n$.

Lemma. The matrices $A_j$ and $A_k$ commute; that is, $A_j A_k = A_k A_j$.

Proof
Just compute
$$A_j A_k = (A - \lambda_j I_n)(A - \lambda_k I_n) = A^2 - (\lambda_j + \lambda_k)A + \lambda_j \lambda_k I_n$$
and
$$A_k A_j = (A - \lambda_k I_n)(A - \lambda_j I_n) = A^2 - (\lambda_k + \lambda_j)A + \lambda_k \lambda_j I_n.$$
So $A_j A_k = A_k A_j$, as claimed.
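This computation is easy to spot-check numerically. The following sketch uses a hypothetical $3 \times 3$ matrix and two of its eigenvalues; note that the commutativity holds for any square matrix and any pair of scalars, since the expansion never uses the eigenvalue property:

```python
import numpy as np

# Hypothetical example: a 3x3 upper-triangular matrix with real eigenvalues 2 and 5.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam_j, lam_k = 2.0, 5.0
I = np.eye(3)

Aj = A - lam_j * I
Ak = A - lam_k * I

# Both products expand to A^2 - (lam_j + lam_k) A + lam_j lam_k I,
# so the two matrices commute.
assert np.allclose(Aj @ Ak, Ak @ Aj)
```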

Let $V_j$ be the generalized eigenspace corresponding to eigenvalue $\lambda_j$.

Lemma. For all $j$ and $k$, $A_k V_j \subset V_j$.

Proof
Recall from Lemma ?? that $V_j = \ker A_j^{m}$ for some $m$. Suppose that $v \in V_j$. We first verify that $A_k v$ is also in $V_j$. Using Lemma ??, just compute
$$A_j^{m}(A_k v) = A_k A_j^{m} v = A_k \cdot 0 = 0.$$
Therefore, $A_k v \in V_j$.

Let $j \neq k$ and let $B : V_j \to V_j$ be the linear mapping $B = A_k|V_j$; we claim that $B$ is invertible. It follows from Chapter ??, Theorem ?? that
$$\dim V_j = \operatorname{nullity}(B) + \operatorname{rank}(B).$$
Now $v \in \ker B$ if $v \in V_j$ and $A_k v = 0$. Since $A_j = A_k + (\lambda_k - \lambda_j) I_n$, it follows that $A_j v = (\lambda_k - \lambda_j) v$. Hence
$$A_j^{m} v = (\lambda_k - \lambda_j)^{m} v,$$
and $A_j^{m} v = 0$ since $v \in V_j$. Since $\lambda_k - \lambda_j \neq 0$, it follows that $(\lambda_k - \lambda_j)^{m} v = 0$ only when $v = 0$. Hence the nullity of $B$ is zero. We conclude that
$$\operatorname{rank}(B) = \dim V_j.$$
Thus, $B$ is invertible, since the domain and range of $B$ are the same space.
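A small numerical illustration of this invertibility; the matrix, eigenvalues, and basis below are hypothetical, chosen so that the generalized eigenspace is easy to read off:

```python
import numpy as np

# Hypothetical example: eigenvalue 2 with a 2x2 Jordan block, eigenvalue 5 simple.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

# The generalized eigenspace V_1 for lambda_1 = 2 is spanned by e1 and e2,
# and A_2 = A - 5I maps V_1 into itself.
A2 = A - 5.0 * np.eye(3)
B = A2[:2, :2]  # matrix of A_2 restricted to V_1 in the basis e1, e2

# B = (2 - 5) I + nilpotent, so its determinant is (2 - 5)^2 != 0.
assert abs(np.linalg.det(B)) > 1e-9
```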

Lemma. Let $v_j \in V_j$ for $j = 1, \ldots, s$. If $v_1 + \cdots + v_s = 0$, then each $v_j = 0$.

Proof
Let $V_j = \ker A_j^{m_j}$ for some integer $m_j$. Let $B = A_2^{m_2} \cdots A_s^{m_s}$. Then
$$B v_1 = B(v_1 + \cdots + v_s) = 0,$$
since $A_j^{m_j} v_j = 0$ for $j = 2, \ldots, s$. But Lemma ?? implies that $B|V_1$ is invertible. Therefore, $v_1 = 0$. Similarly, all of the remaining $v_j$ have to vanish.

Lemma. $\mathbb{R}^n = V_1 + \cdots + V_s$.

Proof
Let $W$ be the subspace of $\mathbb{R}^n$ consisting of all vectors of the form $v_1 + \cdots + v_s$, where $v_j \in V_j$. We need to verify that $W = \mathbb{R}^n$. Suppose that $W$ is a proper subspace. Then choose a basis $w_1, \ldots, w_t$ of $W$ and extend this set to a basis of $\mathbb{R}^n$. Since each $V_j$ is invariant under $A$, so is $W$; hence in this basis the matrix $A$ has block form, that is,
$$A = \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix},$$
where $A_{22}$ is an $(n - t) \times (n - t)$ matrix. The eigenvalues of $A_{22}$ are eigenvalues of $A$. Since all of the distinct eigenvalues and eigenvectors of $A$ are accounted for in $A_{11}$ (that is, in $W$), we have a contradiction. So $W = \mathbb{R}^n$, as claimed.

Lemma. Let $\mathcal{B}_j$ be a basis of $V_j$. Then $\mathcal{B} = \mathcal{B}_1 \cup \cdots \cup \mathcal{B}_s$ is a basis of $\mathbb{R}^n$.

Proof
We first show that the vectors in $\mathcal{B}$ span $\mathbb{R}^n$. It follows from Lemma ?? that every vector in $\mathbb{R}^n$ is a linear combination of vectors in $V_1, \ldots, V_s$. But each vector in $V_j$ is a linear combination of vectors in $\mathcal{B}_j$. Hence, the vectors in $\mathcal{B}$ span $\mathbb{R}^n$.

Second, we show that the vectors in $\mathcal{B}$ are linearly independent. Suppose that a linear combination of vectors in $\mathcal{B}$ sums to zero. We can write this sum as
$$v_1 + \cdots + v_s = 0,$$
where $v_j$ is the linear combination of vectors in $\mathcal{B}_j$. Lemma ?? implies that each $v_j = 0$. Since $\mathcal{B}_j$ is a basis for $V_j$, it follows that the coefficients in the linear combinations $v_j$ must all be zero. Hence, the vectors in $\mathcal{B}$ are linearly independent.

Finally, it follows from Theorem ?? of Chapter ?? that $\mathcal{B}$ is a basis.
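The lemma can be checked concretely. In the hypothetical example below, the two generalized eigenspaces have dimensions 2 and 1, and the union of their bases spans $\mathbb{R}^3$:

```python
import numpy as np

# Hypothetical example: eigenvalue 2 (generalized eigenspace of dimension 2)
# and eigenvalue 5 (eigenspace of dimension 1).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
I = np.eye(3)

# ker (A - 2I)^2 is spanned by e1, e2; ker (A - 5I) is spanned by e3.
basis_V1 = I[:, :2]
basis_V2 = I[:, 2:]

# Sanity checks that these really are the generalized eigenspaces ...
assert np.allclose(np.linalg.matrix_power(A - 2 * I, 2) @ basis_V1, 0.0)
assert np.allclose((A - 5 * I) @ basis_V2, 0.0)

# ... and that the union of the two bases is a basis of R^3.
union = np.hstack([basis_V1, basis_V2])
assert np.linalg.matrix_rank(union) == 3
```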

Lemma. In the basis $\mathcal{B}$ the matrix $A$ is block diagonal, and all eigenvalues of the $j$th block equal $\lambda_j$.

Proof
It follows from Lemma ?? that $A V_j \subset V_j$. Suppose that $v \in \mathcal{B}_j$. Then $Av$ is in $V_j$ and is a linear combination of vectors in $\mathcal{B}_j$. The block diagonalization of $A$ follows. Since $A_j|V_j$ is nilpotent, it follows that all eigenvalues of $A|V_j$ equal $\lambda_j$.

Lemma ?? implies that to prove the Jordan normal form theorem, we must find a basis of each $V_j$ in which the matrix $A|V_j$ is in Jordan normal form. So, without loss of generality, we may assume that all eigenvalues of $A$ equal $\lambda$, and then find a basis in which $A$ is in Jordan normal form. Moreover, we can replace $A$ by the matrix $A - \lambda I_n$, a matrix all of whose eigenvalues are zero. So, without loss of generality, we assume that $A$ is an $n \times n$ matrix all of whose eigenvalues are zero. We now sketch the remainder of the proof of Theorem ??.
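A minimal sketch of this reduction, using a hypothetical single-eigenvalue matrix: subtracting $\lambda I_n$ leaves a matrix whose eigenvalues are all zero, and such a matrix is nilpotent.

```python
import numpy as np

# Hypothetical example: a 3x3 matrix whose only eigenvalue is lambda = 4.
lam = 4.0
A = np.array([[lam, 1.0, 0.0],
              [0.0, lam, 1.0],
              [0.0, 0.0, lam]])

N = A - lam * np.eye(3)

# All eigenvalues of N are zero, and N is nilpotent: N^3 = 0.
assert np.allclose(np.linalg.eigvals(N), 0.0)
assert np.allclose(np.linalg.matrix_power(N, 3), 0.0)
```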

Let $k$ be the smallest integer such that $A^k = 0$ and let
$$K = \ker A^{k-1}.$$
Let $u_1, \ldots, u_t$ be a basis for $K$ and extend this set to a basis for $\mathbb{R}^n$ by adjoining the linearly independent vectors $w_1, \ldots, w_\ell$. Let
$$W = \operatorname{span}\{w_1, \ldots, w_\ell\}.$$
It follows that $W \cap K = \{0\}$.

We claim that the vectors $A^i w_j$, where $0 \le i \le k-1$ and $1 \le j \le \ell$, are linearly independent. We can write any linear combination of the vectors in this set as
$$z_0 + A z_1 + \cdots + A^{k-1} z_{k-1},$$
where each $z_i \in W$ is a linear combination of $w_1, \ldots, w_\ell$. Suppose that
$$z_0 + A z_1 + \cdots + A^{k-1} z_{k-1} = 0.$$
Applying $A^{k-1}$ to this expression and using $A^k = 0$, we obtain $A^{k-1} z_0 = 0$. Therefore, $z_0$ is in $W$ and in $\ker A^{k-1}$. Hence, $z_0 = 0$. Applying $A^{k-2}$ to what remains, we see that $A^{k-1} z_1 = 0$; so $z_1$ is in $W$ and in $\ker A^{k-1}$, and $z_1 = 0$. Similarly, all of the remaining $z_i$ vanish. Since the $w_j$ are linearly independent, each coefficient in each $z_i$ is zero, thus verifying the claim.

Next, we find the largest integer $m < k-1$ so that $\ker A^m$ together with the vectors already constructed does not span $\ker A^{m+1}$. Proceed as above. Choose a basis for $\ker A^m$ and extend to a basis for $\ker A^{m+1}$ by adjoining the vectors $x_1, \ldots, x_r$. Adjoin the vectors $A^i x_j$, for $0 \le i \le m$, to the set and verify that these vectors are all linearly independent. And repeat the process. Eventually, we arrive at a basis for $\mathbb{R}^n$.
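The kernels of the powers of $A$ that drive this process can be computed numerically. The nilpotent matrix below is a hypothetical example with one chain of length 3 and one of length 1; the kernel dimensions grow until they reach $n$, and the smallest power at which they do is $k$:

```python
import numpy as np

# Hypothetical nilpotent 4x4 matrix: one Jordan chain of length 3, one of length 1.
N = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0]])
n = N.shape[0]

# dim ker N^i for i = 0, 1, ..., n, via rank-nullity.
dims = [int(n - np.linalg.matrix_rank(np.linalg.matrix_power(N, i)))
        for i in range(n + 1)]
print(dims)  # [0, 2, 3, 4, 4]: the kernels grow until they fill R^4

k = dims.index(n)  # smallest k with N^k = 0, here k = 3
```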

In this basis the matrix $A$ is block diagonal; indeed, each of the blocks is a Jordan block, since
$$A(A^i w_j) = A^{i+1} w_j.$$
Note the resemblance with (??).
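This relation is exactly what makes each block a Jordan block: in a chain basis, each basis vector maps to the previous one. A sketch with a hypothetical nilpotent matrix and chain vector:

```python
import numpy as np

# Hypothetical nilpotent matrix and a chain vector w with N^2 w != 0, N^3 w = 0.
N = np.array([[0.0, 2.0, 0.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])
w = np.array([0.0, 0.0, 1.0])

# Chain basis {N^2 w, N w, w}: N sends each basis vector to the previous one.
P = np.column_stack([N @ N @ w, N @ w, w])

# In this basis N becomes a single 3x3 Jordan block with eigenvalue 0.
J = np.linalg.inv(P) @ N @ P
expected = np.array([[0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0],
                     [0.0, 0.0, 0.0]])
assert np.allclose(J, expected)
```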