Similarity represents an important equivalence relation on the vector space of square matrices of a given dimension.

Two square matrices $A$ and $B$ are said to be similar if there exists an invertible matrix $P$ such that
$$B = P^{-1}AP.$$
The matrix $P$ appearing in this equation is referred to as a similarity matrix. Similarity is indeed an equivalence relation; in particular, $A$ is similar to $B$ iff $B$ is similar to $A$, since $B = P^{-1}AP$ may be rewritten as $A = (P^{-1})^{-1}B(P^{-1})$. Also, a number of important aspects of a square matrix remain unchanged under this relation; the first of these is the characteristic polynomial.

Proposition
If $A$ and $B$ are similar matrices, then they have the same characteristic polynomial.

Proof
Choose $P$ such that $B = P^{-1}AP$. Then the equality of the characteristic polynomials follows from the sequence of equalities
$$\det(B - tI) = \det(P^{-1}AP - tI) = \det\big(P^{-1}(A - tI)P\big) = \det(P^{-1})\det(A - tI)\det(P) = \det(A - tI),$$
where the second equality uses $P^{-1}(tI)P = tI$ and the third uses the multiplicativity of the determinant.
Thus similar matrices have the same eigenvalues, occurring with the same algebraic multiplicities. Moreover, their eigenvectors are related.
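Before turning to eigenvectors, this invariance is easy to check numerically. The following numpy sketch is an illustration, not part of the proof; the matrix $A$ and the random similarity matrix $P$ are arbitrary choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary test matrix with eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# A random real matrix is invertible with probability 1;
# it plays the role of the similarity matrix P.
P = rng.standard_normal((2, 2))
B = np.linalg.inv(P) @ A @ P

# Similar matrices have the same characteristic polynomial ...
print(np.poly(A))  # coefficients of det(tI - A)
print(np.poly(B))  # same coefficients, up to round-off

# ... and hence the same eigenvalues, with the same multiplicities.
print(np.sort(np.linalg.eigvals(A)))
print(np.sort(np.linalg.eigvals(B)))
```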

If $B = P^{-1}AP$, and $\mathbf{v}$ is an eigenvector of $B$ with eigenvalue $\lambda$, show that $P\mathbf{v}$ is an eigenvector of $A$ with eigenvalue $\lambda$. More generally, if $T$ is the linear transformation $T(\mathbf{v}) = P\mathbf{v}$, show that $T$ induces an isomorphism $E_\lambda(B) \to E_\lambda(A)$ on eigenspaces for all eigenvalues $\lambda$.
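The claim in this exercise can be sanity-checked numerically (a check, not a proof); in the sketch below, $A$ and $P$ are arbitrary sample choices.

```python
import numpy as np

# Sample data: any matrix A and any invertible P will do.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])
B = np.linalg.inv(P) @ A @ P

# Take an eigenpair (lam, v) of B.
lams, vecs = np.linalg.eig(B)
lam, v = lams[0], vecs[:, 0]

# Then P v should be an eigenvector of A with the same eigenvalue.
print(np.allclose(A @ (P @ v), lam * (P @ v)))  # True
```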

In the previous section we considered the question: when does $\mathbb{R}^n$ decompose as a direct sum of eigenspaces of a matrix $A$? To answer this, we consider first the case when $A = D$, a diagonal matrix with diagonal entries $D_{ii} = \lambda_i$. For such a matrix, each standard basis vector $\mathbf{e}_i$ is an eigenvector, as $D\mathbf{e}_i = \lambda_i\mathbf{e}_i$ for each $i$. So for such a matrix one has an evident direct sum decomposition
$$\mathbb{R}^n = E_{\mu_1}(D) \oplus E_{\mu_2}(D) \oplus \cdots \oplus E_{\mu_m}(D),$$
where $\mu_1, \dots, \mu_m$ are the distinct diagonal entries of $D$.
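A short numpy illustration of this decomposition, using an arbitrary diagonal matrix chosen for the example:

```python
import numpy as np

# A diagonal matrix with diagonal entries 2 (twice) and 5.
D = np.diag([2.0, 2.0, 5.0])
I = np.eye(3)

# Each standard basis vector e_i satisfies D e_i = D_ii e_i.
for i in range(3):
    e_i = I[:, i]
    assert np.allclose(D @ e_i, D[i, i] * e_i)

# Here E_2(D) = span{e_1, e_2} and E_5(D) = span{e_3},
# giving the direct sum decomposition R^3 = E_2(D) + E_5(D).
print("every standard basis vector is an eigenvector of D")
```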
If $A \in M_n(\mathbb{R})$, we will say that $A$ is real-diagonalizable if $A$ is similar to a diagonal matrix $D$, where the similarity matrix $P$ is also in $M_n(\mathbb{R})$ (this is one of the places where one has to be careful about whether one is working over the real or complex numbers).

Theorem
For a matrix $A \in M_n(\mathbb{R})$, the following statements are equivalent:
(1) $\mathbb{R}^n$ decomposes as a direct sum of eigenspaces of $A$;
(2) $\mathbb{R}^n$ has a basis consisting of eigenvectors of $A$;
(3) $A$ is real-diagonalizable.

Proof
Statements (1) and (2) are clearly equivalent: concatenating bases for the eigenspaces in a direct sum decomposition produces a basis of eigenvectors, and conversely. It will suffice then to show that statement (2) is equivalent to statement (3). Suppose first that $A$ is real-diagonalizable, in other words that there is an invertible matrix $P \in M_n(\mathbb{R})$ such that $A = PDP^{-1}$ with $D$ diagonal. Multiplying both sides of this equation on the right by $P$ yields $AP = PD$. The $i$-th column of $AP$ is $A\mathbf{v}_i$, where $\mathbf{v}_i$ denotes the $i$-th column of $P$, while the $i$-th column of $PD$ is $\lambda_i\mathbf{v}_i$, where $\lambda_i$ represents the $i$-th diagonal element of $D$. In other words, we have an equality $A\mathbf{v}_i = \lambda_i\mathbf{v}_i$, implying the columns of $P$, which are vectors in $\mathbb{R}^n$, are eigenvectors of $A$. The matrix $P$ is invertible, so must have rank $n$. This means the set of column vectors $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ is linearly independent, and therefore forms a basis for $\mathbb{R}^n$.

On the other hand, if $\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ is a basis of $\mathbb{R}^n$ with $A\mathbf{v}_i = \lambda_i\mathbf{v}_i$, then concatenating the vectors in the basis forms a matrix $P = [\mathbf{v}_1\,|\,\mathbf{v}_2\,|\,\cdots\,|\,\mathbf{v}_n]$ whose $i$-th column is the eigenvector $\mathbf{v}_i$. If we now define $D$ to be the diagonal matrix with $D_{ii} = \lambda_i$, then (as above) one has $AP = PD$. By construction the set of columns of $P$ is linearly independent, and so $P$ is invertible. So we may multiply both sides of this last equation on the right by $P^{-1}$, yielding $A = PDP^{-1}$, implying that $A$ is real-diagonalizable. This completes the proof.
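The proof translates directly into a computation one can check with numpy. In the sketch below (the matrix $A$ is an arbitrary example with distinct eigenvalues 2 and 5), the eigenvector matrix returned by np.linalg.eig plays the role of $P$:

```python
import numpy as np

# An arbitrary matrix with distinct eigenvalues 2 and 5,
# hence real-diagonalizable.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# eig returns the eigenvalues and a matrix whose columns are
# corresponding eigenvectors; that matrix is the P of the proof.
lams, P = np.linalg.eig(A)
D = np.diag(lams)

print(np.allclose(A @ P, P @ D))                 # AP = PD
print(np.allclose(np.linalg.inv(P) @ A @ P, D))  # P^{-1} A P = D
```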

Many matrices are diagonalizable, but there are also many that are not. Moreover, a real matrix might be diagonalizable, but not real-diagonalizable. The following exercise illustrates a type of real matrix that isn’t diagonalizable even if one allows complex similarity matrices.

Show that the shear matrix
$$A = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$$
is not diagonalizable, by i) finding the single eigenvalue $\lambda$ of the matrix, and then ii) showing that the eigenspace $E_\lambda(A)$ has dimension 1 (and so cannot be all of $\mathbb{C}^2$).
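Assuming the shear matrix above, the two steps of the exercise can be previewed numerically (a check, not a proof):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# i) The single eigenvalue: both roots of the characteristic
#    polynomial are 1.
print(np.linalg.eigvals(A))  # [1. 1.]

# ii) dim E_1(A) = dim null(A - I) = 2 - rank(A - I) = 1,
#     so the eigenvectors of A cannot span C^2.
print(2 - np.linalg.matrix_rank(A - np.eye(2)))  # 1
```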

Closely related to the difference between real-diagonalizable and diagonalizable is the fact that real matrices need not have real eigenvalues. In fact, some real matrices have no eigenvalues over the real numbers at all. To illustrate, consider the rotation matrix
$$A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},$$
whose characteristic polynomial $t^2 + 1$ has no real roots; the eigenvalues of $A$ are $\pm i$.
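A quick numerical confirmation, again an illustrative sketch using the rotation matrix above:

```python
import numpy as np

# Rotation of the plane by 90 degrees.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Characteristic polynomial t^2 + 1: coefficients [1, 0, 1].
print(np.poly(A))

# Its roots, the eigenvalues, are the purely imaginary pair +/- i.
print(np.linalg.eigvals(A))  # [0.+1.j 0.-1.j]
```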

What should be done with such matrices? The answer is that even if one is primarily concerned with real matrices and working over the real numbers, there are cases where one needs to enlarge the set of scalars to $\mathbb{C}$. This is one of those cases.