Orthogonal Matrices and Symmetric Matrices
Recall that an $n \times n$ matrix $A$ is diagonalizable if and only if it has $n$ linearly independent
eigenvectors (see Diagonalizable Matrices and Multiplicity). Moreover, the matrix $P$
with these eigenvectors as columns is a diagonalizing matrix for $A$; that is, $P^{-1}AP$ is diagonal.
As we have seen, the nice bases of $\mathbb{R}^n$ are the orthogonal ones, so a natural question is:
which matrices have $n$ orthogonal eigenvectors, so that the columns of $P$ form an
orthogonal basis for $\mathbb{R}^n$? These turn out to be precisely the symmetric matrices
(matrices for which $A^T = A$), and this is the main result of this section.
Orthogonal Matrices
Recall that an orthogonal set of vectors is called orthonormal if $\|\mathbf{x}\| = 1$ for each vector $\mathbf{x}$ in
the set, and that any orthogonal set $\{\mathbf{v}_1, \mathbf{v}_2, \ldots, \mathbf{v}_k\}$ can be “normalized”, i.e. converted into an
orthonormal set $\left\{\frac{1}{\|\mathbf{v}_1\|}\mathbf{v}_1, \frac{1}{\|\mathbf{v}_2\|}\mathbf{v}_2, \ldots, \frac{1}{\|\mathbf{v}_k\|}\mathbf{v}_k\right\}$. In particular, if a matrix $A$ has $n$ orthogonal eigenvectors, they can
(by normalizing) be taken to be orthonormal. The corresponding diagonalizing matrix
(we will use $Q$ instead of $P$) has orthonormal columns, and such matrices are very easy
to invert.
The following conditions are equivalent for an $n \times n$ matrix $Q$.
(a)
$Q$ is invertible and $Q^{-1} = Q^T$.
(b)
The rows of $Q$ are orthonormal.
(c)
The columns of $Q$ are orthonormal.
An $n \times n$ matrix $Q$ is called an orthogonal matrix if it satisfies one (and hence all) of these conditions.
It is not enough that the rows of a matrix $A$ are merely orthogonal for $A$ to be an
orthogonal matrix. Here is an example.
Let $A = \begin{bmatrix} 2 & 1 & 1 \\ -1 & 1 & 1 \\ 0 & -1 & 1 \end{bmatrix}$.
(a)
Check that matrix $A$ has rows that are orthogonal.
(b)
Check that matrix $A$ has columns that are NOT orthogonal.
(c)
Check that matrix $A$ has rows that are NOT orthonormal.
(d)
Create a matrix $Q$ by normalizing each of the rows of $A$.
(e)
Check that $Q$ is an orthogonal matrix.
You should get $Q = \begin{bmatrix} \frac{2}{\sqrt{6}} & \frac{1}{\sqrt{6}} & \frac{1}{\sqrt{6}} \\ \frac{-1}{\sqrt{3}} & \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{3}} \\ 0 & \frac{-1}{\sqrt{2}} & \frac{1}{\sqrt{2}} \end{bmatrix}$, and one can check that this is orthogonal in a number of ways.
This exploration can certainly be done by hand (although it takes some time), but it
also makes for a very nice Octave exercise.
To use Octave, go to the Sage Math Cell Webpage, copy the code below into the cell,
select OCTAVE as the language, and press EVALUATE.
%Exploration from Section 9.4 Orthogonal Matrices and Symmetric Matrices
A=[2 1 1; -1 1 1; 0 -1 1]
%Check that matrix A has rows that are orthogonal.
A(1,:)*transpose(A(2,:))
A(2,:)*transpose(A(3,:))
A(1,:)*transpose(A(3,:))
%Check that matrix A has columns that are NOT orthogonal.
transpose(A(:,1))*A(:,2)
%(This is 1 of 3 calculations to do.)
%Check that matrix A in the Octave window has rows that are NOT orthonormal.
%(See the results from the first question.)
%Create a matrix Q by normalizing each of the rows of A.
q1=A(1,:)/norm(A(1,:));
q2=A(2,:)/norm(A(2,:));
q3=A(3,:)/norm(A(3,:));
Q = [q1;q2;q3]
%Check that Q is an orthogonal matrix.
Q*transpose(Q)
%(You may get numbers close to zero in some places you expect to get zero due to rounding error)
We studied the idea of closure when we studied Subspaces of $\mathbb{R}^n$. The next theorem tells
us that orthogonal matrices are closed under matrix multiplication.
(a)
If $Q_1$ and $Q_2$ are orthogonal matrices, then $Q_1Q_2$ is also orthogonal. (We say that
the set of orthogonal matrices is closed under matrix multiplication.)
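This closure property is easy to illustrate in Octave. In the sketch below, Q1 is the orthogonal matrix from the exploration above, while Q2 is a sample rotation matrix of our own choosing (not part of the original exploration); the product is then tested for orthogonality.
%The product of two orthogonal matrices is orthogonal.
%Q1 is the matrix Q from the exploration; Q2 is a sample rotation matrix.
Q1 = [2/sqrt(6) 1/sqrt(6) 1/sqrt(6); -1/sqrt(3) 1/sqrt(3) 1/sqrt(3); 0 -1/sqrt(2) 1/sqrt(2)];
Q2 = [1 0 0; 0 cos(pi/3) -sin(pi/3); 0 sin(pi/3) cos(pi/3)];
P = Q1*Q2;
P*transpose(P)   %should be the identity matrix, up to rounding error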
We now shift our focus from orthogonal matrices to another important class of
matrices called symmetric matrices. A symmetric matrix is a matrix which is equal
to its transpose ($A^T = A$). We saw a few examples of such matrices in Transpose of a
Matrix.
When we began our study of eigenvalues and eigenvectors, we saw numerous
examples of matrices with real entries whose eigenvalues were complex
numbers. It can be shown that symmetric matrices have only real eigenvalues.
We also learned that some matrices are diagonalizable
while other matrices are not. It turns out that every symmetric matrix is
diagonalizable. In fact, we can say more, but first we need the following
definition.
An $n \times n$ matrix $A$ is said to be orthogonally diagonalizable if an orthogonal matrix $Q$ can be
found such that $Q^{-1}AQ$ is diagonal.
We have learned earlier that when we diagonalize a matrix $A$, we write $P^{-1}AP = D$ for some
matrix $P$, where $D$ is diagonal and the diagonal entries of $D$ are the eigenvalues of $A$. We have
also learned that the columns of the matrix $P$ are the corresponding eigenvectors of $A$.
So when a matrix is orthogonally diagonalizable, we are able to accomplish the
diagonalization using a matrix $Q$ consisting of $n$ eigenvectors that form an orthonormal
basis for $\mathbb{R}^n$. The following remarkable theorem shows that the matrices that have this
property are precisely the symmetric matrices.
Real Spectral Theorem Let $A$ be an $n \times n$ matrix. Then $A$ is symmetric if and only if $A$ is
orthogonally diagonalizable.
Proof
If $A$ is orthogonally diagonalizable, then it is an easy exercise to prove
that it is symmetric. You are asked to do this in Practice Problem prob:ortho_diag_implies_symmetric.
To prove the “only if” part of this theorem, we assume $A$ is symmetric, and we need to
show it is orthogonally diagonalizable. We proceed by induction on $n$, the size of the
symmetric matrix. If $n = 1$, $A$ is already diagonal. If $n > 1$, assume that we know the “only if”
statement holds for $(n-1) \times (n-1)$ symmetric matrices. Let $\lambda_1$ be a (real) eigenvalue of $A$, and
let $A\mathbf{x}_1 = \lambda_1\mathbf{x}_1$, where $\|\mathbf{x}_1\| = 1$. Next, set $\mathbf{q}_1 = \mathbf{x}_1$, and use the Gram-Schmidt algorithm to find an
orthonormal basis $\{\mathbf{q}_1, \mathbf{q}_2, \ldots, \mathbf{q}_n\}$ for $\mathbb{R}^n$. Let $Q_1 = \begin{bmatrix} \mathbf{q}_1 & \mathbf{q}_2 & \cdots & \mathbf{q}_n \end{bmatrix}$, so that $Q_1$ is an orthogonal matrix. We have
$$Q_1^T A Q_1 = \begin{bmatrix} \lambda_1 & B \\ \mathbf{0} & A_1 \end{bmatrix}$$
where the block $B$ has dimensions $1 \times (n-1)$, and the block $\mathbf{0}$ under $\lambda_1$ is an $(n-1) \times 1$ zero matrix, because of
the orthogonality of the basis vectors.
Next, using the fact that $A$ is symmetric, we notice that
$$\left(Q_1^T A Q_1\right)^T = Q_1^T A^T Q_1 = Q_1^T A Q_1$$
so $Q_1^T A Q_1$ is symmetric. It follows that $B$ is also a zero matrix and that $A_1$ is symmetric. Since
$A_1$ is an $(n-1) \times (n-1)$ symmetric matrix, we may apply the inductive hypothesis, so there exists an
orthogonal matrix $Q_2$ such that $Q_2^T A_1 Q_2 = D_1$ is diagonal. We observe that $Q = Q_1 \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & Q_2 \end{bmatrix}$ is orthogonal, and we
compute:
$$Q^T A Q = \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & Q_2^T \end{bmatrix} \begin{bmatrix} \lambda_1 & \mathbf{0} \\ \mathbf{0} & A_1 \end{bmatrix} \begin{bmatrix} 1 & \mathbf{0} \\ \mathbf{0} & Q_2 \end{bmatrix} = \begin{bmatrix} \lambda_1 & \mathbf{0} \\ \mathbf{0} & D_1 \end{bmatrix}$$
which is diagonal, as required.
Because the eigenvalues of a real symmetric matrix are real, Theorem th:PrinAxes is called
the Real Spectral Theorem, and the set of distinct eigenvalues of $A$ is called the spectrum
of the matrix. A similar result holds for matrices with complex entries (Theorem
th:025890).
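Before working through an example by hand, here is a quick numerical illustration of the theorem in Octave (again via the Sage Math Cell Webpage). The symmetric matrix A below is a sample of our own choosing, not one from the text; for symmetric input, Octave's built-in eig should return eigenvectors forming an orthogonal matrix.
%Numerical illustration of the Real Spectral Theorem.
A = [2 1 1; 1 3 0; 1 0 4]   %a sample symmetric matrix
[Q, D] = eig(A)             %columns of Q are eigenvectors; D is diagonal
Q*transpose(Q)              %should be the identity, up to rounding error
Q*D*transpose(Q)            %should reproduce A, up to rounding error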
Find an orthogonal matrix $Q$ such that $Q^{-1}AQ$ is diagonal, where $A = \begin{bmatrix} 1 & 0 & -1 \\ 0 & 1 & 2 \\ -1 & 2 & 5 \end{bmatrix}$.
The characteristic polynomial of $A$ is (adding twice row 1 to row 2):
$$c_A(x) = \det \begin{bmatrix} x-1 & 0 & 1 \\ 0 & x-1 & -2 \\ 1 & -2 & x-5 \end{bmatrix} = x(x-1)(x-6)$$
Thus the eigenvalues are $\lambda = 0$, $1$, and $6$, and corresponding eigenvectors are
$$\mathbf{x}_1 = \begin{bmatrix} 1 \\ -2 \\ 1 \end{bmatrix}, \quad \mathbf{x}_2 = \begin{bmatrix} 2 \\ 1 \\ 0 \end{bmatrix}, \quad \mathbf{x}_3 = \begin{bmatrix} -1 \\ 2 \\ 5 \end{bmatrix}$$
respectively. Moreover, by what at first appears to be remarkably good luck, these
eigenvectors are orthogonal. We have $\|\mathbf{x}_1\|^2 = 6$, $\|\mathbf{x}_2\|^2 = 5$, and $\|\mathbf{x}_3\|^2 = 30$, so
$$Q = \begin{bmatrix} \frac{1}{\sqrt{6}}\mathbf{x}_1 & \frac{1}{\sqrt{5}}\mathbf{x}_2 & \frac{1}{\sqrt{30}}\mathbf{x}_3 \end{bmatrix} = \frac{1}{\sqrt{30}} \begin{bmatrix} \sqrt{5} & 2\sqrt{6} & -1 \\ -2\sqrt{5} & \sqrt{6} & 2 \\ \sqrt{5} & 0 & 5 \end{bmatrix}$$
is an orthogonal matrix. Thus $Q^{-1} = Q^T$ and
$$Q^T A Q = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 6 \end{bmatrix}$$
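The computations in this example are easy to confirm in Octave; the snippet below simply re-checks that $Q$ is orthogonal and that $Q^TAQ$ is the expected diagonal matrix.
%Check Example ex:DiagonalizeSymmetricMatrix.
A = [1 0 -1; 0 1 2; -1 2 5]
Q = [1/sqrt(6) 2/sqrt(5) -1/sqrt(30); -2/sqrt(6) 1/sqrt(5) 2/sqrt(30); 1/sqrt(6) 0 5/sqrt(30)]
Q*transpose(Q)     %should be the identity, up to rounding error
transpose(Q)*A*Q   %should be diag(0, 1, 6), up to rounding error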
Actually, the fact that the eigenvectors in Example ex:DiagonalizeSymmetricMatrix are orthogonal is no coincidence.
These vectors certainly must be linearly independent (they correspond to distinct
eigenvalues). We will see that the fact that the matrix is symmetric implies that the
eigenvectors are orthogonal. To prove this we need the following useful fact about
symmetric matrices.
If $A$ is an $n \times n$ symmetric matrix, then
$$(A\mathbf{x}) \cdot \mathbf{y} = \mathbf{x} \cdot (A\mathbf{y})$$
for all columns $\mathbf{x}$ and $\mathbf{y}$ in $\mathbb{R}^n$.
The converse also holds (see Practice Problem ex:8_2_15).
Proof
Recall that $\mathbf{x} \cdot \mathbf{y} = \mathbf{x}^T\mathbf{y}$ for all columns $\mathbf{x}$ and $\mathbf{y}$. Because $A^T = A$, we get
$$(A\mathbf{x}) \cdot \mathbf{y} = (A\mathbf{x})^T\mathbf{y} = \mathbf{x}^T A^T \mathbf{y} = \mathbf{x}^T A \mathbf{y} = \mathbf{x} \cdot (A\mathbf{y})$$
If $A$ is a symmetric matrix, then eigenvectors of $A$ corresponding to distinct eigenvalues
are orthogonal.
Proof
Let $A\mathbf{x} = \lambda\mathbf{x}$ and $A\mathbf{y} = \mu\mathbf{y}$, where $\lambda \neq \mu$. We compute
$$\lambda(\mathbf{x} \cdot \mathbf{y}) = (\lambda\mathbf{x}) \cdot \mathbf{y} = (A\mathbf{x}) \cdot \mathbf{y} = \mathbf{x} \cdot (A\mathbf{y}) = \mathbf{x} \cdot (\mu\mathbf{y}) = \mu(\mathbf{x} \cdot \mathbf{y})$$
Hence $(\lambda - \mu)(\mathbf{x} \cdot \mathbf{y}) = 0$, and so $\mathbf{x} \cdot \mathbf{y} = 0$ because $\lambda \neq \mu$.
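As a numerical sanity check of this theorem, the Octave snippet below uses a sample symmetric matrix with three distinct eigenvalues (our own choice, not from the text) and confirms that the computed eigenvectors are mutually orthogonal.
%Eigenvectors for distinct eigenvalues of a symmetric matrix are orthogonal.
A = [5 2 0; 2 1 0; 0 0 3]   %a sample symmetric matrix with distinct eigenvalues
[Q, D] = eig(A)
transpose(Q(:,1))*Q(:,2)    %each dot product should be zero, up to rounding error
transpose(Q(:,1))*Q(:,3)
transpose(Q(:,2))*Q(:,3)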
Now the procedure for diagonalizing a symmetric $n \times n$ matrix is clear. Find the distinct
eigenvalues and find orthonormal bases for each eigenspace (the Gram-Schmidt
algorithm may be needed when there is a repeated eigenvalue). Then the set of all
these basis vectors is orthonormal (by Theorem th:symmetric_has_ortho_ev) and contains $n$ vectors. Here is an
example.
Orthogonally diagonalize the symmetric matrix $A = \begin{bmatrix} 8 & -2 & 2 \\ -2 & 5 & 4 \\ 2 & 4 & 5 \end{bmatrix}$.
The characteristic polynomial is
$$c_A(x) = \det(xI - A) = x(x-9)^2$$
Hence the distinct eigenvalues are $0$ and $9$ and are of algebraic multiplicity $1$
and $2$, respectively. The geometric multiplicities must be the same, for $A$ is
diagonalizable, being symmetric. It follows that $\dim(E_0(A)) = 1$ and $\dim(E_9(A)) = 2$. Gaussian elimination gives
$$E_0(A) = \operatorname{span}\{\mathbf{x}_1\}, \quad \mathbf{x}_1 = \begin{bmatrix} 1 \\ 2 \\ -2 \end{bmatrix}, \qquad E_9(A) = \operatorname{span}\left\{ \begin{bmatrix} -2 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 2 \\ 0 \\ 1 \end{bmatrix} \right\}$$
The eigenvectors in $E_9(A)$ are both orthogonal to $\mathbf{x}_1$ as Theorem th:symmetric_has_ortho_ev guarantees, but not to
each other. However, the Gram-Schmidt process yields an orthogonal basis
$$\{\mathbf{x}_2, \mathbf{x}_3\} \text{ of } E_9(A), \quad \text{where} \quad \mathbf{x}_2 = \begin{bmatrix} -2 \\ 1 \\ 0 \end{bmatrix} \quad \text{and} \quad \mathbf{x}_3 = \begin{bmatrix} 2 \\ 4 \\ 5 \end{bmatrix}$$
Normalizing gives orthonormal vectors $\left\{ \frac{1}{3}\mathbf{x}_1, \frac{1}{\sqrt{5}}\mathbf{x}_2, \frac{1}{3\sqrt{5}}\mathbf{x}_3 \right\}$, so
$$Q = \begin{bmatrix} \frac{1}{3}\mathbf{x}_1 & \frac{1}{\sqrt{5}}\mathbf{x}_2 & \frac{1}{3\sqrt{5}}\mathbf{x}_3 \end{bmatrix} = \frac{1}{3\sqrt{5}} \begin{bmatrix} \sqrt{5} & -6 & 2 \\ 2\sqrt{5} & 3 & 4 \\ -2\sqrt{5} & 0 & 5 \end{bmatrix}$$
is an orthogonal matrix such that $Q^{-1}AQ = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 9 & 0 \\ 0 & 0 & 9 \end{bmatrix}$ is diagonal.
It is worth noting that other, more convenient, diagonalizing matrices exist. For
example, $\mathbf{y}_2 = \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}$ and $\mathbf{y}_3 = \begin{bmatrix} -2 \\ 2 \\ 1 \end{bmatrix}$ lie in $E_9(A)$ and they are orthogonal. Moreover, they both have norm $3$ (as
does $\mathbf{x}_1$), so
$$Q = \frac{1}{3} \begin{bmatrix} 1 & 2 & -2 \\ 2 & 1 & 2 \\ -2 & 2 & 1 \end{bmatrix}$$
is a nicer orthogonal matrix with the property that $Q^{-1}AQ$ is diagonal.
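As with the previous example, these computations are easy to confirm in Octave:
%Check that the nicer matrix Q is orthogonal and diagonalizes A.
A = [8 -2 2; -2 5 4; 2 4 5]
Q = (1/3)*[1 2 -2; 2 1 2; -2 2 1]
Q*transpose(Q)     %should be the identity
transpose(Q)*A*Q   %should be diag(0, 9, 9)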
Let $A$ be an $n \times n$ matrix. $A$ has an orthonormal set of $n$ eigenvectors if and only if $A$ is
orthogonally diagonalizable.
Proof
Let $\mathbf{q}_1, \mathbf{q}_2, \ldots, \mathbf{q}_n$ be orthonormal eigenvectors of $A$ with corresponding eigenvalues
$\lambda_1, \lambda_2, \ldots, \lambda_n$. We must show $A$ is orthogonally diagonalizable. Let $Q = \begin{bmatrix} \mathbf{q}_1 & \mathbf{q}_2 & \cdots & \mathbf{q}_n \end{bmatrix}$, so that $Q$ is orthogonal. We
have
$$AQ = \begin{bmatrix} A\mathbf{q}_1 & A\mathbf{q}_2 & \cdots & A\mathbf{q}_n \end{bmatrix} = \begin{bmatrix} \lambda_1\mathbf{q}_1 & \lambda_2\mathbf{q}_2 & \cdots & \lambda_n\mathbf{q}_n \end{bmatrix} = QD$$
where $D$ is the diagonal matrix with diagonal entries $\lambda_1, \lambda_2, \ldots, \lambda_n$. But then $Q^{-1}AQ = D$, proving this
half of the theorem.
For the converse, if $A$ is orthogonally diagonalizable, then by Theorem th:PrinAxes it is
symmetric. But then Theorem th:symmetric_has_ortho_ev tells us that eigenvectors corresponding to
distinct eigenvalues are orthogonal. Because $A$ is (orthogonally) diagonalizable,
we know the geometric multiplicity of each eigenvalue is equal to its algebraic
multiplicity. This implies that we can use Gram-Schmidt on each eigenspace of
dimension greater than $1$ to get a full set of $n$ orthogonal eigenvectors, which can then be normalized.
If we are willing to replace “diagonal” by “upper triangular” in the real spectral
theorem, we can weaken the requirement that $A$ is symmetric to insisting only that
$A$ has real eigenvalues.
Schur Triangularization Theorem If $A$ is an $n \times n$ matrix with real eigenvalues, an
orthogonal matrix $Q$ exists such that $Q^TAQ$ is upper triangular.
There is also a lower triangular version of this theorem.
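Octave's built-in schur function computes such a factorization directly. In the sketch below, A is a sample matrix with real eigenvalues chosen for illustration; for a matrix with non-real eigenvalues, schur returns a quasi-upper-triangular matrix instead.
%Schur triangularization: Q is orthogonal and T is upper triangular.
A = [4 1 0; 2 3 0; 1 1 1]   %a sample matrix with real eigenvalues
[Q, T] = schur(A)
Q*transpose(Q)              %should be the identity, up to rounding error
%The eigenvalues of A appear along the main diagonal of T.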
The eigenvalues of an upper triangular matrix are displayed along the main diagonal.
Because $A$ and $Q^TAQ$ have the same determinant and trace whenever $Q$ is orthogonal (for they
are similar matrices), Theorem th:Schur gives:
If $A$ is an $n \times n$ matrix with real eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ (possibly not all distinct), then $\det A = \lambda_1\lambda_2\cdots\lambda_n$ and
$\operatorname{tr} A = \lambda_1 + \lambda_2 + \cdots + \lambda_n$.
This corollary remains true even if the eigenvalues are not real.
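This corollary is also easy to check numerically; the matrix A below is again a sample of our own choosing.
%det(A) is the product, and trace(A) the sum, of the eigenvalues.
A = [4 1 0; 2 3 0; 1 1 1]
ev = eig(A)
[det(A), prod(ev)]      %these two values should agree, up to rounding error
[trace(A), sum(ev)]     %and so should these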
Practice Problems
Suppose $A$ is orthogonally diagonalizable. Prove that $A$ is symmetric. (This is the easy
direction of the “if and only if” in Theorem th:PrinAxes.)
Normalize the rows to make each of the following matrices orthogonal.
If $Q$ is a triangular orthogonal matrix, show that $Q$ is diagonal and that all diagonal
entries are $1$ or $-1$.
We have $Q^{-1} = Q^T$; the first step is to show that $Q^{-1}$ is lower triangular and also upper triangular,
and so $Q$ is diagonal. But then $Q = Q^T = Q^{-1}$, so $Q^2 = I$. This implies that the diagonal entries of $Q$ are all
$\pm 1$.
If $Q$ is orthogonal, show that $kQ$ is orthogonal if and only if $k = 1$ or $k = -1$.
If the first two rows of an orthogonal matrix are $\begin{bmatrix} \frac{1}{3} & \frac{2}{3} & \frac{2}{3} \end{bmatrix}$ and $\begin{bmatrix} \frac{2}{3} & \frac{1}{3} & -\frac{2}{3} \end{bmatrix}$, find all possible third
rows.
For each matrix $A$, find an orthogonal matrix $Q$ such that $Q^{-1}AQ$ is diagonal.
(a)
(b)
(c)
(d)
(e)
(f)
(g)
(challenging problem)
(h)
(challenging problem)
Show that the following are equivalent for a symmetric matrix $A$.
Show that every eigenvalue of $A$ is zero if and only if $A$ is nilpotent ($A^k = 0$ for some
$k \geq 1$).
If $A$ has real eigenvalues, show that $A = B + C$, where $B$ is symmetric and $C$ is nilpotent.
Let $Q$ be an orthogonal matrix.
(a)
Show that $\det Q = 1$ or $\det Q = -1$.
(b)
Give examples of $Q$ such that $\det Q = 1$ and of $Q$ such that $\det Q = -1$.
(c)
If $\det Q = -1$, show that $I + Q$ has no inverse.
$Q^T(I + Q) = (I + Q)^T$.
(d)
If $Q$ is $n \times n$ with $n$ odd and $\det Q = 1$, show that $I - Q$ has no inverse.
We call a square matrix $E$ a projection matrix if $E^2 = E = E^T$.
(a)
If $E$ is a projection matrix, show that $I - 2E$ is orthogonal and symmetric.
(b)
If $P$ is orthogonal and symmetric, show that $E = \frac{1}{2}(I - P)$ is a projection matrix.
(c)
If $U$ is $m \times n$ and $U^TU = I$ (for example, if $U$ is a unit column in $\mathbb{R}^n$), show that $E = UU^T$ is a projection
matrix.
A matrix that we obtain from the identity matrix by writing its rows in a different
order is called a permutation matrix (see Theorem th:LUPA). Show that every permutation
matrix is orthogonal.
If the rows $\mathbf{r}_1, \mathbf{r}_2, \ldots, \mathbf{r}_n$ of the $n \times n$ matrix $A$ are orthogonal, show that the $(i,j)$-entry of $A^{-1}$ is $\frac{a_{ji}}{\|\mathbf{r}_j\|^2}$.
(a)
Let $A$ be an $m \times n$ matrix. Show that the following are equivalent.
i.
$A$ has orthogonal rows.
ii.
$A$ can be factored as $A = DQ$, where $D$ is invertible and diagonal and $Q$ has
orthonormal rows.
iii.
$AA^T$ is an invertible, diagonal matrix.
(b)
Show that an $n \times n$ matrix $A$ has orthogonal rows if and only if $A$ can be factored as $A = DQ$,
where $Q$ is orthogonal and $D$ is diagonal and invertible.
Let $A$ be a skew-symmetric matrix; that is, $A^T = -A$. Assume that $A$ is an $n \times n$ matrix.
(a)
Show that $I + A$ is invertible.
By Theorem thm:004553, it suffices to show that $(I + A)\mathbf{x} = \mathbf{0}$, $\mathbf{x}$ in $\mathbb{R}^n$, implies $\mathbf{x} = \mathbf{0}$.
Compute $\mathbf{x} \cdot \mathbf{x} = \mathbf{x}^T\mathbf{x}$, and use the fact that $A\mathbf{x} = -\mathbf{x}$ and $A^2\mathbf{x} = \mathbf{x}$.
(b)
Show that $P = (I - A)(I + A)^{-1}$ is orthogonal.
(c)
Show that every orthogonal matrix $P$ such that $I + P$ is invertible arises as in part (b)
from some skew-symmetric matrix $A$.
Solve $P = (I - A)(I + A)^{-1}$ for $A$.
Show that the following are equivalent for an $n \times n$ matrix $Q$.
(a)
$Q$ is orthogonal.
(b)
$\|Q\mathbf{x}\| = \|\mathbf{x}\|$ for all $\mathbf{x}$ in $\mathbb{R}^n$.
(c)
$\|Q\mathbf{x} - Q\mathbf{y}\| = \|\mathbf{x} - \mathbf{y}\|$ for all $\mathbf{x}$, $\mathbf{y}$ in $\mathbb{R}^n$.
(d)
$(Q\mathbf{x}) \cdot (Q\mathbf{y}) = \mathbf{x} \cdot \mathbf{y}$ for all columns $\mathbf{x}$, $\mathbf{y}$ in $\mathbb{R}^n$.
For (d) $\Rightarrow$ (a), show that column $i$ of $Q$ equals $Q\mathbf{e}_i$, where $\mathbf{e}_i$ is column $i$ of the identity
matrix.
This exercise shows that linear transformations with orthogonal standard matrices
are distance-preserving (b,c) and angle-preserving (d).
(a)
Show that $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ is an orthogonal matrix.
(b)
Show that every $2 \times 2$ orthogonal matrix has the form $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ or $\begin{bmatrix} \cos\theta & \sin\theta \\ \sin\theta & -\cos\theta \end{bmatrix}$ for some angle $\theta$.