Complex Matrices

Nearly everything we have studied in this book would remain true if the phrase real number were replaced by complex number wherever it occurs. Then we would deal with matrices with complex entries, systems of linear equations with complex coefficients (and complex solutions), determinants of complex matrices, and vector spaces with scalar multiplication by any complex number allowed. Moreover, the proofs of most theorems about (the real version of) these concepts extend easily to the complex case. It is not our intention here to give a full treatment of complex linear algebra. However, we will carry the theory far enough to give another proof of the Real Spectral Theorem (th:PrinAxes).

The set of complex numbers is denoted $\mathbb{C}$. We will use only the most basic properties of these numbers (mainly conjugation and absolute values), and the reader can find this material in Complex Numbers.

If $n \geq 1$, we denote the set of all $n$-tuples of complex numbers by $\mathbb{C}^n$. As with $\mathbb{R}^n$, these $n$-tuples will be written either as row or column matrices and will be referred to as vectors. We define vector operations on $\mathbb{C}^n$ as follows: \begin{align*} [v_{1}, v_{2}, \ldots , v_{n}] + [w_{1}, w_{2}, \ldots , w_{n}] &= [v_{1} + w_{1}, v_{2} + w_{2}, \ldots , v_{n} + w_{n}] \\ u[v_{1}, v_{2}, \ldots , v_{n}] &= [uv_{1}, uv_{2}, \ldots , uv_{n}] \quad \mbox{ for } u \mbox{ in } \mathbb{C} \end{align*}

With these definitions, $\mathbb{C}^n$ satisfies the axioms for a vector space (with complex scalars) given in Abstract Vector Spaces. Thus we can speak of spanning sets for $\mathbb{C}^n$, of linearly independent subsets, and of bases. In all cases, the definitions are identical to the real case, except that the scalars are allowed to be complex numbers. In particular, the standard basis of $\mathbb{R}^n$ remains a basis of $\mathbb{C}^n$, called the standard basis of $\mathbb{C}^n$.
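For readers who want to experiment, here is a minimal numerical sketch of these operations in Python with NumPy (our choice of tooling; any language with complex arithmetic would do). Python writes the imaginary unit as j.

\begin{verbatim}
import numpy as np

# Vectors in C^3; Python writes the imaginary unit as j.
v = np.array([1 + 2j, 3j, -1 + 0j])
w = np.array([2 - 1j, 1 + 1j, 4 + 0j])
u = 2 - 3j  # a complex scalar

print(v + w)  # componentwise addition
print(u * v)  # componentwise scalar multiplication
\end{verbatim}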

A Generalization of the Dot Product for Complex Vectors

There is a natural generalization to $\mathbb{C}^n$ of the dot product in $\mathbb{R}^n$: given $\vec{z} = [z_{1}, z_{2}, \ldots , z_{n}]$ and $\vec{w} = [w_{1}, w_{2}, \ldots , w_{n}]$ in $\mathbb{C}^n$, define their standard inner product by \begin{equation*} \langle \vec{z}, \vec{w} \rangle = z_{1}\overline{w}_{1} + z_{2}\overline{w}_{2} + \cdots + z_{n}\overline{w}_{n} \end{equation*} and define the norm of $\vec{z}$ by $\| \vec{z} \| = \sqrt{\langle \vec{z}, \vec{z} \rangle}$.

Clearly, if $\vec{z}$ and $\vec{w}$ actually lie in $\mathbb{R}^n$, then $\langle \vec{z}, \vec{w} \rangle = \vec{z} \cdot \vec{w}$ is the usual dot product.

Note that $\langle \vec{z}, \vec{w} \rangle$ is a complex number in general, as opposed to requiring an inner product to be real as we do in Inner Product Spaces. However, if $\vec{z} = [z_{1}, z_{2}, \ldots , z_{n}]$, the definition gives $\langle \vec{z}, \vec{z} \rangle = |z_{1}|^2 + |z_{2}|^2 + \cdots + |z_{n}|^2$, which is a nonnegative real number, equal to $0$ if and only if $\vec{z} = \vec{0}$. This explains the conjugation in the definition of $\langle \vec{z}, \vec{w} \rangle$, and it gives th:025575d of the following theorem.

Proof
We leave th:025575a and th:025575b to the reader (Practice Problem prb:complex_matrices10), and th:025575d has already been proved. To prove th:025575c, write $\vec{w} = [w_{1}, w_{2}, \ldots , w_{n}]$ and $\vec{z} = [z_{1}, z_{2}, \ldots , z_{n}]$. Then \begin{align*} \overline{\langle \vec{w}, \vec{z} \rangle } = \overline{w_{1}\overline{z}_{1} + \ldots + w_{n}\overline{z}_{n}} &= \overline{w}_{1}\overline{\overline{z}}_{1} + \ldots + \overline{w}_{n}\overline{\overline{z}}_{n} \\ &= z_{1}\overline{w}_{1} + \ldots + z_{n}\overline{w}_{n} = \langle \vec{z}, \vec{w} \rangle \end{align*}

The only properties of the norm function we will need are the following (the proofs are left to the reader):

A vector $\vec{z}$ in $\mathbb{C}^n$ is called a unit vector if $\| \vec{z} \| = 1$. Property th:025616b in Theorem th:025616 then shows that if $\vec{z}$ is any nonzero vector in $\mathbb{C}^n$, then $\frac{1}{\| \vec{z} \|}\vec{z}$ is a unit vector.
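The inner product and norm are easy to compute numerically. In the NumPy sketch below, the helpers inner and norm are ad hoc names implementing the definitions above; note that NumPy's built-in vdot conjugates its first argument, whereas our convention conjugates the second, so we spell the sum out.

\begin{verbatim}
import numpy as np

def inner(z, w):
    # <z, w> = z_1*conj(w_1) + ... + z_n*conj(w_n); the conjugate
    # falls on the second argument, matching the definition above.
    return np.sum(z * np.conj(w))

def norm(z):
    # ||z|| = sqrt(<z, z>); <z, z> is real and nonnegative.
    return np.sqrt(inner(z, z).real)

z = np.array([2 - 1j, 1 + 1j])
w = np.array([3 + 0j, 1 - 2j])

print(inner(z, w))                                    # complex in general
print(np.isclose(np.conj(inner(w, z)), inner(z, w)))  # conjugate symmetry
print(np.isclose(norm(z / norm(z)), 1.0))             # normalizing gives a unit vector
\end{verbatim}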

A matrix $A = \left [ a_{ij} \right ]$ is called a complex matrix if every entry $a_{ij}$ is a complex number. The notion of conjugation for complex numbers extends to matrices as follows: Define the conjugate of $A = \left [ a_{ij} \right ]$ to be the matrix \begin{equation*} \overline{A} = \left [ \begin{array}{c} \overline{a}_{ij} \end{array}\right ] \end{equation*} obtained from $A$ by conjugating every entry. Then (using Appendix chap:appacomplexnumbers) \begin{equation*} \overline{A + B} = \overline{A} + \overline{B} \quad \mbox{ and } \quad \overline{AB} = \overline{A} \; \overline{B} \end{equation*} holds for all (complex) matrices of appropriate size.

Transposition of complex matrices is defined just as in the real case, and the following notion is fundamental: the conjugate transpose $A^H$ of a complex matrix $A$ is defined by \begin{equation*} A^H = (\overline{A})^T = \overline{(A^T)} \end{equation*}

Observe that $A^H = A^T$ when $A$ is real.

The following properties of $A^H$ follow easily from the rules for transposition of real matrices and extend these rules to complex matrices. Note the conjugate in property th:025659c.
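In NumPy the conjugate transpose of A is obtained as A.conj().T. The following sketch spot-checks two of these properties; note the conjugate on the scalar in the first check.

\begin{verbatim}
import numpy as np

A = np.array([[1 + 1j, 2j],
              [3 - 1j, 4 + 0j]])
AH = A.conj().T   # the conjugate transpose A^H

z = 2 - 5j
print(np.allclose((z * A).conj().T, np.conj(z) * AH))  # (zA)^H = conj(z) A^H
print(np.allclose(AH.conj().T, A))                     # (A^H)^H = A
\end{verbatim}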

Hermitian and Unitary Matrices

If $A$ is a real symmetric matrix, it is clear that $A^H = A$. The complex matrices that satisfy this condition turn out to be the most natural generalization of the real symmetric matrices: a square complex matrix $A$ is called Hermitian if $A^H = A$, equivalently if $\overline{A} = A^T$.

Hermitian matrices are easy to recognize because the entries on the main diagonal must be real, and the “reflection” of each off-diagonal entry in the main diagonal must be the conjugate of that entry.
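This recognition test is easy to automate. In the sketch below, is_hermitian is an ad hoc helper that checks the condition $A^H = A$ numerically.

\begin{verbatim}
import numpy as np

def is_hermitian(A):
    # Hermitian means A^H = A.
    return np.allclose(A.conj().T, A)

# Real diagonal; each off-diagonal entry is the conjugate of its
# reflection in the main diagonal.
A = np.array([[3 + 0j, 2 - 1j],
              [2 + 1j, 5 + 0j]])
print(is_hermitian(A))       # True
print(is_hermitian(1j * A))  # False
\end{verbatim}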

The following theorem extends Theorem th:dotpSymmetric, and gives a very useful characterization of Hermitian matrices in terms of the standard inner product in $\mathbb{C}^n$: an $n \times n$ complex matrix $A$ is Hermitian if and only if $\langle A\vec{z}, \vec{w} \rangle = \langle \vec{z}, A\vec{w} \rangle$ for all $\vec{z}$ and $\vec{w}$ in $\mathbb{C}^n$.

Proof
If $A$ is Hermitian, we have $A^T = \overline{A}$. If $\vec{z}$ and $\vec{w}$ are columns in $\mathbb{C}^n$, then $\langle \vec{z}, \vec{w} \rangle = \vec{z}^T\overline{\vec{w}}$, so \begin{equation*} \langle A\vec{z}, \vec{w} \rangle =(A\vec{z})^T\overline{\vec{w}} = \vec{z}^TA^T\overline{\vec{w}} = \vec{z}^T\overline{A} \; \overline{\vec{w}} = \vec{z}^T(\overline{A\vec{w}}) = \langle \vec{z}, A\vec{w} \rangle \end{equation*} To prove the converse, let $\vec{e}_j$ denote column $j$ of the $n \times n$ identity matrix. If $A = \left [ a_{ij} \right ]$, the condition gives \begin{equation*} \overline{a}_{ij} = \langle \vec{e}_{i}, A\vec{e}_{j} \rangle = \langle A\vec{e}_{i}, \vec{e}_{j} \rangle = a_{ji} \end{equation*} Hence $\overline{A} = A^T$, so $A$ is Hermitian.
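A quick numerical sanity check of this characterization (a sketch; the test matrix is built as $B + B^H$, which is always Hermitian, and inner is the helper from before):

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def inner(z, w):
    return np.sum(z * np.conj(w))

# B + B^H is Hermitian for any square complex B.
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = B + B.conj().T

z = rng.standard_normal(3) + 1j * rng.standard_normal(3)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)
print(np.isclose(inner(A @ z, w), inner(z, A @ w)))  # True for Hermitian A
\end{verbatim}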

Let $A$ be an $n \times n$ complex matrix. As in the real case, a complex number $\lambda$ is called an eigenvalue of $A$ if $A\vec{z} = \lambda \vec{z}$ holds for some column $\vec{z} \neq \vec{0}$ in $\mathbb{C}^n$. In this case $\vec{z}$ is called an eigenvector of $A$ corresponding to $\lambda$.

If $A$ is an $n \times n$ matrix, the characteristic polynomial $c_A(x)$ is a polynomial of degree $n$ and the eigenvalues of $A$ are just the roots of $c_A(x)$. In most of our examples these roots have been real numbers (in fact, the examples have been carefully chosen so this will be the case!); but it need not happen, even when the characteristic polynomial has real coefficients. For example, if $A = \left [ \begin{array}{rr} 0 & 1 \\ -1 & 0 \end{array}\right ]$ then $c_A(x) = x^2 + 1$ has roots $i$ and $-i$, where $i$ is a complex number satisfying $i^2 = -1$. Therefore, we have to deal with the possibility that the eigenvalues of a (real) square matrix might be complex numbers.
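Assuming the $2 \times 2$ matrix above, this is easy to confirm numerically:

\begin{verbatim}
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# c_A(x) = x^2 + 1 has no real roots; over C the eigenvalues are +-i.
print(np.linalg.eigvals(A))  # approximately [0.+1.j, 0.-1.j]
\end{verbatim}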

For a complex matrix $A$, $c_A(z)$ has complex coefficients (possibly nonreal). However, an argument like that given in Exploration exp:slowdown of The Characteristic Equation still works to show that the eigenvalues of $A$ are the roots (possibly complex) of $c_A(z)$.

It is at this point that the advantage of working with complex numbers becomes apparent. The real numbers are incomplete in the sense that the characteristic polynomial of a real matrix may fail to have all its roots real. However, this difficulty does not occur for the complex numbers. The so-called fundamental theorem of algebra ensures that every polynomial of positive degree with complex coefficients has a complex root. Hence every square complex matrix $A$ has a (complex) eigenvalue. Indeed (see th:034210), $c_A(z)$ factors completely as follows: \begin{equation*} c_{A}(z) = (z -\lambda _{1})(z -\lambda _{2}) \cdots (z -\lambda _{n}) \end{equation*} where $\lambda_{1}, \lambda_{2}, \ldots, \lambda_{n}$ are the eigenvalues of $A$ (with possible repetitions due to multiple roots).

The next result extends Theorem th:symmetric_has_ortho_ev, which asserts that eigenvectors of a symmetric real matrix corresponding to distinct eigenvalues are orthogonal. In the complex context, two $n$-tuples $\vec{z}$ and $\vec{w}$ in $\mathbb{C}^n$ are said to be orthogonal if $\langle \vec{z}, \vec{w} \rangle = 0$. The result states that if $A$ is Hermitian, then (1) every eigenvalue of $A$ is real, and (2) eigenvectors of $A$ corresponding to distinct eigenvalues are orthogonal.

Proof
Let $\lambda$ and $\mu$ be eigenvalues of $A$ with (nonzero) eigenvectors $\vec{z}$ and $\vec{w}$. Then $A\vec{z} = \lambda \vec{z}$ and $A\vec{w} = \mu \vec{w}$, so Theorem th:025697 gives \begin{equation} \label{eigenvalEq} \lambda \langle \vec{z}, \vec{w} \rangle = \langle \lambda \vec{z}, \vec{w} \rangle = \langle A\vec{z}, \vec{w} \rangle = \langle \vec{z}, A\vec{w} \rangle = \langle \vec{z}, \mu \vec{w} \rangle = \overline{\mu } \langle \vec{z}, \vec{w} \rangle \end{equation} If $\mu = \lambda$ and $\vec{w} = \vec{z}$, this becomes $\lambda \langle \vec{z}, \vec{z} \rangle = \overline{\lambda} \langle \vec{z}, \vec{z} \rangle$. Because $\langle \vec{z}, \vec{z} \rangle = \| \vec{z} \|^2 \neq 0$, this implies $\lambda = \overline{\lambda}$. Thus $\lambda$ is real, proving (1). Similarly, $\mu$ is real, so equation (eigenvalEq) gives $\lambda \langle \vec{z}, \vec{w} \rangle = \mu \langle \vec{z}, \vec{w} \rangle$. If $\lambda \neq \mu$, this implies $\langle \vec{z}, \vec{w} \rangle = 0$, proving (2).
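Both conclusions can be observed numerically. In the sketch below, the Hermitian matrix has distinct eigenvalues ($1$ and $4$), and the computed eigenvectors are orthogonal with respect to $\langle \ , \ \rangle$:

\begin{verbatim}
import numpy as np

A = np.array([[2 + 0j, 1 - 1j],
              [1 + 1j, 3 + 0j]])  # Hermitian, eigenvalues 1 and 4

evals, evecs = np.linalg.eig(A)
print(np.allclose(evals.imag, 0))  # (1) the eigenvalues are real

z, w = evecs[:, 0], evecs[:, 1]
print(np.isclose(np.sum(z * np.conj(w)), 0))  # (2) <z, w> = 0
\end{verbatim}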

Proof
Symmetric real matrices are Hermitian, and so the result follows immediately from Theorem th:025729.

The Real Spectral Theorem (th:PrinAxes) asserts that every real symmetric matrix $A$ is orthogonally diagonalizable; that is, $P^TAP$ is diagonal where $P$ is an orthogonal matrix ($P^{-1} = P^T$). The next theorem identifies the complex analogs of these orthogonal real matrices: a square complex matrix $U$ is called unitary if $U^{-1} = U^H$, and Theorem th:025759 shows this is equivalent to the columns of $U$ being orthonormal in $\mathbb{C}^n$, and likewise for the rows.

Proof
If $U$ is a complex matrix with $j$th column $\vec{c}_j$, then the $(i, j)$-entry of $U^HU$ is $\langle \vec{c}_j, \vec{c}_i \rangle$, as in Theorem thm:024227. Now th:025759a $\Leftrightarrow$ th:025759b follows, and th:025759a $\Leftrightarrow$ th:025759c is proved in the same way.

Thus a real matrix is unitary if and only if it is orthogonal.
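For instance, the following sketch verifies that a standard nonreal unitary matrix satisfies $U^HU = I$ and has orthonormal columns:

\begin{verbatim}
import numpy as np

U = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                 [1j, 1]])  # unitary but not real

print(np.allclose(U.conj().T @ U, np.eye(2)))  # U^H U = I, so U^{-1} = U^H
c0, c1 = U[:, 0], U[:, 1]
print(np.isclose(np.sum(c0 * np.conj(c1)), 0))       # columns are orthogonal
print(np.isclose(np.sum(c0 * np.conj(c0)).real, 1))  # and unit vectors
\end{verbatim}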

Given a real symmetric matrix $A$, we saw in Orthogonal Matrices and Symmetric Matrices a procedure for finding an orthogonal matrix $P$ such that $P^TAP$ is diagonal (see Example ex:DiagonalizeSymmetricMatrix). The following example illustrates Theorem th:025729 and shows that the technique works for complex matrices.

Unitary Diagonalization

An $n \times n$ complex matrix $A$ is called unitarily diagonalizable if $U^HAU$ is diagonal for some unitary matrix $U$. As Example ex:025794 suggests, we are going to prove that every Hermitian matrix is unitarily diagonalizable. However, with only a little extra effort, we can get a very important theorem that has this result as an easy consequence.

A complex matrix $T$ is called upper triangular if every entry below the main diagonal is zero. We owe the following theorem to Issai Schur: if $A$ is any $n \times n$ complex matrix, there exists a unitary matrix $U$ such that $U^HAU = T$ is upper triangular, and the entries on the main diagonal of $T$ are the eigenvalues of $A$ (including multiplicities).

Proof
We use induction on $n$, mirroring the form of the proof of th:PrinAxes. If $n = 1$, $A$ is already upper triangular. If $n > 1$, assume the theorem is valid for $(n - 1) \times (n - 1)$ complex matrices. Let $\lambda_1$ be an eigenvalue of $A$, and let $\vec{y}_1$ be an eigenvector with $\| \vec{y}_1 \| = 1$. Next, the (complex analog of the) Gram-Schmidt process provides $\vec{y}_2, \ldots, \vec{y}_n$ such that $\{ \vec{y}_1, \vec{y}_2, \ldots, \vec{y}_n \}$ is an orthonormal basis of $\mathbb{C}^n$. If $U_1$ is the unitary matrix with these vectors as its columns, then \begin{equation*} U_{1}^HAU_{1} = \left [ \begin{array}{cc} \lambda _{1} & X_{1} \\ 0 & A_{1} \end{array}\right ] \end{equation*} in block form. Now apply induction to find a unitary $(n - 1) \times (n - 1)$ matrix $W_1$ such that $W_1^HA_1W_1 = T_1$ is upper triangular. Then $U_2 = \left [ \begin{array}{cc} 1 & 0 \\ 0 & W_{1} \end{array}\right ]$ is a unitary matrix. Hence $U = U_1U_2$ is unitary (using Theorem th:025759), and \begin{align*} U^HAU &= U_{2}^H(U_{1}^HAU_{1})U_{2} \\ &= \left [ \begin{array}{cc} 1 & 0 \\ 0 & W_{1}^H \end{array}\right ] \left [ \begin{array}{cc} \lambda _{1} & X_{1} \\ 0 & A_{1} \end{array}\right ] \left [ \begin{array}{cc} 1 & 0 \\ 0 & W_{1} \end{array}\right ] = \left [ \begin{array}{cc} \lambda _{1} & X_{1}W_{1} \\ 0 & T_{1} \end{array}\right ] \end{align*}

is upper triangular. Finally, $A$ and $U^HAU = T$ have the same eigenvalues by (the complex version of) Theorem th:properties_similar_eig, and they are the diagonal entries of $T$ because $T$ is upper triangular.
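Numerically, the complex Schur decomposition is available in SciPy as scipy.linalg.schur with output='complex'. A sketch:

\begin{verbatim}
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# Complex Schur form: A = U T U^H with U unitary and T upper triangular.
T, U = schur(A, output='complex')

print(np.allclose(U.conj().T @ A @ U, T))  # U^H A U = T
print(np.allclose(np.tril(T, -1), 0))      # T is upper triangular
print(np.allclose(np.sort_complex(np.diag(T)),
                  np.sort_complex(np.linalg.eigvals(A))))  # diagonal = eigenvalues
\end{verbatim}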

The fact that similar matrices have the same traces and determinants gives the following consequence of Schur’s theorem: if $A$ is an $n \times n$ complex matrix with eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$, then $\det A = \lambda_1\lambda_2 \cdots \lambda_n$ and $\mbox{tr } A = \lambda_1 + \lambda_2 + \cdots + \lambda_n$.

Schur’s theorem asserts that every complex matrix can be “unitarily triangularized.” However, we cannot substitute “unitarily diagonalized” here. In fact, if $A = \left [ \begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right ]$, there is no invertible complex matrix $U$ at all such that $U^{-1}AU$ is diagonal. However, the situation is much better for Hermitian matrices.

Proof
By Schur’s theorem, let $U^HAU = T$ be upper triangular where $U$ is unitary. Since $A$ is Hermitian, this gives \begin{equation*} T^H = (U^HAU)^H = U^HA^HU^{HH} = U^HAU = T \end{equation*} This means that $T$ is both upper and lower triangular. Hence $T$ is actually diagonal.
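NumPy's eigh routine is designed for exactly this situation: for a Hermitian matrix it returns real eigenvalues together with a unitary matrix of eigenvectors, giving the unitary diagonalization directly. A sketch:

\begin{verbatim}
import numpy as np

A = np.array([[4 + 0j, 3 - 1j],
              [3 + 1j, 1 + 0j]])  # Hermitian

# eigh returns real eigenvalues and a unitary eigenvector matrix U.
evals, U = np.linalg.eigh(A)
print(np.allclose(U.conj().T @ A @ U, np.diag(evals)))  # U^H A U is diagonal
\end{verbatim}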

The Real Spectral Theorem asserts that a real matrix $A$ is symmetric if and only if it is orthogonally diagonalizable (that is, $P^TAP$ is diagonal for some real orthogonal matrix $P$). Theorem th:Spectral Theorem is the complex analog of half of this result. However, the converse is false for complex matrices: there exist unitarily diagonalizable matrices that are not Hermitian.

There is a very simple way to characterize those complex matrices that are unitarily diagonalizable. To this end, an $n \times n$ complex matrix $N$ is called normal if $NN^H = N^HN$. It is clear that every Hermitian or unitary matrix is normal, as is the matrix in Example exa:025874. In fact we have the following result: an $n \times n$ complex matrix $A$ is unitarily diagonalizable if and only if $A$ is normal.

Proof
Assume first that $U^HAU = D$, where $U$ is unitary and $D$ is diagonal. Then $DD^H = D^HD$, as is easily verified. Because $DD^H = U^H(AA^H)U$ and $D^HD = U^H(A^HA)U$, it follows by cancellation that $AA^H = A^HA$.

Conversely, assume $A$ is normal, that is, $AA^H = A^HA$. By Schur’s theorem, let $U^HAU = T$, where $T$ is upper triangular and $U$ is unitary. Then $T$ is normal too: \begin{equation*} TT^H = U^H(AA^H)U = U^H(A^HA)U = T^HT \end{equation*} Hence it suffices to show that a normal $n \times n$ upper triangular matrix $T$ must be diagonal. We induct on $n$; it is clear if $n = 1$. If $n > 1$, write $T = \left [ t_{ij} \right ]$. Then equating $(1, 1)$-entries in $TT^H$ and $T^HT$ gives \begin{equation*} |t_{11}|^2 + |t_{12}|^2 + \ldots + |t_{1n}|^2 = |t_{11}|^2 \end{equation*} This implies $t_{12} = t_{13} = \cdots = t_{1n} = 0$, so $T = \left [ \begin{array}{cc} t_{11} & 0 \\ 0 & T_{1} \end{array}\right ]$ in block form. Hence $T^H = \left [ \begin{array}{cc} \overline{t}_{11} & 0 \\ 0 & T_{1}^H \end{array}\right ]$, so $TT^H = T^HT$ implies $T_1T_1^H = T_1^HT_1$. Thus $T_1$ is diagonal by induction, and the proof is complete.
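The following sketch checks both directions of this result on a matrix that is neither Hermitian nor unitary (is_normal is an ad hoc helper name):

\begin{verbatim}
import numpy as np
from scipy.linalg import schur

def is_normal(A):
    AH = A.conj().T
    return np.allclose(A @ AH, AH @ A)

# Neither Hermitian nor unitary, yet normal:
A = np.array([[1 + 1j, 1 + 0j],
              [-1 + 0j, 1 + 1j]])
print(is_normal(A))  # True

# For a normal matrix, the Schur triangular form T is in fact diagonal,
# exhibiting a unitary diagonalization of A.
T, U = schur(A, output='complex')
print(np.allclose(T, np.diag(np.diag(T))))  # True
\end{verbatim}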

We conclude this section by using Schur’s theorem (Theorem th:025814) to prove a famous theorem about matrices, the Cayley-Hamilton theorem: every square complex matrix $A$ satisfies its characteristic polynomial, that is, $c_A(A) = 0$. Recall that the characteristic polynomial of a square matrix $A$ is defined by $c_A(z) = \det (zI - A)$, and that the eigenvalues of $A$ are just the roots of $c_A(z)$.

Proof
If $p(x)$ is any polynomial with complex coefficients, then $p(P^{-1}AP) = P^{-1}p(A)P$ for any invertible complex matrix $P$. Hence, by Schur’s theorem, we may assume that $A$ is upper triangular. Then the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$ of $A$ appear along the main diagonal, so \begin{equation*} c_{A}(z) = (z - \lambda _{1})(z - \lambda _{2})(z - \lambda _{3}) \cdots (z -\lambda _{n}) \end{equation*} Thus \begin{equation*} c_{A}(A) = (A - \lambda _{1}I)(A - \lambda _{2}I)(A - \lambda _{3}I) \cdots (A - \lambda _{n}I) \end{equation*} Note that each matrix $A - \lambda_iI$ is upper triangular. Now observe:
(a)
$A - \lambda_1I$ has zero first column because column 1 of $A$ is $\lambda_1\vec{e}_1$.
(b)
Then $(A - \lambda_1I)(A - \lambda_2I)$ has the first two columns zero because the second column of $A - \lambda_2I$ is $b\vec{e}_1$ for some constant $b$.
(c)
Next $(A - \lambda_1I)(A - \lambda_2I)(A - \lambda_3I)$ has the first three columns zero because column 3 of $A - \lambda_3I$ is $c\vec{e}_1 + d\vec{e}_2$ for some constants $c$ and $d$.

Continuing in this way we see that $(A - \lambda_1I)(A - \lambda_2I) \cdots (A - \lambda_nI)$ has all $n$ columns zero; that is, $c_A(A) = 0$.
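The theorem is easy to test numerically: build $c_A(A)$ from the computed eigenvalues and check that it vanishes. A sketch:

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Evaluate c_A(A) = (A - l1*I)(A - l2*I)(A - l3*I) at the matrix A itself.
cA = np.eye(3, dtype=complex)
for lam in np.linalg.eigvals(A):
    cA = cA @ (A - lam * np.eye(3))

print(np.allclose(cA, 0))  # Cayley-Hamilton: c_A(A) = 0
\end{verbatim}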

Practice Problems

In each case, compute the norm of the complex vector.
(a)
(b)
(c)
(d)
In each case, determine whether the two vectors are orthogonal.
(a)
,
(b)
,
(c)
,
(d)
,
A subset $W$ of $\mathbb{C}^n$ is called a complex subspace of $\mathbb{C}^n$ if it contains $\vec{0}$ and if, given $\vec{v}$ and $\vec{w}$ in $W$, both $\vec{v} + \vec{w}$ and $u\vec{v}$ lie in $W$ ($u$ any complex number). In each case, determine whether $W$ is a complex subspace of $\mathbb{C}^n$.
(a)
Not a subspace. For example, is not in .
(b)
(c)
(d)
In each case, find a basis over $\mathbb{C}$, and determine the dimension of the complex subspace $W$ of $\mathbb{C}^n$ (see the previous exercise).
(a)
Basis ; dimension
(b)
(c)
Basis ; dimension
(d)
In each case, determine whether the given matrix is Hermitian, unitary, or normal.
(a)
Normal only
(b)
(c)
Hermitian (and normal), not unitary
(d)
(e)
(f)
None of these adjectives apply.
(g)
(h)
,
Unitary (and normal); Hermitian if and only if is real
Show that a matrix $A$ is normal if and only if $\overline{A}A^T = A^T\overline{A}$.
Let where , , and are complex numbers. Characterize in terms of , , and when is
(a)
Hermitian
(b)
unitary
(c)
normal.
In each case, find a unitary matrix $U$ such that $U^HAU$ is diagonal.
(a)
(b)
,
(c)
; , , real
(d)
,
(e)
(f)
,
Show that $\langle A\vec{z}, \vec{w} \rangle = \langle \vec{z}, A^H\vec{w} \rangle$ holds for all $n \times n$ complex matrices $A$ and for all $n$-tuples $\vec{z}$ and $\vec{w}$ in $\mathbb{C}^n$.
(a)
Prove th:025575a and th:025575b of Theorem th:025575.
(b)
Prove Theorem th:025616.
(c)
Prove Theorem th:025659.
a.
Show that $A$ is Hermitian if and only if $\overline{A} = A^T$.
b.
Show that the diagonal entries of any Hermitian matrix are real.
If the $(j, j)$-entry of $A$ is $a_{jj}$, then the $(j, j)$-entry of $\overline{A}$ is $\overline{a}_{jj}$, so the $(j, j)$-entry of $(\overline{A})^T$ is $\overline{a}_{jj}$. This equals $a_{jj}$ because $A$ is Hermitian, so $a_{jj}$ is real.
(a)
Show that every complex matrix $Z$ can be written uniquely in the form $Z = A + iB$, where $A$ and $B$ are real matrices.
(b)
If $Z = A + iB$ as in (a), show that $Z$ is Hermitian if and only if $A$ is symmetric, and $B$ is skew-symmetric (that is, $B^T = -B$).
If $Z$ is any complex matrix, show that $ZZ^H$ and $Z + Z^H$ are Hermitian.
A complex matrix $B$ is called skew-Hermitian if $B^H = -B$.
a.
Show that $Z - Z^H$ is skew-Hermitian for any square complex matrix $Z$.
b.
If $B$ is skew-Hermitian, show that $B^2$ and $iB$ are Hermitian.
Show that $(B^2)^H = (B^H)^2 = (-B)^2 = B^2$; $(iB)^H = \overline{i}B^H = (-i)(-B) = iB$.
c.
If $B$ is skew-Hermitian, show that the eigenvalues of $B$ are pure imaginary ($i\mu$ for real $\mu$).
d.
Show that every square complex matrix $Z$ can be written uniquely as $Z = A + B$, where $A$ is Hermitian and $B$ is skew-Hermitian.
If $Z = A + B$, as given, first show that $Z^H = A - B$, and hence that $A = \frac{1}{2}(Z + Z^H)$ and $B = \frac{1}{2}(Z - Z^H)$.
Let $U$ be a unitary matrix. Show that:
(a)
$\|U\vec{z}\| = \|\vec{z}\|$ for all columns $\vec{z}$ in $\mathbb{C}^n$.
(b)
$|\lambda| = 1$ for every eigenvalue $\lambda$ of $U$.
(a)
If $Z$ is an invertible complex matrix, show that $Z^H$ is invertible and that $(Z^H)^{-1} = (Z^{-1})^H$.
(b)
Show that the inverse of a unitary matrix is again unitary.
If $U$ is unitary, $(U^{-1})^{-1} = U = (U^H)^H = (U^{-1})^H$, so $U^{-1}$ is unitary.
(c)
If is unitary, show that is unitary.
Let $\vec{v}$ be an $n \times 1$ matrix such that $\vec{v}^H\vec{v} = 1$ (for example, $\vec{v}$ is a unit column in $\mathbb{C}^n$).
(a)
Show that $H = I - 2\vec{v}\vec{v}^H$ is Hermitian and satisfies $H^2 = I$.
(b)
Show that $H$ is both unitary and Hermitian (so $H^{-1} = H^H = H$).
(a)
If $A$ is normal, show that $zA$ is also normal for all complex numbers $z$.
(b)
Show that (a) fails if normal is replaced by Hermitian.
$I$ is Hermitian but $iI$ is not.
Show that a real $2 \times 2$ normal matrix $A$ is either symmetric or has the form $\left [ \begin{array}{rr} a & b \\ -b & a \end{array}\right ]$.
If $A$ is Hermitian, show that all the coefficients of $c_A(z)$ are real numbers.
(a)
If $A = \left [ \begin{array}{cc} 1 & 1 \\ 0 & 1 \end{array}\right ]$, show that $U^{-1}AU$ is not diagonal for any invertible complex matrix $U$.
(b)
If $A = \left [ \begin{array}{rr} 0 & 1 \\ -1 & 0 \end{array}\right ]$, show that $U^{-1}AU$ is not upper triangular for any real invertible matrix $U$.
Let $U = \left [ \begin{array}{cc} x & z \\ y & w \end{array}\right ]$ be real and invertible, and assume that $U^{-1}AU = \left [ \begin{array}{cc} a & b \\ 0 & c \end{array}\right ]$ is upper triangular. Then $AU = U \left [ \begin{array}{cc} a & b \\ 0 & c \end{array}\right ]$, and comparing first column entries gives $y = ax$ and $-x = ay$. Hence $a$ is real ($x$ and $y$ are both real and are not both $0$), and substituting gives $-x = a^2x$ and $-y = a^2y$. Thus $a^2 = -1$, a contradiction.
If $A$ is any $n \times n$ matrix, show that $U^HAU$ is lower triangular for some unitary matrix $U$.
If $A$ is a $3 \times 3$ matrix, show that $A^2 = 0$ if and only if there exists a unitary matrix $U$ such that $U^HAU$ has the form $\left [ \begin{array}{ccc} 0 & 0 & u \\ 0 & 0 & v \\ 0 & 0 & 0 \end{array}\right ]$ or the form $\left [ \begin{array}{ccc} 0 & u & v \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{array}\right ]$.
If $A^2 = A$, show that $\mbox{rank } A = \mbox{tr } A$. [Hint: Use Schur’s theorem.]

Text Source

This section was adapted from Section 8.6 of Keith Nicholson’s Linear Algebra with Applications. (CC-BY-NC-SA)

W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition, pp. 445–456.
