Positive Definite Matrices
All the eigenvalues of any symmetric matrix are real (proved later in cor:ews_symmetric_real); this section is about the case in which all of those eigenvalues are positive. These matrices, which arise whenever optimization (maximum and minimum) problems are encountered, have countless applications throughout science and engineering. They also arise in statistics (for example, in factor analysis used in the social sciences) and in geometry (quadratic forms, for instance). We will encounter them again in Inner Product Spaces when describing all inner products in $\mathbb{R}^n$.
Because these matrices are symmetric, the Real Spectral Theorem (th:PrinAxes) plays a central role in the theory. Throughout this section, a square matrix $A$ is called positive definite if it is symmetric and all its eigenvalues $\lambda$ are positive, that is, $\lambda > 0$. The first observation (Theorem thm:024815) is that every positive definite matrix has positive determinant.
- Proof
- If $A$ is $n \times n$ and the eigenvalues are $\lambda_1, \lambda_2, \dots, \lambda_n$, then $\det A = \lambda_1\lambda_2 \cdots \lambda_n > 0$ by the Real Spectral Theorem (or, more specifically, cor:det_and_tr).
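As a quick numerical sanity check of this step (a NumPy sketch; the $2 \times 2$ matrix is an arbitrary illustration, not from the text), the determinant of a symmetric matrix equals the product of its eigenvalues, so it is positive when they all are:

```python
import numpy as np

# An arbitrary symmetric positive definite matrix (illustrative choice).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

eigenvalues = np.linalg.eigvalsh(A)  # real eigenvalues of a symmetric matrix
det_A = np.linalg.det(A)

# det A is the product of the eigenvalues: here det A = 4*3 - 1 = 11 > 0.
assert np.isclose(det_A, np.prod(eigenvalues))
assert np.all(eigenvalues > 0)
```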
If $\vec{x}$ is a column in $\mathbb{R}^n$ and $A$ is any real $n \times n$ matrix, we view the $1 \times 1$ matrix $\vec{x}^TA\vec{x}$ as a real number. With this convention, we have the following characterization of positive definite matrices: a symmetric matrix $A$ is positive definite if and only if $\vec{x}^TA\vec{x} > 0$ for every column $\vec{x} \neq \vec{0}$ in $\mathbb{R}^n$.
- Proof
- $A$ is symmetric. Using the Real Spectral Theorem, let $A = QDQ^T$ where $Q^{-1} = Q^T$ and $D = \text{diag}(\lambda_1, \lambda_2, \dots, \lambda_n)$, the $\lambda_i$ being the eigenvalues of $A$. Given a column $\vec{x}$ in $\mathbb{R}^n$, write $\vec{y} = Q^T\vec{x} = \begin{bmatrix} y_1 & y_2 & \cdots & y_n \end{bmatrix}^T$. Then \begin{equation} \label{symmetricEq} \vec{x}^TA\vec{x} = \vec{x}^T(QDQ^T)\vec{x} = \vec{y}^TD\vec{y} = \lambda _{1}y_{1}^2 + \lambda _{2}y_{2}^2 + \dots + \lambda _{n}y_{n}^2 \end{equation} If $A$ is positive definite and $\vec{x} \neq \vec{0}$, then $\vec{x}^TA\vec{x} > 0$ by (symmetricEq) because some $y_j \neq 0$ and every $\lambda_i > 0$. Conversely, if $\vec{x}^TA\vec{x} > 0$ whenever $\vec{x} \neq \vec{0}$, let $\vec{x} = Q\vec{e}_j \neq \vec{0}$ where $\vec{e}_j$ is column $j$ of $I_n$. Then $\vec{y} = \vec{e}_j$, so (symmetricEq) reads $\lambda_j = \vec{x}^TA\vec{x} > 0$.
Note that Theorem thm:024830 shows that the positive definite matrices are exactly the symmetric matrices $A$ for which the quadratic form $q = \vec{x}^TA\vec{x}$ takes only positive values for $\vec{x} \neq \vec{0}$.
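This characterization is easy to probe numerically (a NumPy sketch; the two matrices and the random trial vectors are illustrative assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric with eigenvalues 1 and 3, hence positive definite.
A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

# x^T A x should be positive for every nonzero x; sample many random vectors.
for _ in range(1000):
    x = rng.standard_normal(2)
    if np.linalg.norm(x) > 0:
        assert x @ A @ x > 0

# Symmetric but NOT positive definite (eigenvalues 1 and -1).
B = np.array([[1.0,  0.0],
              [0.0, -1.0]])
x = np.array([0.0, 1.0])
print(x @ B @ x)  # -1.0: the quadratic form takes a negative value
```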
The next example of a positive definite matrix is one we will see again in SVD Decomposition: if $U$ is any invertible $n \times n$ matrix, then $A = U^TU$ is positive definite (Example exa:024865).
It is remarkable that the converse to Example exa:024865 is also true. In fact, every positive definite matrix $A$ can be factored as $A = U^TU$ where $U$ is an upper triangular matrix with positive elements on the main diagonal. However, before verifying this, we introduce another concept that is central to any discussion of positive definite matrices.
If $A$ is any $n \times n$ matrix, let ${}^{(r)}A$ denote the $r \times r$ submatrix in the upper left corner of $A$; that is, ${}^{(r)}A$ is the matrix obtained from $A$ by deleting the last $n - r$ rows and columns. The matrices ${}^{(1)}A, {}^{(2)}A, \dots, {}^{(n)}A = A$ are called the principal submatrices of $A$. Lemma lem:024890 below asserts that if $A$ is positive definite, then each principal submatrix ${}^{(r)}A$ is also positive definite.
- Proof
- Write $A = \left[ \begin{array}{cc} ^{(r)}A & P \\ Q & R \end{array}\right]$ in block form. If $\vec{y} \neq \vec{0}$ in $\mathbb{R}^r$, write $\vec{x} = \left[ \begin{array}{c} \vec{y} \\ \vec{0} \end{array}\right]$ in $\mathbb{R}^n$.
Then $\vec{x} \neq \vec{0}$, so the fact that $A$ is positive definite gives \begin{equation*} 0 < \vec{x}^TA\vec{x} = \left [ \begin{array}{rr} \vec{y}^T & \vec{0} \end{array}\right ] \left [ \begin{array}{rr} ^{(r)}A & P \\ Q & R \end{array}\right ] \left [ \begin{array}{r} \vec{y} \\ \vec{0} \end{array}\right ] = \vec{y}^T(^{(r)}A)\vec{y} \end{equation*} This shows that ${}^{(r)}A$ is positive definite by Theorem thm:024830. (A similar argument shows that, if $B$ is any matrix obtained from a positive definite matrix $A$ by deleting certain rows and deleting the same columns, then $B$ is also positive definite.)
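Numerically, ${}^{(r)}A$ is just a leading slice of the array, and the lemma is easy to spot-check (a NumPy sketch; the matrix and function name are illustrative assumptions, not from the text):

```python
import numpy as np

# An arbitrary symmetric positive definite matrix (illustrative choice).
A = np.array([[10.0, 5.0, 2.0],
              [ 5.0, 3.0, 2.0],
              [ 2.0, 2.0, 3.0]])

def principal_submatrix(A, r):
    """Return ^(r)A: drop the last n - r rows and columns of A."""
    return A[:r, :r]

# Each principal submatrix of a positive definite matrix is itself
# positive definite, so each one has only positive eigenvalues.
for r in range(1, A.shape[0] + 1):
    eigs = np.linalg.eigvalsh(principal_submatrix(A, r))
    assert np.all(eigs > 0)
```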
If $A$ is positive definite, Lemma lem:024890 and Theorem thm:024815 show that $\det \left( {}^{(r)}A \right) > 0$ for every $r$. This proves part of the following theorem, which contains the converse to Example exa:024865 and characterizes the positive definite matrices among the symmetric ones. For a symmetric $n \times n$ matrix $A$, the following conditions are equivalent:
- (a)
- $A$ is positive definite.
- (b)
- $\det \left( {}^{(r)}A \right) > 0$ for each $r = 1, 2, \dots, n$.
- (c)
- $A = U^TU$ where $U$ is an upper triangular matrix with positive entries on the main diagonal.
Furthermore, the factorization in thm:024907c is unique (called the Cholesky factorization of $A$).
- Proof
- First, thm:024907c $\Rightarrow$ thm:024907a by Example exa:024865, and thm:024907a $\Rightarrow$ thm:024907b by Lemma lem:024890 and Theorem thm:024815.
thm:024907b $\Rightarrow$ thm:024907c. Assume thm:024907b and proceed by induction on $n$. If $n = 1$, then $A = \left[ a \right]$ where $a > 0$ by thm:024907b, so take $U = \left[ \sqrt{a} \right]$. If $n > 1$, write $B = {}^{(n-1)}A$. Then $B$ is symmetric and satisfies thm:024907b so, by induction, we have $B = U^TU$ as in thm:024907c where $U$ is of size $(n-1) \times (n-1)$. Then, as $A$ is symmetric, it has block form $A = \left[ \begin{array}{cc} B & \vec{p} \\ \vec{p}^T & b \end{array}\right]$ where $\vec{p}$ is a column in $\mathbb{R}^{n-1}$ and $b$ is in $\mathbb{R}$. If we write $\vec{x} = (U^T)^{-1}\vec{p}$ and $c = b - \vec{x}^T\vec{x}$, block multiplication gives \begin{equation*} A = \left [ \begin{array}{cc} U^TU & \vec{p} \\ \vec{p}^T & b \end{array}\right ] = \left [ \begin{array}{cc} U^T & 0 \\ \vec{x}^T & 1 \end{array}\right ] \left [ \begin{array}{cc} U & \vec{x} \\ 0 & c \end{array}\right ]. \end{equation*} Taking determinants gives $\det A = \det \left( U^T \right) \det \left( U \right) c = c \left( \det U \right)^2$. Hence $c > 0$ because $\det A > 0$ by thm:024907b, so the above factorization can be written \begin{equation*} A = \left [ \begin{array}{cc} U^T & 0 \\ \vec{x}^T & \sqrt{c} \end{array}\right ] \left [ \begin{array}{cc} U & \vec{x} \\ 0 & \sqrt{c} \end{array}\right ] \end{equation*} Since $\left[ \begin{array}{cc} U & \vec{x} \\ 0 & \sqrt{c} \end{array}\right]$ has positive diagonal entries, this proves thm:024907c.
To prove uniqueness, suppose that $A = U^TU = U_1^TU_1$ are two Cholesky factorizations. Now write $D = UU_1^{-1} = (U^T)^{-1}U_1^T$. Then $D$ is upper triangular, because $D = UU_1^{-1}$, and lower triangular, because $D = (U^T)^{-1}U_1^T$, and so it is a diagonal matrix. Thus $U = DU_1$ and $U_1 = DU$, so it suffices to show that $D = I$. But eliminating $U_1$ gives $U = D^2U$, so $D^2 = I$ because $U$ is invertible. Since the diagonal entries of $D$ are positive (this is true of $U$ and $U_1$), it follows that $D = I$.
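The three conditions of the theorem can be checked side by side numerically (a NumPy sketch; the test matrix is an arbitrary choice, and note that `np.linalg.cholesky` returns the lower triangular factor $L$ with $A = LL^T$, so $U = L^T$):

```python
import numpy as np

A = np.array([[10.0, 5.0, 2.0],
              [ 5.0, 3.0, 2.0],
              [ 2.0, 2.0, 3.0]])   # symmetric (illustrative choice)

# (a) all eigenvalues are positive
eigs = np.linalg.eigvalsh(A)

# (b) det(^(r)A) > 0 for r = 1, ..., n
minors = [np.linalg.det(A[:r, :r]) for r in range(1, A.shape[0] + 1)]

# (c) A = U^T U with U upper triangular and positive diagonal;
# np.linalg.cholesky gives the lower triangular factor, so transpose it.
U = np.linalg.cholesky(A).T

assert np.all(eigs > 0)                 # condition (a)
assert all(m > 0 for m in minors)       # condition (b): minors are 10, 5, 3
assert np.allclose(U.T @ U, A)          # condition (c)
assert np.all(np.diag(U) > 0)
```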
The remarkable thing is that the matrix $U$ in the Cholesky factorization is easy to obtain from $A$ using row operations. The key is that Step 1 of the following algorithm is possible for any positive definite matrix $A$. A proof of the algorithm is given following Example exa:024959.
- Carry $A$ to an upper triangular matrix $U_1$ with positive diagonal entries using row operations, each of which adds a multiple of a row to a lower row.
- Obtain $U$ from $U_1$ by dividing each row of $U_1$ by the square root of the diagonal entry in that row.
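The two steps translate directly into code (a sketch assuming NumPy; the function name and the test matrix are illustrative, not from the text):

```python
import numpy as np

def cholesky_by_row_ops(A):
    """Cholesky factor U with A = U^T U, via the two-step algorithm."""
    U1 = A.astype(float).copy()
    n = U1.shape[0]
    # Step 1: carry A to upper triangular form using only row operations
    # that add a multiple of a row to a LOWER row (possible when A is
    # positive definite, so the pivots stay positive).
    for j in range(n - 1):
        for i in range(j + 1, n):
            U1[i] -= (U1[i, j] / U1[j, j]) * U1[j]
    # Step 2: divide each row by the square root of its diagonal entry.
    return U1 / np.sqrt(np.diag(U1))[:, None]

A = np.array([[10.0, 5.0, 2.0],
              [ 5.0, 3.0, 2.0],
              [ 2.0, 2.0, 3.0]])   # positive definite (illustrative)
U = cholesky_by_row_ops(A)

assert np.allclose(U.T @ U, A)     # A = U^T U
assert np.allclose(U, np.triu(U))  # U is upper triangular
```

By the uniqueness of the Cholesky factorization, the result agrees with the factor a library routine would produce (`np.linalg.cholesky(A).T`).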
The reader can verify that $U^TU = A$.
- Proof of the Cholesky Algorithm
- If $A$ is positive definite, let $A = U^TU$ be the Cholesky factorization, and let $D$ be the common diagonal of $U$ and $U^T$. Then $U^TD^{-1}$ is lower triangular with ones on the diagonal (call such matrices LT-1). Hence $L = (U^TD^{-1})^{-1}$ is also LT-1, and so $L \to I_n$ by a sequence of row operations each of which adds a multiple of a row to a lower row (modify columns right to left). But then $A \to LA$ by the same sequence of row operations. Since $LA = D(U^T)^{-1}U^TU = DU$ is upper triangular with positive entries on the diagonal, this shows that Step 1 of the algorithm is possible.
Turning to Step 2, let $A \to U_1$ as in Step 1, so that $U_1 = L_1A$ where $L_1$ is LT-1. Since $A$ is symmetric, we get \begin{equation} \label{symmetricEq2} L_{1}U_{1}^T = L_{1}(L_{1}A)^T = L_{1}A^TL_{1}^T = L_{1}AL_{1}^T = U_{1}L_{1}^T \end{equation} Let $D_1$ denote the diagonal of $U_1$. Then (symmetricEq2) gives $L_1(U_1^TD_1^{-1}) = U_1L_1^TD_1^{-1}$. This is both upper triangular (right side) and LT-1 (left side), and so must equal $I_n$. In particular, $U_1^TD_1^{-1} = L_1^{-1}$. Now let $D_2 = \sqrt{D_1}$, so that $D_2^2 = D_1$. If we write $U = D_2^{-1}U_1$ we have \begin{equation*} U^TU = (U_{1}^TD_{2}^{-1})(D_{2}^{-1}U_{1}) = U_{1}^T(D_{2}^2)^{-1}U_{1} = (U_{1}^TD_{1}^{-1})U_{1} = (L_{1}^{-1})U_{1} = A \end{equation*} This proves Step 2 because $U = D_2^{-1}U_1$ is formed by dividing each row of $U_1$ by the square root of its diagonal entry (verify).
Practice Problems
- a.
- If $A$ is positive definite, show that $A^k$ is positive definite for all $k \geq 1$.
- b.
- Prove the converse to (a) when $k$ is odd.
- c.
- Find a symmetric matrix $A$ such that $A^2$ is positive definite but $A$ is not.
Click the arrow to see the answer.
If $A\vec{x} = \lambda\vec{x}$, then $A^k\vec{x} = \lambda^k\vec{x}$ and $\lambda^k$ is an eigenvalue of $A^k$. Hence $\lambda^k > 0$ and $\lambda > 0$, as $k$ is odd.
Click the arrow for answer.
Let $A = PDP^T$ where $P^T = P^{-1}$ and $D = \text{diag}(\lambda_1, \dots, \lambda_n)$. Since $A$ is positive definite, each eigenvalue $\lambda_i > 0$. If $B = \text{diag}(\sqrt{\lambda_1}, \dots, \sqrt{\lambda_n})$ then $B^2 = D$, so $(PBP^T)^2 = PB^2P^T = A$. Take $C = PBP^T$. Since $C$ has eigenvalues $\sqrt{\lambda_i} > 0$, it is positive definite.
- (a)
- Suppose an invertible $n \times n$ matrix $A$ can be factored as $A = LDU$ where $L$ is lower triangular with $1$s on the diagonal, $U$ is upper triangular with $1$s on the diagonal, and $D$ is diagonal with positive diagonal entries. Show that the factorization is unique: If $A = L_1D_1U_1$ is another such factorization, show that $L_1 = L$, $D_1 = D$, and $U_1 = U$.
- (b)
- Show that a matrix $A$ is positive definite if and only if $A$ is symmetric and admits a factorization $A = LDU$ as in (a).
Click the arrow for answer.
If $A$ is positive definite, use Theorem thm:024907 to write $A = U^TU$ where $U$ is upper triangular with positive diagonal $D_0$. Then $A = (U^TD_0^{-1})D_0^2(D_0^{-1}U)$, so $A = LDU_1$ is such a factorization if $L = U^TD_0^{-1}$, $D = D_0^2$, and $U_1 = D_0^{-1}U$. Conversely, let $A = LDU$ be such a factorization. Then $A = A^T = U^TD^TL^T = U^TDL^T$, so $L = U^T$ by (a). Hence $A = U^TDU = V^TV$ where $V = D_0U$ and $D_0$ is diagonal with $D_0^2 = D$ (the matrix $D_0$ exists because $D$ has positive diagonal entries). Hence $A = V^TV$ is symmetric, and it is positive definite by Example exa:024865.
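The first direction of this argument can be traced numerically (a NumPy sketch; the $2 \times 2$ matrix is an arbitrary positive definite choice): starting from $A = U^TU$, rescaling by the diagonal $D_0$ of $U$ yields the $LDU$ form.

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])         # positive definite (illustrative)

U = np.linalg.cholesky(A).T        # A = U^T U, U upper triangular
d0 = np.diag(U)                    # positive diagonal of U

L = U.T / d0                       # U^T D0^{-1}: lower, 1s on the diagonal
D = np.diag(d0 ** 2)               # D0^2: diagonal, positive entries
U_unit = U / d0[:, None]           # D0^{-1} U: upper, 1s on the diagonal

assert np.allclose(L @ D @ U_unit, A)  # the LDU factorization of (a)
assert np.allclose(U_unit, L.T)        # symmetry of A forces U = L^T
```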
Text Source
This section was adapted from Section 8.3 of Keith Nicholson’s Linear Algebra with Applications. (CC-BY-NC-SA)
W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition, pp. 433-437.