Inner Product Spaces

We have used the dot product in $\RR^n$ to compute the length of vectors (Corollary cor:length_via_dotprod) and also the angle between vectors. The goal of this section is to define an inner product on an arbitrary vector space over the real numbers. The dot product is an example of an inner product for $\RR^n$.

A real vector space with an inner product will be called an inner product space. Note that every subspace of an inner product space is again an inner product space using the same inner product.

The next example is important in analysis.

If $\vec{v}$ is any vector, then, using Property prop:inner_prod_3 of Definition def:innerproductspace, we get \begin{equation*} \langle \vec{0}, \vec{v} \rangle = \langle \vec{0} + \vec{0}, \vec{v} \rangle = \langle \vec{0}, \vec{v} \rangle + \langle \vec{0}, \vec{v} \rangle \end{equation*} and it follows that the number $\langle \vec{0}, \vec{v} \rangle$ must be zero. This observation is recorded for reference in the following theorem, along with several other properties of inner products. The other proofs are left as Practice Problem ex:10_1_20.

If $\langle\ ,\ \rangle$ is an inner product on a space $V$, then, given $\vec{u}$, $\vec{v}$, and $\vec{w}$ in $V$, \begin{equation*} \langle r\vec{u} + s\vec{v}, \vec{w} \rangle = \langle r\vec{u}, \vec{w} \rangle + \langle s\vec{v}, \vec{w} \rangle = r\langle \vec{u}, \vec{w} \rangle + s\langle \vec{v}, \vec{w} \rangle \end{equation*} for all $r$ and $s$ in $\RR$ by Property prop:inner_prod_3 and Property prop:inner_prod_4 of Definition def:innerproductspace. Moreover, there is nothing special about the fact that there are two terms in the linear combination or that it is in the first component: \begin{equation*} \langle r_1\vec{v}_1 + r_2\vec{v}_2 + \dots + r_n\vec{v}_n, \vec{w} \rangle = r_1\langle \vec{v}_1, \vec{w} \rangle + r_2\langle \vec{v}_2, \vec{w} \rangle + \dots + r_n\langle \vec{v}_n, \vec{w} \rangle \end{equation*} and \begin{equation*} \langle \vec{v}, s_1\vec{w}_1 + s_2\vec{w}_2 + \dots + s_m\vec{w}_m \rangle = s_1\langle \vec{v}, \vec{w}_1 \rangle + s_2\langle \vec{v}, \vec{w}_2 \rangle + \dots + s_m\langle \vec{v}, \vec{w}_m \rangle \end{equation*} hold for all vectors $\vec{v}$, $\vec{v}_i$, $\vec{w}$, and $\vec{w}_j$ in $V$ and all scalars $r_i$ and $s_j$ in $\RR$. These results are described by saying that inner products “preserve” linear combinations. For example, \begin{align*} \langle 2\vec{u} - \vec{v}, 3\vec{u} + 2\vec{v} \rangle &= \langle 2\vec{u}, 3\vec{u} \rangle + \langle 2\vec{u}, 2\vec{v} \rangle + \langle -\vec{v}, 3\vec{u} \rangle + \langle -\vec{v}, 2\vec{v} \rangle \\ &= 6 \langle \vec{u}, \vec{u} \rangle + 4 \langle \vec{u}, \vec{v} \rangle -3 \langle \vec{v}, \vec{u} \rangle - 2 \langle \vec{v}, \vec{v} \rangle \\ &= 6 \langle \vec{u}, \vec{u} \rangle + \langle \vec{u}, \vec{v} \rangle - 2 \langle \vec{v}, \vec{v} \rangle \end{align*}
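The expansion above can be checked numerically. The sketch below is an assumption-laden illustration: it uses the ordinary dot product on $\RR^3$ as the inner product and made-up test vectors, and verifies that $\langle 2\vec{u} - \vec{v}, 3\vec{u} + 2\vec{v} \rangle = 6\langle \vec{u}, \vec{u} \rangle + \langle \vec{u}, \vec{v} \rangle - 2\langle \vec{v}, \vec{v} \rangle$.

```python
# Check the expansion <2u - v, 3u + 2v> = 6<u,u> + <u,v> - 2<v,v>
# using the dot product on R^3 (a hypothetical choice of inner product).

def inner(x, y):
    """The dot product, the prototypical inner product on R^n."""
    return sum(a * b for a, b in zip(x, y))

u = [1.0, 2.0, -1.0]   # made-up test vectors
v = [3.0, 0.0, 2.0]

lhs = inner([2*a - b for a, b in zip(u, v)],
            [3*a + 2*b for a, b in zip(u, v)])
rhs = 6*inner(u, u) + inner(u, v) - 2*inner(v, v)
print(abs(lhs - rhs) < 1e-9)  # True; simplifying 4<u,v> - 3<v,u> uses symmetry
```

Note that collapsing $4\langle \vec{u}, \vec{v} \rangle - 3\langle \vec{v}, \vec{u} \rangle$ to $\langle \vec{u}, \vec{v} \rangle$ relies on the symmetry axiom, which the dot product satisfies.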

If $A$ is an $n \times n$ symmetric matrix and $\vec{x}$ and $\vec{y}$ are columns in $\RR^n$, we regard the $1 \times 1$ matrix $\vec{x}^TA\vec{y}$ as a number. If we write \begin{equation*} \langle \vec{x}, \vec{y} \rangle = \vec{x}^TA\vec{y} \quad \mbox{ for all columns } \vec{x}, \vec{y} \mbox{ in } \RR ^n \end{equation*} then Properties prop:inner_prod_1–prop:inner_prod_4 of Definition def:innerproductspace follow from matrix arithmetic (only Property prop:inner_prod_2 of Definition def:innerproductspace requires that $A$ is symmetric). Property prop:inner_prod_5 of Definition def:innerproductspace reads \begin{equation*} \vec{x}^TA \vec{x} > 0 \quad \mbox{ for all columns } \vec{x} \neq \vec{0} \mbox{ in } \RR ^n \end{equation*} and this condition characterizes the positive definite matrices (Theorem thm:024830). This proves the first assertion in the next theorem.
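As a quick sanity check, the sketch below implements $\langle \vec{x}, \vec{y} \rangle = \vec{x}^TA\vec{y}$ for a made-up symmetric positive definite matrix $A$ on $\RR^2$ and confirms the symmetry and positivity conditions on sample columns.

```python
# Inner product <x, y> = x^T A y for a symmetric positive definite A.
# The matrix A and the columns x, y are made-up examples.

A = [[2.0, 1.0],
     [1.0, 2.0]]  # symmetric; x^T A x = x1^2 + x2^2 + (x1 + x2)^2 > 0 for x != 0

def inner(x, y):
    """Compute the 1x1 matrix x^T A y, regarded as a number."""
    return sum(x[i] * A[i][j] * y[j] for i in range(2) for j in range(2))

x, y = [1.0, -1.0], [3.0, 2.0]
print(inner(x, y) == inner(y, x))  # True: symmetry holds because A = A^T
print(inner(x, x) > 0)             # True: positivity for this nonzero x
```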

Proof
Given an inner product $\langle\ ,\ \rangle$ on $\RR^n$, let $\{\vec{e}_1, \vec{e}_2, \dots, \vec{e}_n\}$ be the standard basis of $\RR^n$. If $\vec{x} = x_1\vec{e}_1 + x_2\vec{e}_2 + \dots + x_n\vec{e}_n$ and $\vec{y} = y_1\vec{e}_1 + y_2\vec{e}_2 + \dots + y_n\vec{e}_n$ are two vectors in $\RR^n$, compute $\langle \vec{x}, \vec{y} \rangle$ by adding the inner product of each term $x_i\vec{e}_i$ to each term $y_j\vec{e}_j$. The result is a double sum. \begin{equation*} \langle \vec{x}, \vec{y} \rangle = \displaystyle \sum _{i = 1}^{n} \sum _{j = 1}^{n} \langle x_i \vec{e}_i, y_j\vec{e}_j \rangle = \displaystyle \sum _{i = 1}^{n} \sum _{j = 1}^{n} x_i \langle \vec{e}_i, \vec{e}_j \rangle y_j \end{equation*} As the reader can verify, this is a matrix product: \begin{equation*} \langle \vec{x}, \vec{y} \rangle = \left [ \begin{array}{cccc} x_1 & x_2 & \cdots & x_n \\ \end{array} \right ] \left [ \begin{array}{cccc} \langle \vec{e}_1, \vec{e}_1 \rangle & \langle \vec{e}_1, \vec{e}_2 \rangle & \cdots & \langle \vec{e}_1, \vec{e}_n \rangle \\ \langle \vec{e}_2, \vec{e}_1 \rangle & \langle \vec{e}_2, \vec{e}_2 \rangle & \cdots & \langle \vec{e}_2, \vec{e}_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle \vec{e}_n, \vec{e}_1 \rangle & \langle \vec{e}_n, \vec{e}_2 \rangle & \cdots & \langle \vec{e}_n, \vec{e}_n \rangle \\ \end{array} \right ] \left [ \begin{array}{c} y_1 \\ y_2 \\ \vdots \\ y_n \end{array} \right ] \end{equation*} Hence $\langle \vec{x}, \vec{y} \rangle = \vec{x}^TA\vec{y}$, where $A$ is the matrix whose $(i, j)$-entry is $\langle \vec{e}_i, \vec{e}_j \rangle$. The fact that \begin{equation*} \langle \vec{e}_{i}, \vec{e}_{j}\rangle = \langle \vec{e}_{j}, \vec{e}_{i}\rangle \end{equation*} shows that $A$ is symmetric. Finally, $A$ is positive definite by Theorem thm:024830.

Thus, just as every linear operator $\RR^n \to \RR^n$ corresponds to an $n \times n$ matrix, every inner product on $\RR^n$ corresponds to an $n \times n$ positive definite matrix. In particular, the dot product corresponds to the identity matrix $I_n$.
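The proof's construction can be imitated numerically: starting from any inner product on $\RR^n$, build the matrix $A$ with $(i, j)$-entry $\langle \vec{e}_i, \vec{e}_j \rangle$ and check that $\langle \vec{x}, \vec{y} \rangle = \vec{x}^TA\vec{y}$. The weighted inner product in the sketch below is a made-up example chosen only so the check is nontrivial.

```python
# Recover the matrix of an inner product from its values on the
# standard basis, as in the proof above. The weighted inner product
# ip is a hypothetical example (its matrix is diag(1, 2, 3)).

n = 3

def ip(x, y):
    # hypothetical inner product: <x, y> = 1*x1*y1 + 2*x2*y2 + 3*x3*y3
    return sum((i + 1) * x[i] * y[i] for i in range(n))

e = [[1.0 if j == i else 0.0 for j in range(n)] for i in range(n)]
A = [[ip(e[i], e[j]) for j in range(n)] for i in range(n)]  # (i,j)-entry <e_i, e_j>

def xAy(x, y):
    """Compute x^T A y as a double sum."""
    return sum(x[i] * A[i][j] * y[j] for i in range(n) for j in range(n))

x, y = [1.0, 2.0, -1.0], [0.0, 1.0, 4.0]
print(abs(ip(x, y) - xAy(x, y)) < 1e-9)                           # True
print(all(A[i][j] == A[j][i] for i in range(n) for j in range(n)))  # True: A symmetric
```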

Let $\langle\ ,\ \rangle$ be an inner product on $\RR^n$ given as in Theorem thm:030372 by a positive definite matrix $A$. If $\vec{x} = \left[ \begin{array}{cccc} x_1 & x_2 & \cdots & x_n \end{array} \right]^T$, then $\langle \vec{x}, \vec{x} \rangle = \vec{x}^TA\vec{x}$ is an expression in the variables $x_1, x_2, \dots, x_n$ called a quadratic form. For more on quadratic forms, see Section 8.8 of [Nicholson], pp. 472–482.

Norm and Distance

Note that Property prop:inner_prod_5 of Definition def:innerproductspace guarantees that $\langle \vec{v}, \vec{v} \rangle \geq 0$, so $\norm{\vec{v}} = \sqrt{\langle \vec{v}, \vec{v} \rangle}$ is a real number.

A vector $\vec{v}$ in an inner product space $V$ is called a unit vector if $\norm{\vec{v}} = 1$. The set of all unit vectors in $V$ is called the unit ball in $V$. For example, if $V = \RR^2$ (with the dot product) and $\vec{v} = (x, y)$, then \begin{equation*} \norm{ \vec{v} }^2 = 1 \quad \mbox{ if and only if } \quad x^2 + y^2 = 1 \end{equation*} Hence the unit ball in $\RR^2$ is the unit circle with centre at the origin and radius $1$. However, the shape of the unit ball varies with the choice of inner product.
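To illustrate, the sketch below uses a made-up weighted inner product on $\RR^2$ (a hypothetical example, not one from this section) whose unit ball is an ellipse rather than the unit circle:

```python
import math

# The unit ball depends on the inner product. The weighted inner
# product here is a hypothetical example; its unit ball is the
# ellipse 4x^2 + y^2 = 1 rather than the circle x^2 + y^2 = 1.

def dot(x, y):
    return x[0]*y[0] + x[1]*y[1]

def weighted(x, y):
    return 4*x[0]*y[0] + x[1]*y[1]   # made-up inner product on R^2

v = [0.0, 1.0]   # lies on the unit circle x^2 + y^2 = 1
u = [0.5, 0.0]   # lies on the ellipse 4x^2 + y^2 = 1
print(math.sqrt(dot(v, v)) == 1.0)        # True: unit vector for the dot product
print(math.sqrt(weighted(u, u)) == 1.0)   # True: unit vector for the weighted product
print(math.sqrt(dot(u, u)) == 1.0)        # False: u has dot-product norm 1/2
```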

Example exa:030469 graphically illustrates the fact that norms and distances in an inner product space vary with the choice of inner product.

The next theorem reveals an important and useful fact about the relationship between norms and inner products, extending the Cauchy inequality for $\RR^n$ (Theorem th:CS).

Proof of Cauchy-Schwarz Inequality
Write $a = \norm{\vec{v}}$ and $b = \norm{\vec{w}}$. Using Theorem thm:030346 we compute: \begin{equation} \label{eq:thm10_1_4} \begin{split} \norm{ b\vec{v} - a \vec{w} }^2 &= b^2 \norm{ \vec{v} }^2 - 2ab \langle \vec{v}, \vec{w} \rangle + a^2\norm{ \vec{w} }^2 = 2ab(ab - \langle \vec{v}, \vec{w} \rangle ) \\ \norm{ b\vec{v} + a \vec{w} }^2 &= b^2 \norm{ \vec{v} }^2 + 2ab \langle \vec{v}, \vec{w} \rangle + a^2\norm{ \vec{w} }^2 = 2ab(ab + \langle \vec{v}, \vec{w} \rangle ) \\ \end{split} \end{equation} It follows that $ab - \langle \vec{v}, \vec{w} \rangle \geq 0$ and $ab + \langle \vec{v}, \vec{w} \rangle \geq 0$ (this is clear if $ab = 0$, for then $\vec{v} = \vec{0}$ or $\vec{w} = \vec{0}$, so $\langle \vec{v}, \vec{w} \rangle = 0$), and hence that $-ab \leq \langle \vec{v}, \vec{w} \rangle \leq ab$. But then $|\langle \vec{v}, \vec{w} \rangle| \leq ab = \norm{\vec{v}}\,\norm{\vec{w}}$, as desired.

Conversely, if $|\langle \vec{v}, \vec{w} \rangle| = \norm{\vec{v}}\,\norm{\vec{w}}$ then $\langle \vec{v}, \vec{w} \rangle = \pm ab$. Hence (eq:thm10_1_4) shows that $b\vec{v} - a\vec{w} = \vec{0}$ or $b\vec{v} + a\vec{w} = \vec{0}$. It follows that one of $\vec{v}$ and $\vec{w}$ is a scalar multiple of the other, even if $a = 0$ or $b = 0$.
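A numeric check of the inequality, with made-up vectors and the dot product standing in for the inner product, is straightforward; it also exhibits the equality case when one vector is a scalar multiple of the other.

```python
import math

# Check the Cauchy-Schwarz inequality |<v, w>| <= ||v|| ||w|| on
# sample vectors (dot product on R^3; vectors are made-up examples).

def inner(x, y):
    return sum(a * b for a, b in zip(x, y))

def norm(x):
    return math.sqrt(inner(x, x))

v = [1.0, -2.0, 2.0]
w = [3.0, 0.0, -4.0]
print(abs(inner(v, w)) <= norm(v) * norm(w))  # True: |−5| <= 3 * 5

w2 = [2*a for a in v]  # a scalar multiple of v forces equality
print(abs(abs(inner(v, w2)) - norm(v) * norm(w2)) < 1e-9)  # True
```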

Another famous inequality, the so-called triangle inequality (See Triangle Inequality in the Appendix), also comes from the Cauchy-Schwarz inequality. It is included in the following list of basic properties of the norm of a vector.

Proof
Because $\norm{\vec{v}} = \sqrt{\langle \vec{v}, \vec{v} \rangle}$, properties thm:030504a and thm:030504b follow immediately from thm:030346c and thm:030346d of Theorem thm:030346. As to thm:030504c, compute \begin{equation*} \norm{ r\vec{v} } ^2 = \langle r\vec{v}, r\vec{v} \rangle = r^2\langle \vec{v}, \vec{v} \rangle = r^2\norm{ \vec{v} }^2 \end{equation*} Hence thm:030504c follows by taking positive square roots. Finally, the fact that $\langle \vec{v}, \vec{w} \rangle \leq \norm{\vec{v}}\,\norm{\vec{w}}$ by the Cauchy-Schwarz inequality gives \begin{align*} \norm{ \vec{v} + \vec{w} } ^2 = \langle \vec{v} + \vec{w}, \vec{v} + \vec{w} \rangle &= \norm{ \vec{v} } ^2 + 2 \langle \vec{v}, \vec{w} \rangle + \norm{ \vec{w} } ^2 \\ &\leq \norm{ \vec{v} } ^2 + 2 \norm{ \vec{v} } \norm{ \vec{w} } + \norm{ \vec{w} } ^2 \\ &= (\norm{ \vec{v} } + \norm{ \vec{w} })^2 \end{align*}

Hence thm:030504d follows by taking positive square roots.

It is worth noting that the usual triangle inequality for absolute values, \begin{equation*} | r + s | \leq |r| + |s| \mbox{ for all real numbers } r \mbox{ and } s \end{equation*} is a special case of thm:030504d where $V = \RR$ and the dot product is used.
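The sketch below, with made-up vectors and the dot product, checks the triangle inequality $\norm{\vec{v} + \vec{w}} \leq \norm{\vec{v}} + \norm{\vec{w}}$ together with its one-dimensional absolute-value special case:

```python
import math

# Check the triangle inequality ||v + w|| <= ||v|| + ||w|| for sample
# vectors (dot product on R^3), and the special case |r + s| <= |r| + |s|
# obtained by taking V = R with the dot product.

def norm(x):
    return math.sqrt(sum(a * a for a in x))

v = [1.0, 2.0, 2.0]   # made-up vectors: ||v|| = 3, ||w|| = 5
w = [4.0, 0.0, 3.0]
s = [a + b for a, b in zip(v, w)]
print(norm(s) <= norm(v) + norm(w))    # True: sqrt(54) <= 8

r, t = -3.0, 5.0
print(abs(r + t) <= abs(r) + abs(t))   # True: the 1-dimensional special case
```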

In many calculations in an inner product space, it is required to show that some vector is zero. This is often accomplished most easily by showing that its norm is zero. Here is an example.

The norm properties in Theorem thm:030504 translate to the following properties of distance familiar from geometry. The proof is Practice Problem ex:10_1_21.

Practice Problems

In each case, determine which of Property prop:inner_prod_1–Property prop:inner_prod_5 in Definition def:innerproductspace fail to hold.
(a)
,
(b)
,

Click the arrow to see the answer.

Property prop:inner_prod_5 fails.

(c)
, , where is complex conjugation

Click the arrow to see the answer.

Property prop:inner_prod_1 fails, as sometimes we get a complex number. However, we will return to this definition in Complex Matrices – see Definition def:025549.

(d)
,

Click the arrow to see the answer.

Property prop:inner_prod_5 fails.

(e)
,
(f)
,

Click the arrow to see the answer.

Property prop:inner_prod_5 fails.

Let $V$ be an inner product space. If $U$ is a subspace, show that $U$ is an inner product space using the same inner product.
Property prop:inner_prod_1–Property prop:inner_prod_5 hold in $U$ because they hold in $V$.
In each case, find a scalar multiple of that is a unit vector.
(a)
in where
(b)
in where

Click the arrow to see the answer.

(c)
in where
(d)
in ,

Click the arrow to see the answer.

In each case, find the distance between and .
(a)
,
(b)
,

Click the arrow to see the answer.

(c)
, in where and ;
(d)
, in where and ;

Click the arrow to see the answer.

Let be positive numbers. Given and , define . Show that this is an inner product on .
If $\{\vec{b}_1, \dots, \vec{b}_n\}$ is a basis of $V$ and if $\vec{v} = v_1\vec{b}_1 + \dots + v_n\vec{b}_n$ and $\vec{w} = w_1\vec{b}_1 + \dots + w_n\vec{b}_n$ are vectors in $V$, define \begin{equation*} \langle \vec{v}, \vec{w} \rangle = v_1w_1 + \dots + v_nw_n . \end{equation*} Show that this is an inner product on $V$.
Let denote the real part of the complex number . Show that is an inner product on if .
If $T : V \to V$ is an isomorphism of the inner product space $V$, show that \begin{equation*} \langle \vec{v}, \vec{w} \rangle _1 = \langle T(\vec{v}), T(\vec{w}) \rangle \end{equation*} defines a new inner product on $V$.
Show that every inner product on $\RR^n$ has the form $\langle \vec{x}, \vec{y} \rangle = (U\vec{x}) \cdot (U\vec{y})$ for some upper triangular matrix $U$ with positive diagonal entries.
Theorem thm:024907
In each case, show that $\langle \vec{x}, \vec{y} \rangle = \vec{x}^TA\vec{y}$ defines an inner product and hence show that $A$ is positive definite.
(a)
(b)

Click the arrow to see the answer.

(c)
(d)

Click the arrow to see the answer.

In each case, find a symmetric matrix $A$ such that $\langle \vec{x}, \vec{y} \rangle = \vec{x}^TA\vec{y}$.
(a)
(b)

Click the arrow to see the answer.

(c)
(d)

Click the arrow to see the answer.

If $A$ is symmetric and $\vec{x}^TA\vec{x} = 0$ for all columns $\vec{x}$ in $\RR^n$, show that $A = 0$.
Consider where .

Click the arrow to see the answer.

By the condition, for all , . Let denote column of . If , then for all and .

Show that the sum of two inner products on is again an inner product.
Let , , , , and . Compute:
(a)
(b)

Given the data in Practice Problem ex:10_1_16, show that .
Show that no vectors exist such that , , and .
Complete Example exa:030310.
Prove Theorem thm:030346.
(a)
Using Property prop:inner_prod_2: .
(b)
Using Property prop:inner_prod_2 and Property prop:inner_prod_4: .
(c)
Using Property prop:inner_prod_3: $\langle \vec{0}, \vec{v} \rangle = \langle \vec{0} + \vec{0}, \vec{v} \rangle = \langle \vec{0}, \vec{v} \rangle + \langle \vec{0}, \vec{v} \rangle$, so $\langle \vec{0}, \vec{v} \rangle = 0$. The rest is Property prop:inner_prod_2.
(d)
Assume that $\norm{\vec{v}} = 0$. If $\vec{v} \neq \vec{0}$ this contradicts Property prop:inner_prod_5, so $\vec{v} = \vec{0}$. Conversely, if $\vec{v} = \vec{0}$, then $\norm{\vec{v}} = 0$ by Part 3 of this theorem.
Prove Theorem thm:030346.
Let and be vectors in an inner product space .
(a)
Expand .
(b)
Expand .

Click the arrow to see the answer.

(c)
Show that .
(d)
Show that .

Click the arrow to see the answer.

Show that \begin{equation*} \norm{ \vec{v} } ^2 + \norm{ \vec{w} } ^2 = \frac{1}{2} \{ \norm{ \vec{v} + \vec{w} } ^2 + \norm{ \vec{v} - \vec{w} } ^2\} \end{equation*} for any and in an inner product space.
Let $\langle\ ,\ \rangle$ be an inner product on a vector space $V$. Show that the corresponding distance function is translation invariant. That is, show that
$d(\vec{v}, \vec{w}) = d(\vec{v} + \vec{u}, \vec{w} + \vec{u})$ for all $\vec{u}$, $\vec{v}$, and $\vec{w}$ in $V$.
(a)
Show that $\langle \vec{v}, \vec{w} \rangle = \frac{1}{4}\left[ \norm{ \vec{v} + \vec{w} }^2 - \norm{ \vec{v} - \vec{w} }^2 \right]$ for all $\vec{v}$, $\vec{w}$ in an inner product space $V$.
(b)
If $\langle\ ,\ \rangle_1$ and $\langle\ ,\ \rangle_2$ are two inner products on $V$ that have equal associated norm functions, show that $\langle \vec{v}, \vec{w} \rangle_1 = \langle \vec{v}, \vec{w} \rangle_2$ holds for all $\vec{v}$ and $\vec{w}$.
Let $\vec{v}$ denote a vector in an inner product space $V$.
(a)
Show that $U = \{ \vec{w} \mbox{ in } V \mid \langle \vec{v}, \vec{w} \rangle = 0 \}$ is a subspace of $V$.
(b)
Let be as in (a). If with the dot product, and if , find a basis for .

Click the arrow to see the answer.

Given vectors and , assume that for each . Show that for all in .
If and holds for each . Show that .
for each , so by Practice Problem ex:10_1_27.
Use the Cauchy-Schwarz inequality in an inner product space to show that:
(a)
If , then for all in .
(b)
for all real , , and .
If in (with the dot product) then . Use (a) with .
(c)
for all vectors , and all in .
If is a matrix, let and denote the rows of .
(a)
Show that .
(b)
Show that .
(a)
If $\vec{v}$ and $\vec{w}$ are nonzero vectors in an inner product space $V$, show that $-1 \leq \frac{\langle \vec{v}, \vec{w} \rangle}{\norm{\vec{v}}\norm{\vec{w}}} \leq 1$, and hence that a unique angle $\theta$ exists such that
$\cos \theta = \frac{\langle \vec{v}, \vec{w} \rangle}{\norm{\vec{v}}\norm{\vec{w}}}$ and $0 \leq \theta \leq \pi$. This angle is called the angle between $\vec{v}$ and $\vec{w}$.
(b)
Find the angle between and in with the dot product.
(c)
If $\theta$ is the angle between $\vec{v}$ and $\vec{w}$, show that the law of cosines is valid: \begin{equation*} \norm{ \vec{v} - \vec{w} }^2 = \norm{ \vec{v} } ^2 + \norm{ \vec{w} } ^2 - 2\norm{ \vec{v} } \norm{ \vec{w} } \cos \theta . \end{equation*}
If , define .
(a)
Show that satisfies the conditions in Theorem thm:030504.
(b)
Show that does not arise from an inner product on given by a matrix .
If it did, use Theorem thm:030372 to find numbers , , and such that for all and .

Text Source

This section was adapted from Section 10.1 of Keith Nicholson’s Linear Algebra with Applications. (CC-BY-NC-SA)

W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition, pp. 521–530.
