Orthogonality and Projections
Orthogonal and Orthonormal Sets
In this section, we examine what it means for vectors (and sets of vectors) to be orthogonal and orthonormal. Recall that two non-zero vectors are orthogonal if their dot product is zero. A collection of non-zero vectors in $\mathbb{R}^n$ is called orthogonal if the vectors are pairwise orthogonal. The diagram below shows two orthogonal vectors in $\mathbb{R}^2$ and three orthogonal vectors in $\mathbb{R}^3$.
If every vector in an orthogonal set of vectors is also a unit vector, then we say that
the given set of vectors is orthonormal.
Formally, we can define orthogonal and orthonormal vectors as follows.
Let be a set of nonzero vectors in . Then this set is called an orthogonal set if for
all . Moreover, if for (i.e. each vector in the set is a unit vector), we say the set of
vectors is an orthonormal set.
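These conditions translate directly into a short numerical check. Below is a minimal NumPy sketch; the helper names `is_orthogonal` and `is_orthonormal` are our own, chosen for illustration.

```python
import numpy as np

def is_orthogonal(vectors, tol=1e-10):
    """Check that the nonzero vectors are pairwise orthogonal."""
    return all(
        abs(np.dot(vectors[i], vectors[j])) < tol
        for i in range(len(vectors))
        for j in range(i + 1, len(vectors))
    )

def is_orthonormal(vectors, tol=1e-10):
    """Check that the set is orthogonal and each vector has length 1."""
    return is_orthogonal(vectors, tol) and all(
        abs(np.linalg.norm(v) - 1.0) < tol for v in vectors
    )
```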
An orthogonal set of vectors may not be orthonormal. To convert an orthogonal set to an orthonormal set, we need to divide each vector by its own length. Normalizing an orthogonal set is the process of turning an orthogonal set into an orthonormal set. If $\{\vec{v}_1, \vec{v}_2, \dots, \vec{v}_k\}$ is an orthogonal subset of $\mathbb{R}^n$, then
$$\left\{\frac{1}{\|\vec{v}_1\|}\vec{v}_1, \frac{1}{\|\vec{v}_2\|}\vec{v}_2, \dots, \frac{1}{\|\vec{v}_k\|}\vec{v}_k\right\}$$
is an orthonormal set.
We illustrate this concept in the following example.
Consider the vectors given below. Show that they form an orthogonal set of vectors but not an orthonormal one. Find the corresponding orthonormal set.
One easily verifies that the dot product of each pair of distinct vectors is zero, and so this is an orthogonal set of vectors. On the other hand, one can compute that the vectors do not all have length $1$, and so the set is not orthonormal.
To find the corresponding orthonormal set, we need to normalize each vector, that is, divide it by its own length. Doing this for each vector in turn yields the corresponding orthonormal set. You can verify that this set is orthonormal.
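The normalization step is easy to carry out numerically. Here is a sketch using hypothetical orthogonal vectors (chosen only for illustration, not the vectors of the example above):

```python
import numpy as np

# Hypothetical orthogonal set in R^3, chosen only for illustration.
orthogonal_set = [
    np.array([1.0, 1.0, 0.0]),
    np.array([1.0, -1.0, 0.0]),
    np.array([0.0, 0.0, 2.0]),
]

# Divide each vector by its own length to normalize the set.
orthonormal_set = [v / np.linalg.norm(v) for v in orthogonal_set]

# Each vector now has length 1, and pairwise dot products are still 0.
print([round(float(np.linalg.norm(u)), 10) for u in orthonormal_set])
```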
Orthogonal and Orthonormal Bases
Recall that every basis of $\mathbb{R}^n$ (or of a subspace $W$ of $\mathbb{R}^n$) imposes a coordinate system on $\mathbb{R}^n$ (or on $W$) that can be used to express any vector of $\mathbb{R}^n$ (or of $W$) as a linear combination of the elements of the basis. For example, two basis vectors impose a coordinate system onto the plane, as shown in the figure below. We readily see that the vector contained in the plane can be written as a linear combination of the two basis vectors.
The vector in the figure is visually easy to work with. In general, one way to express an arbitrary vector as a linear combination of the basis vectors is to solve a system of linear equations, which can be costly. One reason we like the standard basis $\{\vec{i}, \vec{j}\}$ as a basis of $\mathbb{R}^2$ is that any vector $\vec{x}$ of $\mathbb{R}^2$ can be easily expressed as the sum of the orthogonal projections of $\vec{x}$ onto the basis vectors $\vec{i}$ and $\vec{j}$, as shown below.
We can see why an “upright” coordinate system with basis $\{\vec{i}, \vec{j}\}$ works well. What if we tilt this coordinate system while preserving the orthogonal relationship between the basis vectors? The following exploration allows you to investigate the consequences.
In the following GeoGebra interactive, vectors $\vec{v}$ and $\vec{w}$ are orthogonal (the slopes of the lines containing them are negative reciprocals of each other). These vectors are clearly linearly independent and span $\mathbb{R}^2$. Therefore $\{\vec{v}, \vec{w}\}$ is a basis of $\mathbb{R}^2$.
Let $\vec{x}$ be an arbitrary vector. Orthogonal projections of $\vec{x}$ onto $\vec{v}$ and $\vec{w}$ are depicted in light grey.
Use the tip of vector $\vec{x}$ to manipulate the vector and convince yourself that $\vec{x}$ is always the diagonal of the parallelogram (a rectangle!) determined by the projections.
Use the tips of $\vec{v}$ and $\vec{w}$ to change the basis vectors. What happens when $\vec{v}$ and $\vec{w}$ are no longer orthogonal?
Pick another pair of orthogonal vectors $\vec{v}$ and $\vec{w}$. Verify that $\vec{x}$ is the sum of its projections.
As you have just discovered in Exploration exp:orth1a, we can express an arbitrary vector of $\mathbb{R}^2$ as the sum of its projections onto the basis vectors, provided that the basis is orthogonal. It turns out that this result holds for any subspace of $\mathbb{R}^n$, making a basis consisting of orthogonal vectors especially useful.
If an orthogonal set is a basis, we call it an orthogonal basis. Similarly, if an
orthonormal set is a basis, we call it an orthonormal basis.
The following theorem generalizes our observation in Exploration exp:orth1a. As you read the statement of the theorem, it will be helpful to recall that the orthogonal projection of vector $\vec{x}$ onto a non-zero vector $\vec{d}$ is given by
$$\text{proj}_{\vec{d}}\vec{x} = \left(\frac{\vec{x} \cdot \vec{d}}{\vec{d} \cdot \vec{d}}\right)\vec{d}. \quad \text{(eq:orthProj)}$$
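In code, this one-vector projection is a single line. The following minimal sketch implements equation (eq:orthProj); the helper name `proj` is ours:

```python
import numpy as np

def proj(x, d):
    """Orthogonal projection of x onto the nonzero vector d."""
    return (np.dot(x, d) / np.dot(d, d)) * d
```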
Let $W$ be a subspace of $\mathbb{R}^n$ and suppose $\{\vec{f}_1, \vec{f}_2, \dots, \vec{f}_m\}$ is an orthogonal basis of $W$. Then for every $\vec{x}$ in $W$,
$$\vec{x} = \left(\frac{\vec{x} \cdot \vec{f}_1}{\vec{f}_1 \cdot \vec{f}_1}\right)\vec{f}_1 + \left(\frac{\vec{x} \cdot \vec{f}_2}{\vec{f}_2 \cdot \vec{f}_2}\right)\vec{f}_2 + \dots + \left(\frac{\vec{x} \cdot \vec{f}_m}{\vec{f}_m \cdot \vec{f}_m}\right)\vec{f}_m.$$
Proof
We may express $\vec{x}$ as a linear combination of the basis elements:
$$\vec{x} = c_1\vec{f}_1 + c_2\vec{f}_2 + \dots + c_m\vec{f}_m.$$
We claim that $c_i = \frac{\vec{x} \cdot \vec{f}_i}{\vec{f}_i \cdot \vec{f}_i}$ for $i = 1, \dots, m$. To see this, we take the dot product of each side with the vector $\vec{f}_i$ and obtain the following:
$$\vec{x} \cdot \vec{f}_i = c_1(\vec{f}_1 \cdot \vec{f}_i) + c_2(\vec{f}_2 \cdot \vec{f}_i) + \dots + c_m(\vec{f}_m \cdot \vec{f}_i).$$
Our basis is orthogonal, so $\vec{f}_j \cdot \vec{f}_i = 0$ for all $j \neq i$, which means that after we distribute the dot product, only one term will remain on the right-hand side. We have
$$\vec{x} \cdot \vec{f}_i = c_i(\vec{f}_i \cdot \vec{f}_i).$$
We now divide both sides by $\vec{f}_i \cdot \vec{f}_i$, and since our claim holds for $i = 1, \dots, m$, the proof is complete.
Theorem th:fourierexpansion shows one important benefit of a basis being orthogonal. With
an orthogonal basis it is easy to represent any vector in terms of the basis
vectors.
Let $\vec{x}$ be the vector given below, and let $S$ be the given set of vectors; write $S = \{\vec{f}_1, \dots, \vec{f}_m\}$.
Notice that $S$ is an orthogonal set of vectors that spans the whole space. Use this fact to write $\vec{x}$ as a linear combination of the vectors of $S$.
We first observe that $S$ is a linearly independent set of vectors (a set of nonzero pairwise-orthogonal vectors is always linearly independent), and so $S$ is a basis. Next we apply Theorem th:fourierexpansion to express $\vec{x}$ as a linear combination of the vectors of $S$. We wish to write
$$\vec{x} = \left(\frac{\vec{x} \cdot \vec{f}_1}{\vec{f}_1 \cdot \vec{f}_1}\right)\vec{f}_1 + \dots + \left(\frac{\vec{x} \cdot \vec{f}_m}{\vec{f}_m \cdot \vec{f}_m}\right)\vec{f}_m.$$
We readily compute each coefficient $\frac{\vec{x} \cdot \vec{f}_i}{\vec{f}_i \cdot \vec{f}_i}$, and therefore obtain $\vec{x}$ as a linear combination of the vectors of $S$. You can verify the resulting combination directly.
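As a numerical sanity check of Theorem th:fourierexpansion, here is a sketch with a hypothetical orthogonal basis of $\mathbb{R}^3$ (these vectors are illustrative, not the ones from the example above):

```python
import numpy as np

# Hypothetical orthogonal basis of R^3 (pairwise dot products are 0).
f1 = np.array([1.0, 1.0, 0.0])
f2 = np.array([1.0, -1.0, 1.0])
f3 = np.array([1.0, -1.0, -2.0])
basis = [f1, f2, f3]

x = np.array([3.0, 1.0, 2.0])

# Coefficients c_i = (x . f_i) / (f_i . f_i) from the theorem.
coeffs = [np.dot(x, f) / np.dot(f, f) for f in basis]

# x is recovered as the sum of its projections onto the basis vectors.
reconstruction = sum(c * f for c, f in zip(coeffs, basis))
print(np.allclose(reconstruction, x))  # True
```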
The formula from Theorem th:fourierexpansion is easy to use, and it becomes even easier when our
basis is orthonormal.
Let $W$ be a subspace of $\mathbb{R}^n$ and suppose $\{\vec{q}_1, \vec{q}_2, \dots, \vec{q}_m\}$ is an orthonormal basis of $W$. Then for any $\vec{x}$ in $W$,
$$\vec{x} = (\vec{x} \cdot \vec{q}_1)\vec{q}_1 + (\vec{x} \cdot \vec{q}_2)\vec{q}_2 + \dots + (\vec{x} \cdot \vec{q}_m)\vec{q}_m.$$
Proof
This is a special case of Theorem th:fourierexpansion. Because $\vec{q}_i \cdot \vec{q}_i = \|\vec{q}_i\|^2 = 1$ for $i = 1, \dots, m$, the terms are given by
$$\frac{\vec{x} \cdot \vec{q}_i}{\vec{q}_i \cdot \vec{q}_i} = \vec{x} \cdot \vec{q}_i.$$
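Numerically, the orthonormal case removes the division. A brief sketch, again with a hypothetical basis chosen for illustration:

```python
import numpy as np

# Normalize a hypothetical orthogonal basis to get an orthonormal one.
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, -1.0, 1.0]),
         np.array([1.0, -1.0, -2.0])]
q = [f / np.linalg.norm(f) for f in basis]

x = np.array([3.0, 1.0, 2.0])

# With an orthonormal basis, each coefficient is just a dot product.
reconstruction = sum(np.dot(x, qi) * qi for qi in q)
print(np.allclose(reconstruction, x))  # True
```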
Orthogonal Projection onto a Subspace
In the previous section we found that given a subspace $W$ of $\mathbb{R}^n$ with an orthogonal basis, every vector $\vec{x}$ in $W$ can be expressed as the sum of the orthogonal projections of $\vec{x}$ onto the elements of the basis. Note that our premise was that $\vec{x}$ is in $W$. In this section, we look into the meaning of the sum of orthogonal projections of $\vec{x}$ onto the elements of an orthogonal basis of $W$ for those vectors $\vec{x}$ of $\mathbb{R}^n$ that are not in $W$.
In the GeoGebra interactive below, $W$ is a plane in $\mathbb{R}^3$ spanned by vectors $\vec{f}_1$ and $\vec{f}_2$; $W$ is a subspace of $\mathbb{R}^3$. In the initial set-up, $\vec{f}_1$ and $\vec{f}_2$ are orthogonal. Vector $\vec{x}$ is not in $W$.
Use the check-boxes to construct the sum of the orthogonal projections of $\vec{x}$ onto $\vec{f}_1$ and $\vec{f}_2$. RIGHT-CLICK and DRAG to rotate the image.
If moved, return the basis vectors $\vec{f}_1$ and $\vec{f}_2$ to their default position (reset the sliders) to ensure that they are orthogonal.
Rotate the image to convince yourself that the perpendiculars dropped from the tip of $\vec{x}$ to $\vec{f}_1$ and $\vec{f}_2$ are indeed perpendicular to $\vec{f}_1$ and $\vec{f}_2$ in the diagram. (You’ll have to look at it just right to convince yourself of this.) Are both of these perpendiculars also necessarily perpendicular to the plane? Yes / No
Use the sliders to manipulate $\vec{x}$. Rotate the figure for a better view. What is true about the vector $\vec{x} - (\text{proj}_{\vec{f}_1}\vec{x} + \text{proj}_{\vec{f}_2}\vec{x})$?
Vector $\vec{x} - (\text{proj}_{\vec{f}_1}\vec{x} + \text{proj}_{\vec{f}_2}\vec{x})$ is orthogonal to $W$. / All of the above.
Rotate the figure so that you’re looking directly down at the plane. If you’re looking at it correctly, you will notice that (1) the parallelogram determined by the projections of $\vec{x}$ onto $\vec{f}_1$ and $\vec{f}_2$ is a rectangle; (2) the sum of projections, $\text{proj}_{\vec{f}_1}\vec{x} + \text{proj}_{\vec{f}_2}\vec{x}$, is located directly underneath $\vec{x}$, like a shadow at midday.
Use the sliders to manipulate the basis vectors $\vec{f}_1$ and $\vec{f}_2$ so that they are no longer orthogonal.
Rotate the figure for a better view. Which of the following is true?
Vector $\vec{x} - (\text{proj}_{\vec{f}_1}\vec{x} + \text{proj}_{\vec{f}_2}\vec{x})$ is orthogonal to $W$. / All of the above.
Rotate your figure so that you’re looking directly down at the plane. Which of the following is true?
The parallelogram determined by $\text{proj}_{\vec{f}_1}\vec{x}$ and $\text{proj}_{\vec{f}_2}\vec{x}$ is a rectangle. / $\text{proj}_{\vec{f}_1}\vec{x} + \text{proj}_{\vec{f}_2}\vec{x}$ is located directly underneath $\vec{x}$. / None of the above.
In Exploration exp:orthProjSub, you discovered that given a plane $W$ spanned by orthogonal vectors $\vec{f}_1$ and $\vec{f}_2$ in $\mathbb{R}^3$, and a vector $\vec{x}$ not in the plane, we can interpret the sum of the orthogonal projections of $\vec{x}$ onto $\vec{f}_1$ and $\vec{f}_2$ as a “shadow” of $\vec{x}$ that lies in the plane directly underneath the vector $\vec{x}$. We say that this “shadow” is an orthogonal projection of $\vec{x}$ onto $W$. You have also found that if $\vec{f}_1$ and $\vec{f}_2$ are not orthogonal, the parallelogram representing the sum of the orthogonal projections of $\vec{x}$ onto $\vec{f}_1$ and $\vec{f}_2$ will not be a rectangle. In this case, $\vec{x}$ minus this sum will NOT be orthogonal to the plane. It is essential that $\vec{f}_1$ and $\vec{f}_2$ are orthogonal for the sum of the projections to be considered an orthogonal projection.
In general, we can define an orthogonal projection of a vector $\vec{x}$ in $\mathbb{R}^n$ onto a subspace $W$ of $\mathbb{R}^n$ as the sum of the orthogonal projections of $\vec{x}$ onto the elements of an orthogonal basis of $W$. Definition def:projOntoSubspace and the subsequent diagram summarize this discussion.
Projection onto a Subspace of $\mathbb{R}^n$. Let $W$ be a subspace of $\mathbb{R}^n$ with orthogonal basis $\{\vec{f}_1, \vec{f}_2, \dots, \vec{f}_m\}$. If $\vec{x}$ is in $\mathbb{R}^n$, the vector
$$\text{proj}_W\vec{x} = \text{proj}_{\vec{f}_1}\vec{x} + \text{proj}_{\vec{f}_2}\vec{x} + \dots + \text{proj}_{\vec{f}_m}\vec{x}$$
is called the orthogonal projection of $\vec{x}$ onto $W$.
An illustration of Definition def:projOntoSubspace for a two-dimensional subspace $W$ with orthogonal basis $\{\vec{f}_1, \vec{f}_2\}$ is shown below.
Using equation (eq:orthProj) multiple times, we can also express $\text{proj}_W\vec{x}$ in Definition def:projOntoSubspace using the following formula (form:orthProjOntoW):
$$\text{proj}_W\vec{x} = \left(\frac{\vec{x} \cdot \vec{f}_1}{\vec{f}_1 \cdot \vec{f}_1}\right)\vec{f}_1 + \left(\frac{\vec{x} \cdot \vec{f}_2}{\vec{f}_2 \cdot \vec{f}_2}\right)\vec{f}_2 + \dots + \left(\frac{\vec{x} \cdot \vec{f}_m}{\vec{f}_m \cdot \vec{f}_m}\right)\vec{f}_m.$$
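This formula translates into a short routine. A minimal sketch, assuming (as the definition requires) that `basis` is orthogonal; the function name is ours:

```python
import numpy as np

def proj_onto_subspace(x, basis):
    """Sum of the projections of x onto each vector of an ORTHOGONAL basis."""
    return sum((np.dot(x, f) / np.dot(f, f)) * f for f in basis)
```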
Orthogonal Decomposition of $\vec{x}$
Definition def:projOntoSubspace allows us to express $\vec{x}$ as the sum of its orthogonal projection, $\text{proj}_W\vec{x}$, located in $W$, and a vector we will call $\vec{w}^{\perp}$ (pronounced “W-perp”), given by $\vec{w}^{\perp} = \vec{x} - \text{proj}_W\vec{x}$. This decomposition of $\vec{x}$ is shown in the diagram below.
You have already met $\vec{w}^{\perp}$, under the name of $\vec{x} - (\text{proj}_{\vec{f}_1}\vec{x} + \text{proj}_{\vec{f}_2}\vec{x})$, in Exploration exp:orthProjSub, and observed that this vector is orthogonal to $W$. We will now prove that $\vec{w}^{\perp}$ is orthogonal to every vector in $W$. This will be accomplished in two steps. First, in Theorem th:orthDecompX we will prove that $\vec{w}^{\perp}$ is orthogonal to all of the basis elements of $W$. Next, you will use this result to demonstrate that $\vec{w}^{\perp}$ is orthogonal to every vector in $W$.
Let $W$ be a subspace of $\mathbb{R}^n$ with orthogonal basis $\{\vec{f}_1, \vec{f}_2, \dots, \vec{f}_m\}$. Let $\vec{x}$ be in $\mathbb{R}^n$, and define $\vec{w}^{\perp}$ as
$$\vec{w}^{\perp} = \vec{x} - \text{proj}_W\vec{x}.$$
Then $\vec{w}^{\perp}$ is orthogonal to $\vec{f}_i$ for $i = 1, 2, \dots, m$.
Proof
We will use Formula form:orthProjOntoW to show that $\vec{w}^{\perp} \cdot \vec{f}_i = 0$. Recall that $\{\vec{f}_1, \dots, \vec{f}_m\}$ is an orthogonal basis. Therefore $\vec{f}_j \cdot \vec{f}_i = 0$ for $j \neq i$. This observation enables us to compute as follows:
$$\vec{w}^{\perp} \cdot \vec{f}_i = \left(\vec{x} - \text{proj}_W\vec{x}\right) \cdot \vec{f}_i = \vec{x} \cdot \vec{f}_i - \left(\frac{\vec{x} \cdot \vec{f}_i}{\vec{f}_i \cdot \vec{f}_i}\right)(\vec{f}_i \cdot \vec{f}_i) = \vec{x} \cdot \vec{f}_i - \vec{x} \cdot \vec{f}_i = 0.$$
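A quick numerical illustration of the theorem, with a hypothetical plane $W$ in $\mathbb{R}^3$ (vectors chosen for illustration):

```python
import numpy as np

# Hypothetical orthogonal basis of a plane W in R^3.
f1 = np.array([1.0, 0.0, 1.0])
f2 = np.array([1.0, 0.0, -1.0])
x = np.array([2.0, 3.0, 5.0])   # x is not in W

proj_W = sum((np.dot(x, f) / np.dot(f, f)) * f for f in (f1, f2))
w_perp = x - proj_W

# w_perp is orthogonal to each basis vector, as the theorem asserts.
print(np.isclose(np.dot(w_perp, f1), 0.0))  # True
print(np.isclose(np.dot(w_perp, f2), 0.0))  # True
```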
We leave the proof of the following corollary as Practice Problem prob:proofCor.
Let $W$ be a subspace of $\mathbb{R}^n$ with orthogonal basis $\{\vec{f}_1, \vec{f}_2, \dots, \vec{f}_m\}$. Let $\vec{x}$ be in $\mathbb{R}^n$, and define $\vec{w}^{\perp}$ as
$$\vec{w}^{\perp} = \vec{x} - \text{proj}_W\vec{x}.$$
Then $\vec{w}^{\perp}$ is orthogonal to every vector in $W$.
The fact that the decomposition of $\vec{x}$ into the sum of $\text{proj}_W\vec{x}$ and $\vec{w}^{\perp}$ is unique is the subject of the Orthogonal Decomposition Theorem, which we will prove in Orthogonal Complements and Decompositions.
Throughout this section we have worked with orthogonal bases of subspaces. Does every subspace of $\mathbb{R}^n$ have an orthogonal basis? If so, how do we find one? These questions will be addressed in Gram-Schmidt Orthogonalization.
Practice Problems
Retry Example fourier using Gaussian elimination. Which method seems easier to
you?
Let $W$ be a subspace of $\mathbb{R}^n$, and suppose $\{\vec{f}_1, \vec{f}_2, \dots, \vec{f}_m\}$ is an orthogonal basis of $W$. Furthermore, suppose that there exists a vector $\vec{n}$ for which $\vec{n} \cdot \vec{f}_i = 0$ for all $i$, $1 \le i \le m$. Show that $\vec{n}$ is orthogonal to every vector in $W$.