Linear Independence
If a friend told you that they have a line spanned by and and , you would probably
think that your friend’s description is a little excessive. Isn’t one of the above vectors
sufficient to describe the line? A line can be described as a span of one vector, but it
can also be described as a span of two or more vectors. There are many
advantages, however, to using the most efficient description possible. In this
section we will begin to explore what makes a description “more efficient.”
Redundant Vectors
Consider the following collection of vectors:
What is the span of these vectors? A line, A parallelogram, A parallelepiped
In this Exploration we will examine what can happen to the span of a collection of
vectors when a vector is removed from the collection.
First, let’s remove from .
Which of the following is true?
is a line is a parallelogram.
Removing from (changed / did not change) the span.
Now let’s remove from the original collection of vectors.
Which of the following is true?
is a line is the right side of the coordinate plane. is a parallelogram.
Removing from (changed / did not change) the span.
As you just discovered, removing a vector from a collection of vectors may or may
not affect the span of the collection. We will refer to vectors that can be removed
from a collection without changing the span as redundant. In Exploration exp:redundantVecs1, is
redundant, while is not.
Let be a set of vectors in . If we can remove one vector without changing the span
of this set, then that vector is redundant. In other words, if
we say that is a redundant element of , or simply redundant.
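This definition can be sketched numerically. The check below uses matrix rank as a proxy for span comparison: removing a vector can only shrink the span, so the spans are equal exactly when the ranks agree. The vectors here are hypothetical, since the ones used in the Exploration do not appear in this extract.

```python
import numpy as np

def is_redundant(vectors, i):
    """Return True if removing vectors[i] leaves the span unchanged.

    Removing a vector can only shrink the span, so the spans are
    equal exactly when the matrix ranks agree.
    """
    full = np.column_stack(vectors)
    reduced = np.column_stack([v for j, v in enumerate(vectors) if j != i])
    return np.linalg.matrix_rank(reduced) == np.linalg.matrix_rank(full)

# Hypothetical collection: v2 = 2*v1, so v2 is redundant; v3 is not.
v1 = np.array([1.0, 2.0])
v2 = np.array([2.0, 4.0])
v3 = np.array([0.0, 1.0])
print(is_redundant([v1, v2, v3], 1))  # True: the span stays the same
print(is_redundant([v1, v2, v3], 2))  # False: the span drops to a line
```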
Our next goal is to see what causes of Exploration exp:redundantVecs1 to be redundant. The answer lies
not in the vector itself, but in its relationship to the other vectors in the collection.
Observe that . In other words, is a scalar multiple of another vector in the set. To
see why this matters, let’s pick an arbitrary vector in . Vector is in the span
because it can be written as a linear combination of the three vectors as
follows
But is not essential to this linear combination because it can be replaced with , as
shown below.
Regardless of what vector we write as a linear combination of, and , we will always
be able to replace with , placing into the span of and , and making redundant.
(Note that we can just as easily write , and argue that is redundant.) We conclude
that only one of and is needed to maintain the span of the original three vectors.
We have
The left-most collection in this expression contains redundant vectors; the other two
collections do not.
In Exploration exp:redundantVecs1 we found one vector to be redundant because we could replace it
with a scalar multiple of another vector in the set. The following Exploration delves
into what happens when a vector in a given set is a linear combination of the other
vectors.
Consider the set of vectors
The three vectors are shown below. RIGHT-CLICK and DRAG to rotate the
interactive graph.
is: A line, A plane, A parallelepiped
Can we remove one of the vectors from the set without changing the span?
Observe that we can write as a linear combination of the other two vectors \begin{equation} \label{eq:redundant}\begin{bmatrix}4\\4\\-1\end{bmatrix}=2\begin{bmatrix}1\\2\\-1\end{bmatrix}+1\begin{bmatrix}2\\0\\1\end{bmatrix} \end{equation}
This means that we can write any vector in as a linear combination of only and by
replacing with the expression in (eq:redundant). For example,
We have
We conclude that vector is redundant. Can each of the other two vectors in the set
be considered redundant? You will address this question in Practice Problem
prob:redundant1.
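The claims of this Exploration can be verified directly with the vectors from equation (eq:redundant); a quick NumPy check confirms both the linear combination and that removing the third vector leaves the rank (hence the span, a plane) unchanged:

```python
import numpy as np

# The three vectors from the Exploration (see equation eq:redundant).
v1 = np.array([1, 2, -1])
v2 = np.array([2, 0, 1])
v3 = np.array([4, 4, -1])

# v3 = 2*v1 + 1*v2, so v3 is a linear combination of the other two.
assert np.array_equal(v3, 2*v1 + 1*v2)

# Removing v3 does not change the rank, so the span is unchanged.
rank_all = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))
rank_two = np.linalg.matrix_rank(np.column_stack([v1, v2]))
print(rank_all, rank_two)  # 2 2
```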
Collections of vectors that do not contain redundant vectors are very important in
linear algebra. We will refer to such collections as linearly independent. Collections of
vectors that contain redundant vectors will be called linearly dependent. The
following section offers a definition that will allow us to easily determine linear
dependence and independence of vectors.
Linear Independence
Linear Independence Let be vectors of . We say that the set is linearly independent
if the only solution to \begin{equation} \label{eq:defLinInd}c_1\vec{v}_1+c_2\vec{v}_2+\ldots +c_k\vec{v}_k=\vec{0} \end{equation}
is the trivial solution .
If, in addition to the trivial solution, a non-trivial solution (not all are zero) exists,
then we say that the set is linearly dependent.
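The definition translates directly into a computation: the set is linearly independent exactly when the matrix whose columns are the vectors has a trivial null space. A minimal sketch using SymPy (exact arithmetic, so no rounding issues), with the helper name and test vectors being illustrative choices, not from the text:

```python
import sympy as sp

def is_linearly_independent(vectors):
    """True iff c1*v1 + ... + ck*vk = 0 forces all ci = 0,
    i.e. iff the matrix with the vectors as columns has a
    trivial null space."""
    M = sp.Matrix.hstack(*[sp.Matrix(v) for v in vectors])
    return len(M.nullspace()) == 0

print(is_linearly_independent([[1, 0], [0, 1]]))  # True
print(is_linearly_independent([[1, 2], [2, 4]]))  # False: [2,4] = 2*[1,2]
```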
Given a set of vectors we can now ask the following questions:
(a)
Is linearly dependent?
(b)
Can we write one element of as a linear combination of the others?
(c)
Does contain redundant vectors?
It turns out that these questions are equivalent. In other words, if the answer to one
of them is “YES”, the answer to the other two is also “YES”. Conversely, if the
answer to one of them is “NO”, then the answer to the other two is also “NO”. We
will start by illustrating this idea with an example, then conclude this section by
formally proving the equivalency.
What can we say about the following sets of vectors in light of Remark remark:LinIndEquiv?
(a)
(b)
(a) We will start by addressing linear independence. To do so, we will solve the vector
equation \begin{align} \label{eq:linrelationpart1}c_1\begin{bmatrix}2\\-3\end{bmatrix}+c_2 \begin{bmatrix}0\\3\end{bmatrix}+c_3\begin{bmatrix}1\\-1\end{bmatrix}+c_4\begin{bmatrix}1\\-2\end{bmatrix}=\vec{0} \end{align}
Clearly is a solution to the equation. The question is whether another solution
exists.
The vector equation translates into the following system:
Writing the system in augmented matrix form and applying elementary row
operations gives us the following reduced row-echelon form:
This shows that (eq:linrelationpart1) has infinitely many solutions:
Letting , we obtain the following:
\begin{equation} \label{eq:ex1} -6\begin{bmatrix}2\\-3\end{bmatrix}+0 \begin{bmatrix}0\\3\end{bmatrix}+6\begin{bmatrix}1\\-1\end{bmatrix}+6\begin{bmatrix}1\\-2\end{bmatrix}=\vec{0} \end{equation}
We conclude that the vectors are linearly dependent.
Observe that (eq:ex1) allows us to solve for one of the vectors and express it as a linear
combination of the others. For example, \begin{equation} \label{eq:ex1lincomb} \begin{bmatrix}2\\-3\end{bmatrix}=0 \begin{bmatrix}0\\3\end{bmatrix}+\begin{bmatrix}1\\-1\end{bmatrix}+\begin{bmatrix}1\\-2\end{bmatrix} \end{equation}
This would not be possible if a nontrivial solution to the equation
did not exist.
Using the linear combination in (eq:ex1lincomb) and the argument of Exploration exp:redundantVecs2, we conclude
that is redundant in
We find that the answer to all questions in Remark remark:LinIndEquiv is “YES”.
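The computations in this part can be double-checked numerically. Using the four vectors from equation (eq:linrelationpart1), the nontrivial solution in (eq:ex1) and the resulting linear combination (eq:ex1lincomb) both verify directly:

```python
import numpy as np

v1 = np.array([2, -3])
v2 = np.array([0, 3])
v3 = np.array([1, -1])
v4 = np.array([1, -2])

# The nontrivial solution c = (-6, 0, 6, 6) from equation (eq:ex1):
assert np.array_equal(-6*v1 + 0*v2 + 6*v3 + 6*v4, np.zeros(2))

# Solving (eq:ex1) for the first vector gives (eq:ex1lincomb):
assert np.array_equal(v1, 0*v2 + 1*v3 + 1*v4)
```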
(b) To address linear independence, we need to solve the equation
Converting the equation to augmented matrix form and performing row reduction
gives us
This shows that is the only solution. Therefore the two vectors are linearly
independent.
Furthermore, we cannot write one of the vectors as a linear combination of the other.
(Do you see that the only way this would be possible with a set of two vectors is if
they were scalar multiples of each other?)
Finally, we observe that removing either vector would change the span from a
plane in to a line in , so the answer to all three questions in Remark remark:LinIndEquiv is
“NO”.
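For a pair of vectors in the plane, the scalar-multiple test boils down to a single determinant. The vectors from this part are not shown in this extract, so the sketch below uses hypothetical ones; the tolerance guard is a standard floating-point precaution:

```python
import numpy as np

def pair_independent_2d(u, v):
    """Two vectors in R^2 are linearly independent exactly when
    neither is a scalar multiple of the other, i.e. when the
    2x2 determinant with u and v as columns is nonzero."""
    return abs(np.linalg.det(np.column_stack([u, v]))) > 1e-12

print(pair_independent_2d([1.0, 2.0], [3.0, 1.0]))  # True: not parallel
print(pair_independent_2d([1.0, 2.0], [2.0, 4.0]))  # False: v = 2*u
```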
Let be a set of vectors in containing two or more vectors. The following conditions
are equivalent.
(a)
are linearly dependent.
(b)
One of can be expressed as a linear combination of the others.
(c)
One of the vectors is redundant: it can be removed without changing the span.
(Do you see why it was important to have one of the constants nonzero?)
This shows that may be expressed as a linear combination of the other
vectors.
(b) ⇒ (c): First, suppose is a linear combination of . We will show that is redundant by
showing that
To show equality of the two spans we will pick a vector in the left span and show
that it is also an element of the span on the right. Then, we will pick a vector in the
right span and show that it is also an element of the span on the left, and we will
conclude that the sets are equal.
Observe that if is in , then it has to be in . (Why?)
Now suppose is in . We need to show that is also in .
By assumption, we can write as \begin{equation} \label{eq:vj} \vec{v}_j=a_1\vec{v}_1+a_2\vec{v}_2+\dots +a_{j-1}\vec{v}_{j-1}+a_{j+1}\vec{v}_{j+1}+\dots +a_k\vec{v}_k. \end{equation}
Since is in , we have
Substituting the expression in (eq:vj) for and simplifying, we obtain the following
This shows that is in . We now have
which shows that is redundant.
(c) ⇒ (a): Suppose that is redundant, so that
Consider a vector in \begin{equation} \label{eq:w1} \vec{w}=a_1\vec{v}_1+a_2\vec{v}_2+\dots +a_j\vec{v}_j+\dots +a_k\vec{v}_k \end{equation}
Since the span contains ALL possible linear combinations of , we may choose such
that .
By assumption, is also in . Therefore, we can express as a linear combination \begin{equation} \label{eq:w2} \vec{w}=b_1\vec{v}_1+b_2\vec{v}_2+\dots +b_{j-1}\vec{v}_{j-1}+b_{j+1}\vec{v}_{j+1}+\dots +b_k\vec{v}_k. \end{equation}
We complete the proof by showing there exists a non-trivial solution to \begin{equation} \label{eq:LinIndepDefRepeated} c_1\vec{v}_1+c_2\vec{v}_2+\ldots +c_j\vec{v}_j+\ldots +c_k\vec{v}_k=\vec{0}. \end{equation}
Subtracting expression (eq:w2) from (eq:w1) we obtain \begin{equation*} \vec{0}=(a_1-b_1)\vec{v}_1+(a_2-b_2)\vec{v}_2+\dots +a_j\vec{v}_j+\dots +(a_k-b_k)\vec{v}_k. \end{equation*}
Recall that we ensured that . This implies that we have a non-trivial solution to
Equation eq:LinIndepDefRepeated.
These three parts of the proof show that if one of the conditions is true, all three
must be true. It is a logical consequence that if one of the three conditions is false, all
three must be false.
Geometry of Linearly Dependent and Linearly Independent Vectors
Theorem th:lindeplincombofother gives us a convenient way of looking at linear dependence/independence
geometrically. When looking at two or more vectors, we ask, “can one of the vectors
be written as a linear combination of the others?” We can also ask, “is one of the
vectors redundant?” If the answer to either of these questions is “YES”, then the
vectors are linearly dependent.
A Set of Two Vectors
Two vectors are linearly dependent if and only if one is a scalar multiple
of the other. Two nonzero linearly dependent vectors may look like this:
or like this:
Two linearly independent vectors will look like this:
A Set of Three Vectors
Given a set of three nonzero vectors, we have the following possibilities:
(Linearly Dependent Vectors) The three vectors are scalar multiples of each
other.
(Linearly Dependent Vectors) Two of the vectors are scalar multiples of each
other.
(Linearly Dependent Vectors) One vector can be viewed as the diagonal of a
parallelogram determined by scalar multiples of the other two vectors. All three
vectors lie in the same plane.
(Linearly Independent Vectors) A set of three vectors is linearly independent if
the vectors do not lie in the same plane. For example, vectors , and are
linearly independent.
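The coplanarity test for three vectors in space is again a determinant check. The example vectors from this paragraph were lost in this extract, so the ones below are hypothetical stand-ins:

```python
import numpy as np

def triple_independent_3d(u, v, w):
    """Three vectors in R^3 are linearly independent exactly when
    they do not lie in a common plane, i.e. when the 3x3
    determinant with u, v, w as columns is nonzero."""
    return abs(np.linalg.det(np.column_stack([u, v, w]))) > 1e-12

# Not coplanar: independent.
print(triple_independent_3d([1, 0, 0], [0, 1, 0], [0, 0, 1.0]))  # True
# Coplanar (all three lie in the xy-plane): dependent.
print(triple_independent_3d([1, 0, 0], [0, 1, 0], [1, 1, 0.0]))  # False
```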
Practice Problems
In Exploration exp:redundantVecs2 we considered the following set of vectors
and demonstrated that is redundant by using the fact that it is a linear combination
of the other two vectors.
(a)
Express each of and as a linear combination of the remaining vectors.
(b)
Which of the following is NOT true?
If is in , then is in .
Both and are redundant in .
We can remove and from at the same time without affecting the span.
Any set containing the zero vector is linearly dependent.
TRUE / FALSE
Can the zero vector be removed from the set without changing the span?
A set containing five vectors in is linearly dependent.
TRUE / FALSE
If we rewrite Equation eq:defLinInd for five vectors in as a system of equations, how many
equations and unknowns will it have? What does this imply about the number of
solutions?
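The counting argument behind this problem can be illustrated concretely. The ambient space was lost in this extract, so the sketch below assumes, for illustration, five vectors in R^4: equation eq:defLinInd then becomes four equations in five unknowns, guaranteeing a free variable and hence a nontrivial solution.

```python
import sympy as sp

# Hypothetical example: five vectors in R^4 (four equations, five unknowns).
vectors = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1], [1, 1, 1, 1]]
M = sp.Matrix.hstack(*[sp.Matrix(v) for v in vectors])

# The rank is at most 4, but there are 5 columns, so the null space
# is nontrivial and the set is linearly dependent.
null = M.nullspace()
print(M.rank(), len(null))  # 4 1
```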