The minimum number of vectors needed to span a vector space has special significance.

For example, recall that $e_j$ is the vector in $\mathbb{R}^n$ whose $j$th component is $1$ and all of whose other components are $0$. Let $v = (v_1, \ldots, v_n)$ be in $\mathbb{R}^n$. Then
$$v = v_1 e_1 + \cdots + v_n e_n.$$

Since every vector in $\mathbb{R}^n$ is a linear combination of the vectors $e_1, \ldots, e_n$, it follows that $\mathbb{R}^n = \operatorname{span}\{e_1, \ldots, e_n\}$. Thus, $\mathbb{R}^n$ is finite dimensional. Moreover, the dimension of $\mathbb{R}^n$ is at most $n$, since $\mathbb{R}^n$ is spanned by $n$ vectors. It seems unlikely that $\mathbb{R}^n$ could be spanned by fewer than $n$ vectors, but this point needs to be proved.
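The decomposition of a vector into standard basis vectors can be checked numerically. The following NumPy sketch (the vector $v$ is a made-up example) writes a vector in $\mathbb{R}^4$ as a combination of $e_1, \ldots, e_4$:

```python
import numpy as np

# The standard basis vectors e_1, ..., e_4 are the rows of the
# 4x4 identity matrix.
E = np.eye(4)

# A made-up vector v in R^4.
v = np.array([3.0, -1.0, 2.0, 5.0])

# v = v_1*e_1 + v_2*e_2 + v_3*e_3 + v_4*e_4, so the e_j span R^4.
combination = sum(v[j] * E[j] for j in range(4))
print(np.allclose(combination, v))  # True
```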
An Example of a Vector Space that is Not Finite Dimensional

Next we discuss an example of a vector space that does not have finite dimension. Consider the subspace $\mathcal{P}$ consisting of polynomials of all degrees. We show that $\mathcal{P}$ is not the span of a finite number of vectors and hence that $\mathcal{P}$ does not have finite dimension. Let $p_1, \ldots, p_k$ be a set of polynomials and let $m$ be the maximum degree of these polynomials. Then every polynomial in the span of $p_1, \ldots, p_k$ has degree less than or equal to $m$. In particular, $p(t) = t^{m+1}$ is a polynomial that is not in the span of $p_1, \ldots, p_k$, and hence $\mathcal{P}$ is not spanned by finitely many vectors.
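The degree argument can be illustrated with coefficient vectors. In this NumPy sketch the two polynomials are made-up examples of degree at most two; the nonzero least-squares residual confirms that $t^3$ lies outside their span:

```python
import numpy as np

# Represent a polynomial of degree <= 3 by its coefficient vector
# (c0, c1, c2, c3). Two made-up polynomials of degree at most 2:
p1 = np.array([1.0, 1.0, 0.0, 0.0])     # 1 + t
p2 = np.array([0.0, 0.0, 1.0, 0.0])     # t^2
target = np.array([0.0, 0.0, 0.0, 1.0])  # t^3

# Any combination a*p1 + b*p2 has zero t^3 coefficient, so t^3
# cannot lie in the span; least squares leaves a nonzero residual.
A = np.column_stack([p1, p2])
coeffs, residual, *_ = np.linalg.lstsq(A, target, rcond=None)
print(residual)  # nonzero: t^3 is not in span{1 + t, t^2}
```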

Bases and the Main Theorem

It follows that if $\{w_1, \ldots, w_k\}$ is a basis for $W$, then $k = \dim W$. The main theorem about bases is:

Remark: The importance of Theorem ?? is that we can show that a set of vectors is a basis by verifying spanning and linear independence. We never have to check directly that the spanning set has the minimum number of vectors for a spanning set.

For example, we have shown previously that the set of vectors $\{e_1, \ldots, e_n\}$ in $\mathbb{R}^n$ is linearly independent and spans $\mathbb{R}^n$. It follows from Theorem ?? that this set is a basis and that the dimension of $\mathbb{R}^n$ is $n$. In particular, $\mathbb{R}^n$ cannot be spanned by fewer than $n$ vectors.
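In computations, both conditions can often be checked at once: $n$ vectors in $\mathbb{R}^n$ are linearly independent, and hence span, exactly when the matrix having them as rows has rank $n$. A short NumPy sketch with a made-up candidate basis of $\mathbb{R}^3$:

```python
import numpy as np

# A made-up candidate basis of R^3, stored as the rows of B.
B = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])

# Three vectors in R^3 form a basis exactly when the rank is 3.
print(np.linalg.matrix_rank(B) == 3)  # True
```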

The proof of Theorem ?? is given in Section ??.

Consequences of Theorem ??

We discuss two applications of Theorem ??. First, we use this theorem to derive a way of determining the dimension of the subspace spanned by a finite number of vectors. Second, we show that the dimension of the subspace of solutions to a homogeneous system of linear equations $Ax = 0$ is $n - \operatorname{rank}(A)$, where $A$ is an $m \times n$ matrix.

Computing the Dimension of a Span

We show that the dimension of a span of vectors can be found using elementary row operations on the matrix $M$ whose rows are those vectors.

Proof
To verify (??), observe that the span of $w_1, \ldots, w_k$ is unchanged by
  • swapping $w_i$ and $w_j$,
  • multiplying $w_i$ by a nonzero scalar, and
  • adding a multiple of $w_i$ to $w_j$.

That is, if we perform elementary row operations on $M$, the vector space spanned by the rows of $M$ does not change. So we may perform elementary row operations on $M$ until we arrive at a matrix $E$ in reduced echelon form. Suppose that $\ell = \operatorname{rank}(M)$; that is, suppose that $\ell$ is the number of nonzero rows in $E$. Then $W = \operatorname{span}\{v_1, \ldots, v_\ell\}$, where $v_1, \ldots, v_\ell$ are the nonzero rows of the reduced echelon form matrix $E$.
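This observation gives a computation: the dimension of the span equals the number of nonzero rows in the reduced echelon form, which is the rank of the matrix. A NumPy sketch with made-up vectors, where matrix_rank stands in for counting nonzero rows of the reduced echelon form:

```python
import numpy as np

# Made-up vectors in R^4 with w3 = w1 + w2, so the span has dimension 2.
w1 = np.array([1.0, 0.0, 2.0, 1.0])
w2 = np.array([0.0, 1.0, 1.0, 3.0])
w3 = w1 + w2

# The rank of M equals the number of nonzero rows in its reduced
# echelon form, which is the dimension of span{w1, w2, w3}.
M = np.vstack([w1, w2, w3])
print(np.linalg.matrix_rank(M))  # 2
```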

We claim that the vectors $v_1, \ldots, v_\ell$ are linearly independent. It then follows from Theorem ?? that $\{v_1, \ldots, v_\ell\}$ is a basis for $W$ and that the dimension of $W$ is $\ell$. To verify the claim, suppose
$$a_1 v_1 + \cdots + a_\ell v_\ell = 0.$$

We show that $a_j$ must equal $0$ as follows. In the $j$th row of $E$, the pivot must occur in some column, say in the $k$th column. It follows that the $k$th entry of the vector on the left hand side of (??) is $a_j$, since all entries in the $k$th column of $E$ other than the pivot must be zero, as $E$ is in reduced echelon form. Hence $a_j = 0$.

For instance, let $W = \operatorname{span}\{w_1, w_2, w_3\}$ in $\mathbb{R}^4$, where the vectors $w_1$, $w_2$, $w_3$ are those stored in the data file e5_5_4.

To compute $\dim W$ in MATLAB, type e5_5_4 to load the vectors and then type

M = [w1; w2; w3]

Row reduction of the matrix M in MATLAB, using the command rref(M), leads to the reduced echelon form matrix
ans =
 
     1.0000         0    1.4706    1.1176  
         0     1.0000    1.7059    2.1765  
         0          0         0         0

indicating that the dimension of the subspace $W$ is two, and therefore that $\{w_1, w_2, w_3\}$ is not a basis of $W$. Alternatively, we can use the MATLAB command rank(M) to compute the rank of $M$, which equals the dimension of the span $W$.

However, if we change one of the entries in $w_3$, for instance by setting w3(3)=-18, then the command rank([w1;w2;w3]) gives the answer three, indicating that for this choice of vectors $\{w_1, w_2, w_3\}$ is a basis for the span $W$.
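Readers without MATLAB can reproduce the idea in Python with NumPy. The vectors below are made-up examples, not those from e5_5_4: $w_3$ starts as a combination of $w_1$ and $w_2$, so the rank is two; changing one entry breaks the dependence and the rank becomes three.

```python
import numpy as np

# Made-up vectors in R^4; w3 is a combination of w1 and w2.
w1 = np.array([1.0, 2.0, 0.0, 1.0])
w2 = np.array([0.0, 1.0, 1.0, 2.0])
w3 = 2 * w1 - w2                     # dependent: rank is 2
print(np.linalg.matrix_rank(np.vstack([w1, w2, w3])))  # 2

w3[2] = -18.0                        # perturb one entry, as in w3(3)=-18
print(np.linalg.matrix_rank(np.vstack([w1, w2, w3])))  # 3
```

Note that MATLAB's 1-based w3(3) corresponds to Python's 0-based w3[2].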

Solutions to Homogeneous Systems Revisited

We return to our discussions in Chapter ?? on solving linear equations. Recall that we can write all solutions to the system of homogeneous equations $Ax = 0$ in terms of a few parameters, and that the null space of $A$ is the subspace of solutions (see Definition ??). More precisely, Proposition ?? states that the number of parameters needed is $n - \operatorname{rank}(A)$, where $n$ is the number of variables in the homogeneous system. We claim that the dimension of the null space is exactly $n - \operatorname{rank}(A)$.

For example, consider a reduced echelon form matrix $A$ with seven columns and rank three.

that has rank three. Suppose that the unknowns for this system of equations are . We can solve the equations associated with by solving the first equation for , the second equation for , and the third equation for , as follows:
Thus, all solutions to this system of equations have the form which equals

This calculation shows that the null space of $A$, which is the set $\{x \in \mathbb{R}^7 : Ax = 0\}$, is spanned by these four vectors. Moreover, this same calculation shows that the four vectors are linearly independent: from the left hand side of (??) we see that if this linear combination sums to zero, then all four coefficients must vanish. It follows from Theorem ?? that the four vectors form a basis of the null space, whose dimension is therefore four, in agreement with $n - \operatorname{rank}(A) = 7 - 3 = 4$.
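The count nullity $= n - \operatorname{rank}(A)$ can be verified numerically. In this NumPy sketch the $3 \times 7$ matrix is a made-up example of rank three, and the null space basis is read off from the singular value decomposition:

```python
import numpy as np

# A made-up 3x7 matrix of rank 3; the solution space of A x = 0
# should then have dimension 7 - 3 = 4.
A = np.array([[1.0, 2.0, 0.0, 1.0, 0.0, 3.0, 1.0],
              [0.0, 0.0, 1.0, 2.0, 0.0, 1.0, 2.0],
              [0.0, 0.0, 0.0, 0.0, 1.0, 4.0, 1.0]])

rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank

# The right singular vectors belonging to zero singular values
# span the null space; there are exactly `nullity` of them.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]               # rows spanning the null space
print(rank, nullity)                 # 3 4
print(np.allclose(A @ null_basis.T, 0))  # True: each row solves A x = 0
```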

Proof
Neither the rank nor the null space of $A$ is changed by elementary row operations, so we can assume that $A$ is in reduced echelon form. The rank of $A$ is then the number of nonzero rows. Proposition ?? states that the null space is spanned by $k$ vectors, where $k = n - \operatorname{rank}(A)$. We must show that these vectors are linearly independent.

Let $c_1 < \cdots < c_k$ be the columns of $A$ that do not contain pivots; in example (??), $k = 4$. After solving for the variables corresponding to pivots, we find that the spanning set of the null space consists of $k$ vectors in $\mathbb{R}^n$, which we label as $w_1, \ldots, w_k$; see (??). Note that the $c_j$th entry of $w_j$ is $1$, while the $c_j$th entry of each of the other vectors $w_i$ ($i \ne j$) is $0$; again, see (??) as an example that supports this statement. It follows that the set of spanning vectors is a linearly independent set. That is, suppose that
$$r_1 w_1 + \cdots + r_k w_k = 0.$$
From the $c_j$th entry in this equation, it follows that $r_j = 0$ for each $j$, and hence the vectors are linearly independent.

Theorem ?? has an interesting and useful interpretation. We have seen in the previous subsection that the rank of a matrix $A$ is just the number of linearly independent rows of $A$. In linear systems each row of the coefficient matrix corresponds to a linear equation. Thus, the rank of $A$ may be thought of as the number of independent equations in a system of linear equations, and the theorem states that the space of solutions loses one dimension for each independent equation.
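This "one dimension per independent equation" picture can be watched directly. In the NumPy sketch below, the three equations (rows) are made-up and independent, and each one added cuts the nullity by exactly one:

```python
import numpy as np

# Three made-up, linearly independent equations in five unknowns.
rows = [np.array([1.0, 1.0, 0.0, 0.0, 0.0]),
        np.array([0.0, 1.0, 1.0, 0.0, 0.0]),
        np.array([0.0, 0.0, 1.0, 1.0, 0.0])]

n = 5
for k in range(1, 4):
    A = np.vstack(rows[:k])
    rank = np.linalg.matrix_rank(A)
    # nullity = n - rank drops by one with each independent equation
    print(k, n - rank)  # prints: 1 4, then 2 3, then 3 2
```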

Exercises

Show that where is a basis for .
Let where Find the dimension of and find a basis for .
Find a basis for the null space of What is the dimension of the null space of ?
Show that the set $M(2,2)$ of all $2 \times 2$ matrices is a vector space. Show that the dimension of $M(2,2)$ is four by finding a basis of $M(2,2)$ with four elements. Show that the space $M(m,n)$ of all $m \times n$ matrices is also a vector space. What is $\dim M(m,n)$?
Show that the set $\mathcal{P}_n$ of all polynomials of degree less than or equal to $n$ is a subspace of $\mathcal{P}$. What is $\dim \mathcal{P}_2$? What is $\dim \mathcal{P}_n$?
Let $\mathcal{P}_3$ be the vector space of polynomials of degree at most three in one variable $t$. Let $p(t) = t^3 + a_2 t^2 + a_1 t + a_0$, where $a_0, a_1, a_2$ are fixed constants. Show that $\left\{p, \frac{dp}{dt}, \frac{d^2p}{dt^2}, \frac{d^3p}{dt^3}\right\}$ is a basis for $\mathcal{P}_3$.
Let $u \in \mathbb{R}^n$ be a nonzero row vector.
  • Show that the $n \times n$ matrix $A = u^t u$ is symmetric and that $\operatorname{rank}(A) = 1$. Hint: Begin by showing that $Av^t = 0$ for every vector $v$ that is perpendicular to $u$ and that $Au^t$ is a nonzero multiple of $u^t$.
  • Show that the matrix $P = I_n + u^t u$ is invertible. Hint: Show that $\operatorname{rank}(P) = n$.