Bases and Dimension of Abstract Vector Spaces
When working with $\mathbb{R}^n$ and subspaces of $\mathbb{R}^n$, we developed several fundamental ideas, including span, linear independence, bases, and dimension. We will find that these concepts generalize naturally to abstract vector spaces and that analogous results hold in these new settings.
Linear Independence
Let $V$ be a vector space. A set of vectors $\{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$ in $V$ is linearly independent if the only solution to $a_1\mathbf{v}_1 + \cdots + a_n\mathbf{v}_n = \mathbf{0}$ is the trivial solution $a_1 = \cdots = a_n = 0$. If, in addition to the trivial solution, a non-trivial solution (not all $a_i$ are zero) exists, then we say that the set is linearly dependent.
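To make the definition concrete, here is a small numerical sketch using a hypothetical set of polynomials (not one from this section): identifying each polynomial $a + bx + cx^2$ in $\mathbb{P}^2$ with its coefficient vector in $\mathbb{R}^3$, the set is linearly independent exactly when the matrix of coefficient vectors has full rank.

```python
import numpy as np

# Identify each polynomial a + bx + cx^2 in P^2 with its coefficient
# vector (a, b, c) in R^3.  The polynomials below are a hypothetical
# example: 1 + x, x + x^2, and 1 + x^2.
polys = np.array([
    [1, 1, 0],   # 1 + x
    [0, 1, 1],   # x + x^2
    [1, 0, 1],   # 1 + x^2
])

# The set is linearly independent exactly when the only solution of
# a1*p1 + a2*p2 + a3*p3 = 0 is the trivial one, i.e. when the matrix
# whose columns are the coefficient vectors has full rank.
rank = np.linalg.matrix_rank(polys.T)
print(rank == polys.shape[0])  # True: the three polynomials are independent
```

A dependent set, such as $\{1+x,\; x+x^2,\; 1+2x+x^2\}$, would produce a rank smaller than the number of vectors.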
Bases and Dimension
Recall that our motivation for defining a basis of a subspace of $\mathbb{R}^n$ was to have a collection of vectors such that every vector of the subspace can be expressed as a unique linear combination of the vectors in that collection. The definition of a basis (def:basis) generalizes to abstract vector spaces as follows.
- Proof
- By the definition of a basis, we know that every $\mathbf{v}$ in $V$ can be written as a linear combination of $\mathbf{v}_1, \ldots, \mathbf{v}_n$. Suppose there are two such representations. Then $$\mathbf{v} = a_1\mathbf{v}_1 + \cdots + a_n\mathbf{v}_n = b_1\mathbf{v}_1 + \cdots + b_n\mathbf{v}_n.$$ But then we have: $$(a_1 - b_1)\mathbf{v}_1 + \cdots + (a_n - b_n)\mathbf{v}_n = \mathbf{0}.$$ Because $\mathbf{v}_1, \ldots, \mathbf{v}_n$ are linearly independent, we have $a_i - b_i = 0$ for $1 \le i \le n$. Consequently $a_i = b_i$ for $1 \le i \le n$.
In Bases and Dimension we defined the dimension of a subspace of $\mathbb{R}^n$ to be the number of elements in a basis (Definition def:dimension). We will adopt this definition for abstract vector spaces. As before, to ensure that dimension is well-defined we need to establish that this definition is independent of our choice of basis. The proof of the following theorem is identical to the proof of its counterpart in $\mathbb{R}^n$ (Theorem th:dimwelldefined).
Now we can state the definition. In our discussions up to this point, we have always assumed that a basis is nonempty and hence that the dimension of the space is at least $1$. However, the zero space $\{\mathbf{0}\}$ has no basis. To accommodate this, we will say that the zero vector space is defined to have dimension $0$:
Our insistence that $\dim \{\mathbf{0}\} = 0$ amounts to saying that the empty set of vectors is a basis of $\{\mathbf{0}\}$. Thus the statement that “the dimension of a vector space is the number of vectors in any basis” holds even for the zero space.
It was shown in Example ex:centralizerofA of Abstract Vector Spaces that the centralizer $C(A)$, the set of matrices that commute with $A$, is a subspace for any choice of the square matrix $A$.
For the matrix $A$ of that example, describe the elements of $C(A)$ and find a basis of $C(A)$.
Let $B$ denote the proposed set of matrices. The set $B$ is linearly independent (see Practice Problem prob:CABlinind), and every element of $C(A)$ can be written as a linear combination of the elements of $B$. Thus $C(A) = \mathrm{span}(B)$. Therefore $B$ is a basis of $C(A)$, and $\dim C(A)$ is the number of matrices in $B$.
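The same kind of computation can be sketched numerically. The block below uses a hypothetical matrix $A$ (not the one from the example) and finds $\dim C(A)$ by viewing $XA = AX$ as a homogeneous linear system in the entries of $X$:

```python
import sympy as sp

# A hypothetical 2x2 matrix, chosen only for illustration.
A = sp.Matrix([[1, 1], [0, 1]])

# A generic 2x2 matrix X with symbolic entries.
entries = sp.symbols('x11 x12 x21 x22')
X = sp.Matrix(2, 2, entries)

# C(A) = {X : XA = AX} is the solution set of a homogeneous linear
# system in the entries of X; extract its coefficient matrix.
eqs = list(X * A - A * X)                       # four linear expressions
M = sp.Matrix([[sp.diff(e, v) for v in entries] for e in eqs])

# By rank-nullity, dim C(A) = (number of unknowns) - rank of the system.
dim_CA = 4 - M.rank()
print(dim_CA)  # 2 for this A: C(A) is spanned by I and A
```

Solving the system by hand for this $A$ gives $x_{21} = 0$ and $x_{11} = x_{22}$, so $C(A) = \mathrm{span}\{I, A\}$, matching the computed dimension.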
Let $W$ be the subspace of the vector space of $n\times n$ matrices consisting of all symmetric matrices. Find the dimension of $W$.
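One way to approach exercises of this kind is to write down a standard basis and count its elements. The sketch below (an illustration for general $n$, with hypothetical helper names) builds the matrices $E_{ii}$ and $E_{ij} + E_{ji}$, which form a basis of the symmetric $n\times n$ matrices:

```python
import numpy as np
from itertools import combinations_with_replacement

def symmetric_basis(n):
    """Standard basis of the symmetric n x n matrices:
    E_ii on the diagonal, and E_ij + E_ji for i < j."""
    basis = []
    for i, j in combinations_with_replacement(range(n), 2):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        E[j, i] = 1.0   # same entry when i == j
        basis.append(E)
    return basis

# Each basis matrix is symmetric, the set is linearly independent
# (distinct matrices occupy disjoint entry positions), and every
# symmetric matrix is a combination of them, so dim W = n(n+1)/2.
for n in (2, 3):
    B = symmetric_basis(n)
    print(n, len(B), n * (n + 1) // 2)
```

For $n = 2$ this counts $3$ basis matrices, and for $n = 3$ it counts $6$, in agreement with the formula $n(n+1)/2$.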
Finite-Dimensional Vector Spaces
Our definition of the dimension of a vector space depends on the space having a basis. In this section we will establish that any vector space spanned by finitely many vectors has a basis.
Given a finite-dimensional vector space $V$, we will find a basis for $V$ by starting with a linearly independent subset of $V$ and expanding it to a basis. The following results are more general versions of Lemmas lemma:atmostnlinindinrn and lemma:expandinglinindset, and Theorem th:dimwelldefined of Bases and Dimension. The proofs are identical, and we will omit them.
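The expansion idea is constructive, and it can be sketched computationally for subspaces of $\mathbb{R}^n$: keep adjoining vectors from a spanning set, retaining only those that increase the rank. (The function name and vectors below are illustrative, not from the text.)

```python
import numpy as np

def extend_to_basis(indep, spanning):
    """Greedily extend a linearly independent list of vectors in R^n
    (stored as 1-D arrays) to a basis, drawing candidates from a
    spanning set.  A sketch of the expansion lemma."""
    basis = list(indep)
    for v in spanning:
        trial = np.vstack(basis + [v]) if basis else np.atleast_2d(v)
        # Keep v only if it is not already in the span of `basis`.
        if np.linalg.matrix_rank(trial) > len(basis):
            basis.append(v)
    return basis

# Start from one independent vector in R^3 and extend it using the
# standard spanning set e1, e2, e3.
start = [np.array([1.0, 1.0, 0.0])]
full = extend_to_basis(start, list(np.eye(3)))
print(len(full))  # 3: the set has been extended to a basis of R^3
```

Here $e_2$ is skipped because it already lies in the span of $(1,1,0)$ and $e_1$, illustrating how the procedure discards redundant vectors.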
Coordinate Vectors
Recall that in the context of $\mathbb{R}^n$ (and subspaces of $\mathbb{R}^n$) the requirement that elements of a basis be linearly independent guarantees that every element of the vector space has a unique representation in terms of the elements of the basis (see Theorem th:linindbasis of Introduction to Bases). We proved the same property for abstract vector spaces in Theorem th:uniquerep.
Uniqueness of representation in terms of the elements of a basis allows us to associate every element of a vector space with a unique coordinate vector with respect to a given basis. Coordinate vectors were first introduced in Introduction to Bases. We now give a formal definition.
Next, we need to show that $B$ spans the space. To this end, we consider a generic element of the space and attempt to express it as a linear combination of the elements of $B$. Setting the coefficients of like terms equal to each other gives us a linear system in the unknowns $a$, $b$, and $c$. Solving this linear system for $a$, $b$, and $c$ gives us a unique solution. (You should verify this.) This shows that every element of the space can be written as a linear combination of elements of $B$. Therefore $B$ is a basis.
To find the coordinate vector for $\mathbf{v}$ with respect to $B$ we need to express $\mathbf{v}$ as a linear combination of the elements of $B$. Fortunately, we have already done all the necessary work: solving the system above gives us the coefficients $a$, $b$, and $c$ of the linear combination. Writing $\mathbf{v}$ as this linear combination, the coordinate vector for $\mathbf{v}$ with respect to $B$ is the column vector whose entries are these coefficients, in the order dictated by the ordered basis.
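The coordinate-vector computation amounts to solving a linear system, which can be sketched numerically. The basis and polynomial below are hypothetical, not the ones from the worked example:

```python
import numpy as np

# Identify a + bx + cx^2 with (a, b, c) in R^3.  A hypothetical
# ordered basis of P^2, stored as the columns of a matrix:
B = np.array([
    [1, 0, 0],   # 1
    [1, 1, 0],   # 1 + x
    [1, 1, 1],   # 1 + x + x^2
]).T             # transpose so each basis vector is a column

p = np.array([2, 3, 1])          # the polynomial 2 + 3x + x^2

# The coordinate vector [p]_B solves B @ coords = p; the solution is
# unique because the columns of B are linearly independent.
coords = np.linalg.solve(B, p)
print(coords)  # p = -1*(1) + 2*(1 + x) + 1*(1 + x + x^2)
```

Checking the combination by hand, $-1 + 2(1+x) + (1+x+x^2) = 2 + 3x + x^2$, as required.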
- (a)
- Find the coordinate vector for $\mathbf{v}$ with respect to the ordered basis $B$.
- (b)
- Let $B'$ be another ordered basis for the same space. Find the coordinate vector for $\mathbf{v}$ with respect to $B'$.
Coordinate vectors will play a vital role in establishing one of the most fundamental results in linear algebra: that all $n$-dimensional vector spaces have the same structure as $\mathbb{R}^n$. In Example ex:p2isor3 of Isomorphic Vector Spaces, for instance, we will show that $\mathbb{P}^2$ is essentially the same as $\mathbb{R}^3$.
Practice Problems
Problems prob:linindabstractvsp1-prob:linindabstractvsp3
Show that each of the following sets of vectors is linearly independent.
Problems prob:coordvectors1-prob:coordvectors2
Find the coordinate vector for $\mathbf{v}$ with respect to the given ordered basis $B$.
Text Source
The discussion of the zero space was adapted from Section 6.3 of Keith Nicholson’s Linear Algebra with Applications. (CC-BY-NC-SA)
W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition, p. 349
Example Source
Examples ex:polyindset and ex:CAbasis were adapted from Examples 6.3.1 and 6.3.10 of Keith Nicholson’s Linear Algebra with Applications. (CC-BY-NC-SA)
W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition, p. 346, 350
Exercise Source
Practice Problems prob:linindabstractvsp1, prob:linindabstractvsp2 and prob:linindabstractvsp3 are Exercises 6.3(a)(b)(c) from Keith Nicholson’s Linear Algebra with Applications. (CC-BY-NC-SA)
W. Keith Nicholson, Linear Algebra with Applications, Lyryx 2018, Open Edition, p. 351