A linear transformation can be represented as multiplication by a matrix.

Suppose $T : \mathbb{R}^n \to \mathbb{R}^m$ is given by $T(\mathbf{x}) = A\mathbf{x}$ for some real $m \times n$ matrix $A$. Then it follows immediately from the properties of matrix algebra that $T$ is a linear transformation: $A(\mathbf{x} + \mathbf{y}) = A\mathbf{x} + A\mathbf{y}$ and $A(c\mathbf{x}) = c\,A\mathbf{x}$ for all $\mathbf{x}, \mathbf{y} \in \mathbb{R}^n$ and all scalars $c$.

Conversely, suppose the linear transformation $T : \mathbb{R}^n \to \mathbb{R}^m$ is given. Define the matrix $A$ by
$$A = \bigl[\, T(\mathbf{e}_1) \mid T(\mathbf{e}_2) \mid \cdots \mid T(\mathbf{e}_n) \,\bigr],$$
that is, the matrix with $j$-th column $T(\mathbf{e}_j)$. Then by construction $A\mathbf{e}_j = T(\mathbf{e}_j)$, so that $\mathbf{x} \mapsto A\mathbf{x}$ and $T$ are two linear transformations which agree on a basis for $\mathbb{R}^n$, which by the previous corollary implies $A\mathbf{x} = T(\mathbf{x})$ for all $\mathbf{x} \in \mathbb{R}^n$. Because of this, the matrix $A$ is referred to as a matrix representation of $T$. Note that this representation is with respect to the standard bases for $\mathbb{R}^n$ and $\mathbb{R}^m$.
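As a concrete numerical sketch of this construction (the map $T$ below is a hypothetical example, not one from the text), the matrix $A$ can be assembled column by column from the values $T(\mathbf{e}_j)$:

```python
import numpy as np

# Hypothetical example: T(x, y) = (x + 2y, 3x - y, y), a linear map R^2 -> R^3.
def T(x):
    return np.array([x[0] + 2 * x[1], 3 * x[0] - x[1], x[1]])

# Build the matrix representation column by column: column j is T(e_j).
n = 2
A = np.column_stack([T(e) for e in np.eye(n)])

# By construction, A @ x agrees with T(x) for every x.
x = np.array([5.0, -4.0])
assert np.allclose(A @ x, T(x))
```

Because $T$ is linear, agreement on the standard basis vectors forces agreement everywhere, which is exactly why sampling $T$ at the $\mathbf{e}_j$ suffices.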

We now see that the same type of representation applies to arbitrary vector spaces once bases have been fixed for both the domain and the target. In other words, given

  • A vector space $V$ with basis $\mathcal{B} = \{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$,
  • a vector space $W$ with basis $\mathcal{C} = \{\mathbf{w}_1, \ldots, \mathbf{w}_m\}$, and
  • a linear transformation $T : V \to W$,

we could ask if there is a similar representation of $T$ in terms of a matrix (which depends on these two choices of bases). The answer is “yes”: the matrix $M$ whose $j$-th column is the coordinate vector $[T(\mathbf{v}_j)]_{\mathcal{C}}$ satisfies $[T(\mathbf{v})]_{\mathcal{C}} = M\,[\mathbf{v}]_{\mathcal{B}}$ for every $\mathbf{v} \in V$.

Proof
Again by the above corollary it suffices to verify the equality $[T(\mathbf{v}_j)]_{\mathcal{C}} = M\,[\mathbf{v}_j]_{\mathcal{B}}$ for the basis vectors $\mathbf{v}_j \in \mathcal{B}$. But $[\mathbf{v}_j]_{\mathcal{B}}$ is the coordinate vector $\mathbf{e}_j$, identical to the $j$-th standard basis vector for $\mathbb{R}^n$. From this we get $M\,[\mathbf{v}_j]_{\mathcal{B}} = M\mathbf{e}_j = [T(\mathbf{v}_j)]_{\mathcal{C}}$, the $j$-th column of $M$, completing the proof.
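To make the theorem concrete, here is a small numerical sketch; the differentiation map $D$ and the monomial bases below are hypothetical choices for illustration, not examples taken from the text:

```python
import numpy as np

# Hypothetical example: differentiation D : P_2 -> P_1, where P_k denotes the
# space of polynomials of degree <= k.  Coordinates are taken with respect to
# the bases B = {1, x, x^2} for P_2 and C = {1, x} for P_1.

# Column j of M is the C-coordinate vector of D applied to the j-th basis
# polynomial in B: D(1) = 0, D(x) = 1, D(x^2) = 2x.
M = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 2.0]])

# Check [D(p)]_C = M [p]_B for p = 3 + 5x + 7x^2, whose derivative is 5 + 14x.
p_B = np.array([3.0, 5.0, 7.0])   # coordinates of p in basis B
Dp_C = np.array([5.0, 14.0])      # coordinates of D(p) in basis C
assert np.allclose(M @ p_B, Dp_C)
```

The point of the proof carries over directly: each column of $M$ is determined by where $D$ sends one basis polynomial, and linearity does the rest.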

Returning once more to the general case where $T : V \to W$ is linear, $\mathcal{B} = \{\mathbf{v}_1, \ldots, \mathbf{v}_n\}$ a basis for $V$, and $\mathcal{C} = \{\mathbf{w}_1, \ldots, \mathbf{w}_m\}$ a basis for $W$, we note that the bases $\mathcal{B}$ and $\mathcal{C}$ can be used to identify the kernel and image of $T$. Precisely, we have
$$\ker(T) = \{\mathbf{v} \in V : [\mathbf{v}]_{\mathcal{B}} \in \operatorname{null}(M)\} \quad\text{and}\quad \operatorname{im}(T) = \{\mathbf{w} \in W : [\mathbf{w}]_{\mathcal{C}} \in \operatorname{col}(M)\},$$
where $M$ is the matrix representation of $T$ with respect to $\mathcal{B}$ and $\mathcal{C}$.

This theorem tells us that, once we have fixed bases $\mathcal{B}$ for $V$ and $\mathcal{C}$ for $W$, the representation of $T$ by the matrix $M$ further identifies i) the kernel of $T$ with the nullspace of $M$ and ii) the image of $T$ with the column space of $M$.
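Continuing with a hypothetical example (the differentiation map $D : P_2 \to P_1$ with monomial bases), the identification of kernel with nullspace and image with column space can be sketched using SymPy:

```python
from sympy import Matrix, symbols

x = symbols('x')

# Hypothetical example: M represents differentiation D : P_2 -> P_1 with
# respect to the bases B = {1, x, x^2} and C = {1, x}.
M = Matrix([[0, 1, 0],
            [0, 0, 2]])

B = [1, x, x**2]  # basis for the domain P_2
C = [1, x]        # basis for the target P_1

# Nullspace vectors of M are B-coordinate vectors of kernel elements, and
# column-space vectors are C-coordinate vectors of image elements; translate
# each coordinate vector back into an actual polynomial.
kernel_polys = [sum(c * b for c, b in zip(v, B)) for v in M.nullspace()]
image_polys = [sum(c * w for c, w in zip(v, C)) for v in M.columnspace()]

print(kernel_polys)  # kernel of D: spanned by the constant polynomial 1
print(image_polys)   # image of D: spans all of P_1
```

Note the final translation step: the nullspace and column space live in the coordinate spaces $\mathbb{R}^3$ and $\mathbb{R}^2$, and the bases are what carry those vectors back to actual polynomials.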

Suppose $T$ is the linear transformation whose representation in the standard basis is the matrix given. Following the method in the above example, write down a minimal spanning set for $\ker(T)$ and for $\operatorname{im}(T)$ (the elements should be vectors in the original space, i.e., polynomials, not their coordinate representations).