Note to Student: In this section we will often use $U$, $V$ and $W$ to denote subspaces of $\mathbb{R}^n$, or any other finite-dimensional vector space, such as those we study in Vector Spaces.
Composition and Inverses of Linear Transformations
Composition of Linear Transformations
(You should be able to verify that both transformations are linear.) Examine the effect of $S\circ T$ on vectors of $\mathbb{R}^2$.
This means that $S\circ T$ maps all vectors of $\mathbb{R}^2$ to $\vec{0}$.
In addition to the computational approach, it is also useful to visualize what happens geometrically.
First, observe that every vector in the image of $T$ is a scalar multiple of a single fixed vector. Therefore the image of any vector of $\mathbb{R}^2$ under $T$ lies on the line determined by that vector.
Even though $S$ is defined on all of $\mathbb{R}^2$, we are only interested in the action of $S$ on vectors along this line. Our computations showed that all such vectors map to $\vec{0}$.
The actions of the individual transformations, as well as the composite transformation, are shown below.
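To make this behavior concrete, here is one illustrative pair of transformations (chosen for this discussion, not necessarily the pair pictured in the figures). Let $T:\mathbb{R}^2\rightarrow\mathbb{R}^2$ and $S:\mathbb{R}^2\rightarrow\mathbb{R}^2$ be given by \begin{align*} T\left(\begin{bmatrix}x\\y\end{bmatrix}\right)=\begin{bmatrix}x+y\\x+y\end{bmatrix},\qquad S\left(\begin{bmatrix}x\\y\end{bmatrix}\right)=\begin{bmatrix}x-y\\x-y\end{bmatrix} \end{align*} The image of $T$ lies on the line determined by $\begin{bmatrix}1\\1\end{bmatrix}$, and $S$ maps every vector on that line to $\vec{0}$, so \begin{align*} (S\circ T)\left(\begin{bmatrix}x\\y\end{bmatrix}\right)=S\left(\begin{bmatrix}x+y\\x+y\end{bmatrix}\right)=\begin{bmatrix}(x+y)-(x+y)\\(x+y)-(x+y)\end{bmatrix}=\vec{0} \end{align*}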
- Proof
- Let $T:V\rightarrow W$ and $S:W\rightarrow U$ be linear transformations. We will show that $S\circ T$ is linear. For all vectors $\vec{u}_1$ and $\vec{u}_2$ of $V$ and scalars $a$ and $b$ we have: \begin{align*} (S\circ T)(a\vec{u}_1+b\vec{u}_2)&=S(T(a\vec{u}_1+b\vec{u}_2))\\ &=S(aT(\vec{u}_1)+bT(\vec{u}_2))\\ &=aS(T(\vec{u}_1))+bS(T(\vec{u}_2))\\ &=a(S\circ T)(\vec{u}_1)+b(S\circ T)(\vec{u}_2) \end{align*}
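To see the theorem in action, consider the following transformations (introduced here purely as an illustration). Let $T:\mathbb{R}^2\rightarrow\mathbb{R}^2$ and $S:\mathbb{R}^2\rightarrow\mathbb{R}^2$ be defined by $T\left(\begin{bmatrix}x\\y\end{bmatrix}\right)=\begin{bmatrix}x+y\\x-y\end{bmatrix}$ and $S\left(\begin{bmatrix}x\\y\end{bmatrix}\right)=\begin{bmatrix}2x\\3y\end{bmatrix}$. Then \begin{align*} (S\circ T)\left(\begin{bmatrix}x\\y\end{bmatrix}\right)=S\left(\begin{bmatrix}x+y\\x-y\end{bmatrix}\right)=\begin{bmatrix}2x+2y\\3x-3y\end{bmatrix} \end{align*} and each component of the result is again a linear expression in $x$ and $y$, as the theorem guarantees.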
For linear transformations $T$, $S$ and $R$ for which the compositions are defined, we have $(R\circ S)\circ T=R\circ (S\circ T)$.
- Proof
- For all $\vec{u}$ in the domain of $T$ we have: \begin{align*} ((R\circ S)\circ T)(\vec{u})&=(R\circ S)(T(\vec{u}))=R(S(T(\vec{u})))\\ &=R((S\circ T)(\vec{u}))=(R\circ (S\circ T))(\vec{u}) \end{align*}
Composition and Matrix Multiplication
In this section we will consider linear transformations of $\mathbb{R}^n$ and their standard matrices.
- Proof
- For all $\vec{x}$ in the domain of $T$ we have: \begin{align*} (S\circ T)(\vec{x})&=S(T(\vec{x}))=S(M_T\vec{x})\\ &=M_S(M_T\vec{x})=(M_SM_T)\vec{x} \end{align*} This shows that $M_SM_T$ is the standard matrix of $S\circ T$.
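As an illustration (with the matrices below chosen purely as an example), suppose $T$ and $S$ are the matrix transformations of $\mathbb{R}^2$ with standard matrices \begin{align*} M_T=\begin{bmatrix}1&2\\0&1\end{bmatrix},\qquad M_S=\begin{bmatrix}0&1\\1&0\end{bmatrix} \end{align*} Composing directly gives $(S\circ T)\left(\begin{bmatrix}x\\y\end{bmatrix}\right)=S\left(\begin{bmatrix}x+2y\\y\end{bmatrix}\right)=\begin{bmatrix}y\\x+2y\end{bmatrix}$, which is exactly the transformation induced by the product \begin{align*} M_SM_T=\begin{bmatrix}0&1\\1&0\end{bmatrix}\begin{bmatrix}1&2\\0&1\end{bmatrix}=\begin{bmatrix}0&1\\1&2\end{bmatrix} \end{align*}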
We conclude this section by revisiting the associative property of matrix multiplication. At the time matrix multiplication was introduced, we skipped the cumbersome proof that for appropriately sized matrices $A$, $B$ and $C$, we have $(AB)C=A(BC)$. (See Theorem th:propertiesofmatrixmultiplication.) We are now in a position to prove this result with ease.
Every matrix induces a linear transformation. The product of two matrices can be interpreted as a composition of transformations. Since function composition is associative, so is matrix multiplication. We formalize this observation as a theorem.
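As a sketch of the argument (with $A$, $B$ and $C$ denoting arbitrary matrices of compatible sizes): the preceding theorem tells us that $(MN)\vec{x}=M(N\vec{x})$ for any compatible matrices $M$ and $N$ and any vector $\vec{x}$, since $MN$ is the standard matrix of the composition of the transformations induced by $M$ and $N$. Applying this fact repeatedly gives \begin{align*} ((AB)C)\vec{x}=(AB)(C\vec{x})=A(B(C\vec{x}))=A((BC)\vec{x})=(A(BC))\vec{x} \end{align*} Since the matrices $(AB)C$ and $A(BC)$ agree on every vector $\vec{x}$, they are equal.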
Inverses of Linear Transformations
We leave it to the reader to verify that both compositions are identity transformations, so that these two transformations are inverses of each other.
Linearity of Inverses of Linear Transformations
Definition def:inverseoflintrans does not specifically require an inverse of a linear transformation to be linear, but it turns out that the requirement that $T^{-1}\circ T$ and $T\circ T^{-1}$ be identity transformations is sufficient to guarantee that $T^{-1}$ is linear.
- Proof
- The proof of this result is left to the reader. (See Practice Problem prob:inverseislinear.)
Linear Transformations of $\mathbb{R}^n$ and the Standard Matrix of the Inverse Transformation
Every linear transformation $T:\mathbb{R}^n\rightarrow\mathbb{R}^m$ is a matrix transformation. (See Theorem th:matlin.) If $T$ has an inverse $T^{-1}$, then by Theorem th:inverseislinear, $T^{-1}$ is also a matrix transformation. Let $M_T$ and $M_{T^{-1}}$ denote the standard matrices of $T$ and $T^{-1}$, respectively. We see that $T^{-1}\circ T=\mathrm{id}$ and $T\circ T^{-1}=\mathrm{id}$ if and only if $M_{T^{-1}}M_T=I$ and $M_TM_{T^{-1}}=I$. In other words, $T$ and $T^{-1}$ are inverse transformations if and only if $M_T$ and $M_{T^{-1}}$ are matrix inverses.
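For instance (with the matrix below chosen only for illustration), if $T:\mathbb{R}^2\rightarrow\mathbb{R}^2$ is the matrix transformation with standard matrix $M_T=\begin{bmatrix}1&1\\0&1\end{bmatrix}$, then the transformation induced by $\begin{bmatrix}1&-1\\0&1\end{bmatrix}$ undoes $T$, because \begin{align*} \begin{bmatrix}1&1\\0&1\end{bmatrix}\begin{bmatrix}1&-1\\0&1\end{bmatrix}=\begin{bmatrix}1&-1\\0&1\end{bmatrix}\begin{bmatrix}1&1\\0&1\end{bmatrix}=\begin{bmatrix}1&0\\0&1\end{bmatrix} \end{align*} so both compositions of the corresponding transformations are the identity.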
Note that if $T^{-1}$ is an inverse of $T$, then $M_T$ and $M_{T^{-1}}$ are square matrices, and $n=m$.
- Proof
- Part item:exists follows directly from the preceding discussion. Part item:unique follows from uniqueness of matrix inverses. (Theorem th:matinverseunique.)
Please note that Theorem th:existunique is only applicable in the context of linear transformations of $\mathbb{R}^n$ and their standard matrices.
Geometrically speaking, the domain of $T$ is a plane in $\mathbb{R}^3$ and its codomain is $\mathbb{R}^2$.
Does $T$ have an inverse? We are not in a position to answer this question right now because Theorem th:existunique does not apply to this situation.
Practice Problems