Introduction to Linear Transformations

We start by reviewing the definition of a function.

In algebra and calculus you worked with functions whose domain and codomain were each the set of all real numbers. In linear algebra, we call our functions transformations. The domain and codomain of a transformation are vector spaces.

In this Exploration we will introduce a very special type of transformation by contrasting the effects of two transformations on vectors of $\mathbb{R}^2$. We will see that some transformations have “nice” properties, while others do not. Define $T_1$ and $T_2$ as follows, where $T_1$ is given by \begin{align*} T_1\left (\begin{bmatrix} x\\ y \end{bmatrix}\right )=\begin{bmatrix} x-y\\ x \end{bmatrix} \end{align*}

Each of these transformations takes a vector in $\mathbb{R}^2$ and maps it to another vector in $\mathbb{R}^2$. To check that you understand how these transformations are defined, determine what each of them does to a vector of your choice.

Now, let’s take a vector and multiply it by a scalar.

Now let’s compare how $T_1$ and $T_2$ “handle” this product. Starting with $T_1$, we compute:

Observe that multiplying the original vector by the scalar and then applying $T_1$ has the same effect as applying $T_1$ to the original vector and then multiplying the image by the same scalar.

Diagrammatically, this can be represented as follows.

You should try to verify that this property does not hold for transformation $T_2$. In other words, there exist a scalar $k$ and a vector $\vec{u}$ for which \begin{align*} kT_2(\vec{u}) \neq T_2(k\vec{u}). \end{align*}

There is nothing special about the scalar we chose, and it is not hard to prove that for any scalar $k$ and any vector $\vec{u}$ of $\mathbb{R}^2$, $T_1$ satisfies \begin{align} \label{lin1} kT_1(\vec{u})= T_1(k\vec{u}). \end{align}

It turns out that $T_1$ satisfies another important property. For all vectors $\vec{u}$ and $\vec{v}$ of $\mathbb{R}^2$ we have: \begin{align} \label{lin2} T_1(\vec{u}+\vec{v}) = T_1(\vec{u})+T_1(\vec{v}) \end{align}

We leave it to the reader to illustrate this property with a specific example (see Practice Problem prob:sum). We will show that $T_1$ satisfies (lin2) in general.

Let $\vec{u}=\begin{bmatrix} u_1\\ u_2 \end{bmatrix}$ and $\vec{v}=\begin{bmatrix} v_1\\ v_2 \end{bmatrix}$. Then \begin{align*} T_1(\vec{u}+\vec{v})&=T_1\left (\begin{bmatrix} u_1\\ u_2 \end{bmatrix}+\begin{bmatrix} v_1\\ v_2 \end{bmatrix}\right )=T_1\left (\begin{bmatrix} u_1+v_1\\ u_2+v_2 \end{bmatrix}\right )=\begin{bmatrix} u_1+v_1-u_2-v_2\\ u_1+v_1 \end{bmatrix}\\ &=\begin{bmatrix} u_1-u_2\\ u_1 \end{bmatrix}+\begin{bmatrix} v_1-v_2\\ v_1 \end{bmatrix}=T_1\left (\begin{bmatrix} u_1\\ u_2 \end{bmatrix}\right )+T_1\left (\begin{bmatrix} v_1\\ v_2 \end{bmatrix}\right )\\ &=T_1(\vec{u})+T_1(\vec{v}) \end{align*}
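As a numerical sanity check (not a substitute for the proof), the short Python sketch below encodes the rule for $T_1$ used in the computation above and tests both the scalar property (lin1) and the sum property (lin2); the particular vectors and scalar are arbitrary choices, not taken from the text.

```python
# Rule for T1 taken from the computation above: (x, y) maps to (x - y, x).
def T1(v):
    x, y = v
    return (x - y, x)

def scale(k, v):
    """Scalar multiple k*v, componentwise."""
    return tuple(k * c for c in v)

def add(u, v):
    """Vector sum u + v, componentwise."""
    return tuple(a + b for a, b in zip(u, v))

# Arbitrary sample data (our choice).
u, v, k = (3, 1), (-2, 5), 4

# (lin1): k * T1(u) == T1(k * u)
assert scale(k, T1(u)) == T1(scale(k, u))

# (lin2): T1(u + v) == T1(u) + T1(v)
assert T1(add(u, v)) == add(T1(u), T1(v))
```

Changing `u`, `v`, and `k` leaves both assertions true, as the general proof guarantees.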

It turns out that $T_2$ fails to satisfy this property. Can you prove that this is the case? Remember that to prove that a property DOES NOT hold, it suffices to find a counter-example. See if you can find vectors $\vec{u}$ and $\vec{v}$ such that \begin{align} \label{t2}T_2(\vec{u}+\vec{v}) \neq T_2(\vec{u})+T_2(\vec{v}). \end{align}

(See Practice Problem prob:prob2.)

Transformations satisfying (lin1) and (lin2), like $T_1$, are called linear transformations. Transformations like $T_2$ are not linear. You have already encountered several linear transformations in the form of matrix transformations in sections Matrix Transformations and Geometric Transformations of the Plane.

Equations (eq:lintrans1) and (eq:lintrans2) of the above definition can be illustrated diagrammatically as follows.

In Exploration init:lintransintro we introduced a transformation $T_2$ which turned out to be non-linear. It took some work to show that $T_2$ is not linear. The following theorem would have made our work easier.

Proof
To prove part item:zerotozero, let $\vec{u}$ be any vector in the domain of $T$. By linearity of $T$, we have: \begin{align*} T(\vec{0})=T(0\vec{u})=0T(\vec{u})=\vec{0}. \end{align*}

Part item:linetoline will become evident after we prove Theorem th:matlin in Standard Matrix of a Linear Transformation and combine it with Practice Problem prb:linesToLines.

Linear Transformations Induced by Matrices

Recall that a transformation $T$ defined by $T(\vec{x})=A\vec{x}$, where $A$ is a matrix, is called a matrix transformation (or the transformation induced by $A$). As we discovered in Matrix Transformations, all matrix transformations are linear. We now formalize this result as a theorem.

Proof
Let $\vec{u}$ and $\vec{v}$ be vectors in $\mathbb{R}^n$, and let $k$ be a scalar. By properties of matrix multiplication we have: \begin{align*} T(\vec{u}+\vec{v})&=A(\vec{u}+\vec{v})=A\vec{u}+A\vec{v}=T(\vec{u})+T(\vec{v}),\\ T(k\vec{u})&=A(k\vec{u})=k(A\vec{u})=kT(\vec{u}). \end{align*} Therefore $T$ is a linear transformation.
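To illustrate the theorem concretely, here is a hedged NumPy sketch; the matrix $A$ and the vectors are arbitrary choices, and the two checks mirror the two steps of the proof.

```python
import numpy as np

# An arbitrary 2x3 matrix: T(x) = A @ x maps R^3 to R^2.
A = np.array([[1.0, -2.0, 0.0],
              [3.0,  1.0, 5.0]])

def T(x):
    """The matrix transformation induced by A."""
    return A @ x

u = np.array([1.0, 4.0, -2.0])
v = np.array([0.0, 3.0, 7.0])
k = -2.5

# Additivity: T(u + v) == T(u) + T(v), since A(u + v) = Au + Av.
assert np.allclose(T(u + v), T(u) + T(v))

# Homogeneity: T(k u) == k T(u), since A(ku) = k(Au).
assert np.allclose(T(k * u), k * T(u))
```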

Linear Transformations of Subspaces of

Definition def:lin defines a linear transformation as a map from $\mathbb{R}^n$ into $\mathbb{R}^m$. We will now make this definition more general by allowing the domain and the codomain of the transformation to be subspaces of $\mathbb{R}^n$ and $\mathbb{R}^m$. Eventually, a linear transformation will be defined as a mapping between vector spaces.

We conclude this section by introducing two simple but important transformations.

Proof
Left to the reader. (See Practice Problem prob:idtrans)

Proof
Left to the reader. (See Practice Problem prob:zerotrans)

Practice Problems

Show that (lin2) of Exploration init:lintransintro holds for two specific vectors $\vec{u}$ and $\vec{v}$ of your choice.

Use a counter-example to prove (t2) of Exploration init:lintransintro.
Suppose is a linear transformation such that and . Find the image of .

Let $\vec{v}$ be a fixed vector. Define $T$ by $T(\vec{x})=\vec{x}+\vec{v}$.
(a)
Describe the effect of this transformation by sketching $\vec{x}$ and $T(\vec{x})$ for at least four vectors $\vec{x}$ and a fixed vector $\vec{v}$ of your choice.
(b)
Is $T$ a linear transformation?
Define $T:\mathbb{R}^3\rightarrow\mathbb{R}^3$ by $T\left (\begin{bmatrix} x\\ y\\ z \end{bmatrix}\right )=\begin{bmatrix} x\\ y\\ 0 \end{bmatrix}$. This transformation is called an orthogonal projection onto the $xy$-plane. Show that $T$ is a linear transformation.
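For intuition only (the problem asks for a proof), the sketch below spot-checks both linearity properties numerically, assuming the projection is given by $T(x, y, z) = (x, y, 0)$; the sample vectors and scalar are arbitrary choices.

```python
def proj_xy(v):
    """Orthogonal projection onto the xy-plane: drop the z-component."""
    x, y, z = v
    return (x, y, 0)

def scale(k, v):
    return tuple(k * c for c in v)

def add(u, v):
    return tuple(a + b for a, b in zip(u, v))

# Arbitrary sample vectors and scalar (our choice).
u, v, k = (1, 2, 3), (4, -1, 6), 3

# Additivity and homogeneity hold for these samples.
assert proj_xy(add(u, v)) == add(proj_xy(u), proj_xy(v))
assert proj_xy(scale(k, u)) == scale(k, proj_xy(u))
```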
Suppose a linear transformation $T$ maps to , to , and to . Find the image of under $T$.

Prove Theorem th:idlintrans
Prove Theorem th:zerolintrans

Problems prob:domaincodomain1-prob:domaincodomain2

For each matrix $A$ below, find the domain and the codomain of the linear transformation $T$ induced by $A$; then find and draw the image of the given vector. (Hint: See Example ex:lineartrans3.)

Domain: , where .

Codomain: , where .
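As a general pattern for these problems, the shape of $A$ determines the answer: an $m \times n$ matrix induces a transformation from $\mathbb{R}^n$ to $\mathbb{R}^m$. The sketch below uses a hypothetical $3 \times 2$ matrix for illustration, not one of the matrices from the problem.

```python
import numpy as np

# Hypothetical 3x2 matrix (for illustration only).
A = np.array([[1, 0],
              [2, -1],
              [0, 4]])

m, n = A.shape  # m rows, n columns

# T(x) = A @ x accepts vectors of length n and returns vectors of length m,
# so the domain is R^n and the codomain is R^m.
print(f"domain: R^{n}, codomain: R^{m}")  # prints: domain: R^2, codomain: R^3

x = np.array([1, 1])
assert (A @ x).shape == (m,)
```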

Exercise Source

Practice Problem prob:notlinear is adapted from Problem 5.1.3 of Ken Kuttler’s A First Course in Linear Algebra. (CC-BY)

Ken Kuttler, A First Course in Linear Algebra, Lyryx 2017, Open Edition, p. 272.
