$\newenvironment {prompt}{}{} \newcommand {\ungraded }[0]{} \newcommand {\inputGif }[1]{The gif ‘‘#1" would be inserted here when publishing online. } \newcommand {\todo }[0]{} \newcommand {\oiint }[0]{{\large \bigcirc }\kern -1.56em\iint } \newcommand {\mooculus }[0]{\textsf {\textbf {MOOC}\textnormal {\textsf {ULUS}}}} \newcommand {\npnoround }[0]{\nprounddigits {-1}} \newcommand {\npnoroundexp }[0]{\nproundexpdigits {-1}} \newcommand {\npunitcommand }[1]{\ensuremath {\mathrm {#1}}} \newcommand {\RR }[0]{\mathbb R} \newcommand {\R }[0]{\mathbb R} \newcommand {\N }[0]{\mathbb N} \newcommand {\Z }[0]{\mathbb Z} \newcommand {\sagemath }[0]{\textsf {SageMath}} \newcommand {\d }[0]{\mathop {}\!d} \newcommand {\l }[0]{\ell } \newcommand {\ddx }[0]{\frac {d}{\d x}} \newcommand {\zeroOverZero }[0]{\ensuremath {\boldsymbol {\tfrac {0}{0}}}} \newcommand {\inftyOverInfty }[0]{\ensuremath {\boldsymbol {\tfrac {\infty }{\infty }}}} \newcommand {\zeroOverInfty }[0]{\ensuremath {\boldsymbol {\tfrac {0}{\infty }}}} \newcommand {\zeroTimesInfty }[0]{\ensuremath {\small \boldsymbol {0\cdot \infty }}} \newcommand {\inftyMinusInfty }[0]{\ensuremath {\small \boldsymbol {\infty -\infty }}} \newcommand {\oneToInfty }[0]{\ensuremath {\boldsymbol {1^\infty }}} \newcommand {\zeroToZero }[0]{\ensuremath {\boldsymbol {0^0}}} \newcommand {\inftyToZero }[0]{\ensuremath {\boldsymbol {\infty ^0}}} \newcommand {\numOverZero }[0]{\ensuremath {\boldsymbol {\tfrac {\#}{0}}}} \newcommand {\dfn }[0]{\textbf } \newcommand {\unit }[0]{\mathop {}\!\mathrm } \newcommand {\eval }[1]{\bigg [ #1 \bigg ]} \newcommand {\seq }[1]{\left ( #1 \right )} \newcommand {\epsilon }[0]{\varepsilon } \newcommand {\phi }[0]{\varphi } \newcommand {\iff }[0]{\Leftrightarrow } \DeclareMathOperator {\arccot }{arccot} \DeclareMathOperator {\arcsec }{arcsec} \DeclareMathOperator {\arccsc }{arccsc} \DeclareMathOperator {\si }{Si} \DeclareMathOperator {\scal }{scal} \DeclareMathOperator {\sign }{sign} \newcommand {\arrowvec 
}[1]{{\overset {\rightharpoonup }{#1}}} \newcommand {\vec }[1]{{\overset {\boldsymbol {\rightharpoonup }}{\mathbf {#1}}}} \newcommand {\point }[1]{\left (#1\right )} \newcommand {\pt }[1]{\mathbf {#1}} \newcommand {\Lim }[2]{\lim _{\point {#1} \to \point {#2}}} \DeclareMathOperator {\proj }{\mathbf {proj}} \newcommand {\veci }[0]{{\boldsymbol {\hat {\imath }}}} \newcommand {\vecj }[0]{{\boldsymbol {\hat {\jmath }}}} \newcommand {\veck }[0]{{\boldsymbol {\hat {k}}}} \newcommand {\vecl }[0]{\vec {\boldsymbol {\l }}} \newcommand {\uvec }[1]{\mathbf {\hat {#1}}} \newcommand {\utan }[0]{\mathbf {\hat {t}}} \newcommand {\unormal }[0]{\mathbf {\hat {n}}} \newcommand {\ubinormal }[0]{\mathbf {\hat {b}}} \newcommand {\dotp }[0]{\bullet } \newcommand {\cross }[0]{\boldsymbol \times } \newcommand {\grad }[0]{\boldsymbol \nabla } \newcommand {\divergence }[0]{\grad \dotp } \newcommand {\curl }[0]{\grad \cross } \newcommand {\lto }[0]{\mathop {\longrightarrow \,}\limits } \newcommand {\bar }[0]{\overline } \newcommand {\surfaceColor }[0]{violet} \newcommand {\surfaceColorTwo }[0]{redyellow} \newcommand {\sliceColor }[0]{greenyellow} \newcommand {\vector }[1]{\left \langle #1\right \rangle } \newcommand {\sectionOutcomes }[0]{} \newcommand {\HyperFirstAtBeginDocument }[0]{\AtBeginDocument }$

The dot product measures how aligned two vectors are with each other.

### The definition of the dot product

We have already seen how to add vectors and how to multiply vectors by scalars.

In this section we will define a way to “multiply” two vectors called the dot product. The dot product measures how “aligned” two vectors are with each other.

The first thing you should notice about the dot product is that it takes two vectors and produces a scalar, not another vector.
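
As a minimal computational sketch of this (the helper name `dot` is ours, and we assume the usual component formula, where the dot product of two vectors of the same dimension is the sum of the products of corresponding components):

```python
def dot(v, w):
    """Dot product: the sum of products of corresponding components (a scalar)."""
    if len(v) != len(w):
        raise ValueError("vectors must have the same dimension")
    return sum(a * b for a, b in zip(v, w))

# The result is a single number, not a vector:
print(dot((1, 2, 3), (4, 5, 6)))  # 1*4 + 2*5 + 3*6 = 32
```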

Let $\vec {u},\vec {v},\vec {w}$ be nonzero vectors in $\R ^3$. Which of the following expressions make sense?
- $(\vec {w} \dotp \vec {u} ) \vec {u}$
- $5(\vec {u} +\vec {w}) \dotp {\vec {u}}$
- $\vec {w} / \vec {u}$
- $\vector {2,3} \dotp \vector {4,2} + 7$
- $\vec {w} / ( \vec {u} \dotp \vec {u})$
- $\vector {1,3} \dotp \vector {-1,2,5}$
- $\vec {u}\dotp \vec {v}+\vec {w}$
Think about which terms/factors are vectors and which terms/factors are scalars.
Which of the following are vectors?
- $(\vec {w} \dotp \vec {u} ) \vec {u}$
- $5(\vec {u} +\vec {w}) \dotp {\vec {u}}$
- $\vector {2,3} \dotp \vector {4,2} + 7$
- $\vec {w} / ( \vec {u} \dotp \vec {u})$

The dot product allows us to write some complicated formulas more simply. For example, the magnitude of a vector can be written with the dot product as $|\vec {v}| = \sqrt {\vec {v} \dotp \vec {v}}$.

Compute the magnitude of the vector $\vec {v} = \vector {1,2,3,4}$.
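
Since $\vec {v} \dotp \vec {v}$ is the sum of the squares of the components, the magnitude is $\sqrt {\vec {v} \dotp \vec {v}}$. A minimal Python sketch (the helper names `dot` and `magnitude` are ours):

```python
import math

def dot(v, w):
    # Sum of products of corresponding components.
    return sum(a * b for a, b in zip(v, w))

def magnitude(v):
    # |v| = sqrt(v . v)
    return math.sqrt(dot(v, v))

print(magnitude((3, 4)))  # sqrt(9 + 16) = 5.0
```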

### The geometry of the dot product

Let’s see if we can figure out what the dot product tells us geometrically. As an appetizer, we recall the Law of Cosines: if a triangle has side lengths $a$, $b$, and $c$, and $\theta$ is the angle between the sides of lengths $a$ and $b$, then $c^2 = a^2 + b^2 - 2ab\cos (\theta )$.

When $\theta = \pi /2$ what does the law of cosines say?
- It is the Pythagorean theorem.
- It is the law of sines.
- It is undefined.

We can rephrase the Law of Cosines in the language of vectors. The vectors $\vec {v}$, $\vec {w}$, and $\vec {v} - \vec {w}$ form a triangle, so if $\theta$ is the angle between $\vec {v}$ and $\vec {w}$ we must have
$$|\vec {v} - \vec {w}|^2 = |\vec {v}|^2 + |\vec {w}|^2 - 2|\vec {v}||\vec {w}|\cos (\theta ).$$

The theorem above, which states that $\vec {v} \dotp \vec {w} = |\vec {v}||\vec {w}|\cos (\theta )$, tells us some interesting things about the angle between two (nonzero) vectors.

We have a special buzz-word for when the dot product is zero: two vectors $\vec {v}$ and $\vec {w}$ are called orthogonal when $\vec {v} \dotp \vec {w} = 0$.

From this we see that two nonzero vectors are orthogonal exactly when the angle between them is $\pi /2$. Moreover, if the dot product is not zero, the formula allows us to compute the angle between these vectors via
$$\theta = \arccos \left (\frac {\vec {v} \dotp \vec {w}}{|\vec {v}||\vec {w}|}\right ),$$
where $0\le \theta \le \pi$.
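
Assuming the formula $\vec {v} \dotp \vec {w} = |\vec {v}||\vec {w}|\cos (\theta )$, the angle can be computed numerically. A sketch in Python (the function names are ours):

```python
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def angle_between(v, w):
    # theta = arccos( v . w / (|v| |w|) ), with 0 <= theta <= pi
    return math.acos(dot(v, w) / (math.sqrt(dot(v, v)) * math.sqrt(dot(w, w))))

print(angle_between((1, 0), (0, 1)))  # pi/2: orthogonal vectors
print(angle_between((1, 0), (1, 1)))  # pi/4
```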

Find the angle between the vectors.

Find all unit vectors orthogonal to:

Write your vectors in the order of increasing $x$-components.

### Projections and components

#### Projections

One of the major uses of the dot product is to let us project one vector in the direction of another. Conceptually, we are looking at the “shadow” of one vector projected onto another, sort of like in the case of a sundial.

In essence we imagine the “sun” directly over a vector, casting a shadow onto another vector.

While this is a good starting point for understanding orthogonal projections, we now need the definition.

Consider the vector $\vec {v}=\vector {3,2,1}$ and the vector $\veci = \vector {1,0,0}$. Compute $\proj _\veci (\vec {v})$.
Draw a picture.
Let $\vec {v} = \vector {1,1}$ and $\vec {w}=\vector {-1,1}$. Compute $\proj _\vec {w}(\vec {v})$.
Draw a picture.

To compute the projection of one vector along another, we use the dot product:
$$\proj _{\vec {w}}(\vec {v}) = \left (\frac {\vec {v} \dotp \vec {w}}{\vec {w} \dotp \vec {w}}\right ) \vec {w}.$$
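
A sketch of this computation in Python, using the standard formula $\proj _{\vec {w}}(\vec {v}) = \left (\frac {\vec {v} \dotp \vec {w}}{\vec {w} \dotp \vec {w}}\right )\vec {w}$ (the helper `proj(w, v)` is ours and mirrors the notation $\proj _{\vec {w}}(\vec {v})$):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def proj(w, v):
    """Projection of v in the direction of w: (v.w / w.w) * w."""
    c = dot(v, w) / dot(w, w)
    return tuple(c * wi for wi in w)

# The "shadow" of (5, 2) cast straight down onto the x-axis:
print(proj((1, 0), (5, 2)))  # (5.0, 0.0)
```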

Find the projection of the vector $\vec {v} = \vector {2,3,1}$ in the direction of the vector $\vec {w} = \vector {3,-1,1}$.
Let $\vec {v}$ and $\vec {w}$ be nonzero vectors in $\R ^2$. Let $k\ge 1$. Select all statements that must be true.
- $\proj _{\vec {w}}(\vec {v})=\proj _{\vec {v}}(\vec {w})$
- $|\proj _{\vec {w}}(\vec {v})|\le |\vec {v}|$
- $|\proj _{\vec {w}}(\vec {v})|\le |\vec {w}|$
- $|\proj _{\vec {w}}(\vec {v})|\le |\proj _{\vec {w}}(k\cdot \vec {v})|$
- $|\proj _{\vec {w}}(k\cdot \vec {v})|\le |\proj _{\vec {w}}(\vec {v})|$
- $|\proj _{\vec {w}}(\vec {v})|\le |\proj _{k\cdot \vec {w}}(\vec {v})|$
- $|\proj _{k\cdot \vec {w}}(\vec {v})|\le |\proj _{\vec {w}}(\vec {v})|$

#### Components

Scalar components compute “how much” of a vector is pointing in a particular direction.

Let $\vec {v} = \vector {3,-2,1}$. Compute $\scal _\veci (\vec {v})$.
Compute $\scal _\vecj (\vec {v})$.
Compute $\scal _\veck (\vec {v})$.
To compute the scalar component of a vector in the direction of another, you use the dot product:
$$\scal _{\vec {w}}(\vec {v}) = \frac {\vec {v} \dotp \vec {w}}{|\vec {w}|}.$$
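
A sketch of this in Python, using the standard formula $\scal _{\vec {w}}(\vec {v}) = \frac {\vec {v} \dotp \vec {w}}{|\vec {w}|}$ (the helper `scal(w, v)` is ours and mirrors $\scal _{\vec {w}}(\vec {v})$):

```python
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def scal(w, v):
    # Scalar component of v in the direction of w: (v . w) / |w|
    return dot(v, w) / math.sqrt(dot(w, w))

# "How much" of (4, 5) points in the y-direction:
print(scal((0, 1), (4, 5)))  # 5.0
```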
Let $\vec {v}$ and $\vec {w}$ be nonzero vectors and let $\theta$ be the angle between them. Which of the following are true?
- $|\proj _\vec {w}(\vec {v})| = \scal _{\vec {w}}(\vec {v})$
- $|\proj _\vec {w}(\vec {v})| = |\scal _{\vec {w}}(\vec {v})|$
- $\proj _\vec {w}(\vec {v}) =|\vec {v}|\cos (\theta )\left (\frac {\vec {w}}{|\vec {w}|}\right )$
- $\scal _\vec {w}(\vec {v}) = |\vec {v}|\cos (\theta )$

#### Orthogonal decomposition

Given any vector $\vec {v}$ in $\R ^2$, we can always write it as $\vec {v} = a\veci + b\vecj$ for some real numbers $a$ and $b$. Here we’ve broken $\vec {v}$ into the sum of two orthogonal vectors — in particular, vectors parallel to $\veci$ and $\vecj$. In fact, given a vector $\vec {v}$ and another vector $\vec {w}$ you can always break $\vec {v}$ into a sum of two vectors, one of which is parallel to $\vec {w}$ and another that is perpendicular to $\vec {w}$. Such a sum is called an orthogonal decomposition. Move the point around to see various orthogonal decompositions of vector $\vec {v}$.
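
The decomposition can be sketched computationally: the parallel part is the projection of $\vec {v}$ onto $\vec {w}$, and the perpendicular part is what remains (the helper names below are ours):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def decompose(v, w):
    """Split v into a part parallel to w and a part perpendicular to w."""
    c = dot(v, w) / dot(w, w)
    parallel = tuple(c * wi for wi in w)
    perpendicular = tuple(vi - pi for vi, pi in zip(v, parallel))
    return parallel, perpendicular

par, perp = decompose((2, 3), (1, 0))
print(par, perp)          # (2.0, 0.0) (0.0, 3.0)
print(dot(perp, (1, 0)))  # 0.0: the perpendicular part is orthogonal to w
```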

Let $\vec u = \vector {-2,1}$ and $\vec v = \vector {3,1}$. What is the orthogonal decomposition of $\vec {u}$ in terms of $\vec {v}$?
Let $\vec w =\vector {2,1,3}$ and $\vec x =\vector { 1,1,1}$. What is the orthogonal decomposition of $\vec {w}$ in terms of $\vec {x}$?

Now we give an example where this decomposition is useful.

### The algebra of the dot product

We summarize the arithmetic and algebraic properties of the dot product: for vectors $\vec {u}$, $\vec {v}$, $\vec {w}$ and any scalar $c$,

- $\vec {u} \dotp \vec {v} = \vec {v} \dotp \vec {u}$ (commutativity),
- $\vec {u} \dotp (\vec {v} + \vec {w}) = \vec {u} \dotp \vec {v} + \vec {u} \dotp \vec {w}$ (distributivity),
- $(c\vec {u}) \dotp \vec {v} = c(\vec {u} \dotp \vec {v})$ (compatibility with scalar multiplication),
- $\vec {v} \dotp \vec {v} = |\vec {v}|^2 \ge 0$, with equality only for the zero vector (positivity).
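
These properties — commutativity, distributivity over addition, and compatibility with scalar multiplication — can be spot-checked numerically. A sketch in Python on sample vectors (all names are ours):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

u, v, w, c = (1, 2), (3, -1), (0, 5), 4

# Commutativity: u . v = v . u
assert dot(u, v) == dot(v, u)

# Distributivity: u . (v + w) = u . v + u . w
v_plus_w = tuple(a + b for a, b in zip(v, w))
assert dot(u, v_plus_w) == dot(u, v) + dot(u, w)

# Scalar compatibility: (c u) . v = c (u . v)
cu = tuple(c * a for a in u)
assert dot(cu, v) == c * dot(u, v)

print("all properties hold on these samples")
```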

Instead of defining the dot product by a formula, we could have defined it by the properties above! While this is common practice in mathematics, the process is a bit abstract and is perhaps beyond the scope of this course. Nevertheless, we know that you are an intrepid young mathematician, and we will not hold back. We will now show that there is only one formula which gives us all of these properties, and it will be our formula for the dot product.