$\newenvironment {prompt}{}{} \newcommand {\ungraded }{} \newcommand {\todo }{} \newcommand {\oiint }{{\large \bigcirc }\kern -1.56em\iint } \newcommand {\mooculus }{\textsf {\textbf {MOOC}\textnormal {\textsf {ULUS}}}} \newcommand {\npnoround }{\nprounddigits {-1}} \newcommand {\npnoroundexp }{\nproundexpdigits {-1}} \newcommand {\npunitcommand }{\ensuremath {\mathrm {#1}}} \newcommand {\RR }{\mathbb R} \newcommand {\R }{\mathbb R} \newcommand {\N }{\mathbb N} \newcommand {\Z }{\mathbb Z} \newcommand {\sagemath }{\textsf {SageMath}} \newcommand {\d }{\mathop {}\!d} \newcommand {\l }{\ell } \newcommand {\ddx }{\frac {d}{\d x}} \newcommand {\zeroOverZero }{\ensuremath {\boldsymbol {\tfrac {0}{0}}}} \newcommand {\inftyOverInfty }{\ensuremath {\boldsymbol {\tfrac {\infty }{\infty }}}} \newcommand {\zeroOverInfty }{\ensuremath {\boldsymbol {\tfrac {0}{\infty }}}} \newcommand {\zeroTimesInfty }{\ensuremath {\small \boldsymbol {0\cdot \infty }}} \newcommand {\inftyMinusInfty }{\ensuremath {\small \boldsymbol {\infty -\infty }}} \newcommand {\oneToInfty }{\ensuremath {\boldsymbol {1^\infty }}} \newcommand {\zeroToZero }{\ensuremath {\boldsymbol {0^0}}} \newcommand {\inftyToZero }{\ensuremath {\boldsymbol {\infty ^0}}} \newcommand {\numOverZero }{\ensuremath {\boldsymbol {\tfrac {\#}{0}}}} \newcommand {\dfn }{\textbf } \newcommand {\unit }{\mathop {}\!\mathrm } \newcommand {\eval }{\bigg [ #1 \bigg ]} \newcommand {\seq }{\left ( #1 \right )} \newcommand {\epsilon }{\varepsilon } \newcommand {\phi }{\varphi } \newcommand {\iff }{\Leftrightarrow } \DeclareMathOperator {\arccot }{arccot} \DeclareMathOperator {\arcsec }{arcsec} \DeclareMathOperator {\arccsc }{arccsc} \DeclareMathOperator {\si }{Si} \DeclareMathOperator {\scal }{scal} \DeclareMathOperator {\sign }{sign} \newcommand {\arrowvec }{{\overset {\rightharpoonup }{#1}}} \newcommand {\vec }{{\overset {\boldsymbol {\rightharpoonup }}{\mathbf {#1}}}\hspace {0in}} \newcommand {\point }{\left (#1\right )} \newcommand {\pt }{\mathbf {#1}} \newcommand {\Lim }{\lim _{\point {#1} \to \point {#2}}} \DeclareMathOperator {\proj }{\mathbf {proj}} \newcommand {\veci }{{\boldsymbol {\hat {\imath }}}} \newcommand {\vecj }{{\boldsymbol {\hat {\jmath }}}} \newcommand {\veck }{{\boldsymbol {\hat {k}}}} \newcommand {\vecl }{\vec {\boldsymbol {\l }}} \newcommand {\uvec }{\mathbf {\hat {#1}}} \newcommand {\utan }{\mathbf {\hat {t}}} \newcommand {\unormal }{\mathbf {\hat {n}}} \newcommand {\ubinormal }{\mathbf {\hat {b}}} \newcommand {\dotp }{\bullet } \newcommand {\cross }{\boldsymbol \times } \newcommand {\grad }{\boldsymbol \nabla } \newcommand {\divergence }{\grad \dotp } \newcommand {\curl }{\grad \cross } \newcommand {\lto }{\mathop {\longrightarrow \,}\limits } \newcommand {\bar }{\overline } \newcommand {\surfaceColor }{violet} \newcommand {\surfaceColorTwo }{redyellow} \newcommand {\sliceColor }{greenyellow} \newcommand {\vector }{\left \langle #1\right \rangle } \newcommand {\sectionOutcomes }{} \newcommand {\HyperFirstAtBeginDocument }{\AtBeginDocument }$

Projections tell us how much of one vector lies in the direction of another and are important in physical applications.

### Projections and components

#### Projections

One of the major uses of the dot product is to let us project one vector in the direction of another. Conceptually, we are looking at the “shadow” of one vector projected onto another, much as a sundial casts a shadow.

In essence we imagine the “sun” directly over a vector, casting a shadow onto another vector.

While this is a good starting point for understanding orthogonal projections, we now need a precise definition.
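For a nonzero vector $\vec{w}$, the projection of $\vec{v}$ onto $\vec{w}$ is the vector
\[
\proj_{\vec{w}}(\vec{v}) = \left(\frac{\vec{v}\dotp\vec{w}}{\vec{w}\dotp\vec{w}}\right)\vec{w},
\]
which points along $\vec{w}$ and whose length is $|\vec{v}|\,|\cos(\theta)|$, where $\theta$ is the angle between $\vec{v}$ and $\vec{w}$.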

Consider the vector $\vec {v}=\vector {3,2,1}$ and the vector $\veci = \vector {1,0,0}$. Compute $\proj _\veci (\vec {v})$.
Draw a picture.
Let $\vec {v} = \vector {1,1}$ and $\vec {w}=\vector {-1,1}$. Compute $\proj _\vec {w}(\vec {v})$.
Draw a picture.

To compute the projection of one vector along another, we use the dot product.
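For instance, taking $\vec{v}=\vector{1,2}$ and $\vec{w}=\vector{3,0}$ (vectors chosen just for illustration),
\[
\proj_{\vec{w}}(\vec{v}) = \left(\frac{\vec{v}\dotp\vec{w}}{\vec{w}\dotp\vec{w}}\right)\vec{w} = \frac{(1)(3)+(2)(0)}{3^2+0^2}\,\vector{3,0} = \frac{3}{9}\vector{3,0} = \vector{1,0},
\]
which is exactly the “shadow” that $\vec{v}$ casts on the $x$-axis.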

Find the projection of the vector $\vec {v} = \vector {2,3,1}$ in the direction of the vector $\vec {w} = \vector {3,-1,1}$.
Let $\vec {v}$ and $\vec {w}$ be nonzero vectors in $\R ^2$. Let $k\ge 1$. Select all statements that must be true.
- $\proj_{\vec{w}}(\vec{v})=\proj_{\vec{v}}(\vec{w})$
- $|\proj_{\vec{w}}(\vec{v})|\le |\vec{v}|$
- $|\proj_{\vec{w}}(\vec{v})|\le |\vec{w}|$
- $|\proj_{\vec{w}}(\vec{v})|\le |\proj_{\vec{w}}(k\cdot \vec{v})|$
- $|\proj_{\vec{w}}(k\cdot \vec{v})|\le |\proj_{\vec{w}}(\vec{v})|$
- $|\proj_{\vec{w}}(\vec{v})|\le |\proj_{k\cdot \vec{w}}(\vec{v})|$
- $|\proj_{k\cdot \vec{w}}(\vec{v})|\le |\proj_{\vec{w}}(\vec{v})|$

#### Components

Scalar components compute “how much” of a vector is pointing in a particular direction.
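For a nonzero vector $\vec{w}$, the scalar component of $\vec{v}$ in the direction of $\vec{w}$ is the number
\[
\scal_{\vec{w}}(\vec{v}) = \frac{\vec{v}\dotp\vec{w}}{|\vec{w}|} = |\vec{v}|\cos(\theta),
\]
where $\theta$ is the angle between the two vectors. Unlike the projection, which is a vector, the scalar component is a signed number: it is positive when $\vec{v}$ points roughly along $\vec{w}$ and negative when it points roughly against $\vec{w}$.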

Let $\vec {v} = \vector {3,-2,1}$. Compute $\scal _\veci (\vec {v})$.
Compute $\scal _\vecj (\vec {v})$.
Compute $\scal _\veck (\vec {v})$.
To compute the scalar component of a vector in the direction of another, we again use the dot product.
Let $\vec {v}$ and $\vec {w}$ be nonzero vectors and let $\theta$ be the angle between them. Which of the following are true?
- $|\proj_{\vec{w}}(\vec{v})| = \scal_{\vec{w}}(\vec{v})$
- $|\proj_{\vec{w}}(\vec{v})| = |\scal_{\vec{w}}(\vec{v})|$
- $\proj_{\vec{w}}(\vec{v}) = |\vec{v}|\cos(\theta)\left(\frac{\vec{w}}{|\vec{w}|}\right)$
- $\scal_{\vec{w}}(\vec{v}) = |\vec{v}|\cos(\theta)$

#### Orthogonal decomposition

Given any vector $\vec{v}$ in $\R^2$, we can always write it as $\vec{v} = a\,\veci + b\,\vecj$ for some real numbers $a$ and $b$. Here we’ve broken $\vec{v}$ into the sum of two orthogonal vectors, namely vectors parallel to $\veci$ and $\vecj$. In fact, given a vector $\vec{v}$ and another vector $\vec{w}$, you can always break $\vec{v}$ into a sum of two vectors, one of which is parallel to $\vec{w}$ and the other of which is perpendicular to $\vec{w}$. Such a sum is called an orthogonal decomposition.
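Concretely, the projection supplies both pieces of the decomposition:
\[
\vec{v} = \underbrace{\proj_{\vec{w}}(\vec{v})}_{\text{parallel to }\vec{w}} + \underbrace{\left(\vec{v} - \proj_{\vec{w}}(\vec{v})\right)}_{\text{perpendicular to }\vec{w}},
\]
and a quick computation with the dot product,
\[
\left(\vec{v} - \proj_{\vec{w}}(\vec{v})\right)\dotp\vec{w} = \vec{v}\dotp\vec{w} - \frac{\vec{v}\dotp\vec{w}}{\vec{w}\dotp\vec{w}}\left(\vec{w}\dotp\vec{w}\right) = 0,
\]
confirms that the second piece really is perpendicular to $\vec{w}$.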

Let $\vec u = \vector {-2,1}$ and $\vec v = \vector {3,1}$. What is the orthogonal decomposition of $\vec {u}$ in terms of $\vec {v}$?
Let $\vec w =\vector {2,1,3}$ and $\vec x =\vector { 1,1,1}$. What is the orthogonal decomposition of $\vec {w}$ in terms of $\vec {x}$?
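As a worked illustration (with vectors chosen just for this example), decompose $\vec{a}=\vector{3,1}$ in terms of $\vec{b}=\vector{1,1}$:
\[
\proj_{\vec{b}}(\vec{a}) = \frac{(3)(1)+(1)(1)}{1^2+1^2}\,\vector{1,1} = \vector{2,2},
\qquad
\vec{a} - \proj_{\vec{b}}(\vec{a}) = \vector{1,-1},
\]
so $\vec{a} = \vector{2,2} + \vector{1,-1}$, where $\vector{2,2}$ is parallel to $\vec{b}$ and $\vector{1,-1}\dotp\vector{1,1} = 0$.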

We conclude this section with a physical example where orthogonal decomposition is useful.