The dot product measures how aligned two vectors are with each other.

### The definition of the dot product

We have already seen how to add vectors and how to multiply vectors by scalars.


In this section we will define a way to “multiply” two vectors called the *dot
product*. The dot product measures how “aligned” two vectors are with each
other.

Given two vectors $\vec{u} = (u_1, u_2, \dots, u_n)$ and $\vec{v} = (v_1, v_2, \dots, v_n)$, their **dot product** is
$$\vec{u}\cdot\vec{v} = u_1v_1 + u_2v_2 + \cdots + u_nv_n.$$

The first thing you should notice about the dot product is that it combines two vectors to produce a *scalar*, not another vector.

The dot product allows us to write some complicated formulas more simply.
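As a quick numerical sketch (ours, not part of the original text), the component formula $\vec{u}\cdot\vec{v} = u_1v_1 + \cdots + u_nv_n$ translates directly into a few lines of Python; the helper name `dot` is our own choice:

```python
def dot(u, v):
    """Dot product of two vectors given as sequences of components."""
    assert len(u) == len(v), "vectors must have the same dimension"
    return sum(ui * vi for ui, vi in zip(u, v))

# The dot product of two vectors is a scalar, not a vector.
print(dot([1, 2, 3], [4, -5, 6]))  # 1*4 + 2*(-5) + 3*6 = 12
```

Note that the result is a single number, matching the observation above that the dot product of two vectors is a scalar.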

### The geometry of the dot product

Let’s see if we can figure out what the dot product tells us geometrically. As an appetizer, we recall the Law of Cosines: in a triangle with side lengths $a$, $b$, and $c$, where $\theta$ is the angle opposite the side of length $c$,
$$c^2 = a^2 + b^2 - 2ab\cos(\theta).$$

We can rephrase the Law of Cosines in the language of vectors. The vectors $\vec{u}$, $\vec{v}$, and $\vec{u}-\vec{v}$ form a triangle, and applying the Law of Cosines to this triangle yields
$$\vec{u}\cdot\vec{v} = |\vec{u}||\vec{v}|\cos(\theta),$$
where $\theta$ is the angle between $\vec{u}$ and $\vec{v}$.

The theorem above tells us some interesting things about the angle between two (nonzero) vectors.

We have a special buzz-word for when the dot product is zero.

Two vectors $\vec{u}$ and $\vec{v}$ are **orthogonal** if the dot product of these vectors is zero. Geometrically, this means that the angle between the vectors is $\pi/2$, or that the angle is undefined because at least one of the vectors is the zero vector. Note, this also means that the zero vector is orthogonal to all vectors.

From this we see that the dot product of two vectors is zero if those vectors are orthogonal. Moreover, if both vectors are nonzero, using the formula $\vec{u}\cdot\vec{v} = |\vec{u}||\vec{v}|\cos(\theta)$ allows us to compute the angle between these vectors via
$$\theta = \arccos\left(\frac{\vec{u}\cdot\vec{v}}{|\vec{u}||\vec{v}|}\right),$$
where $0 \le \theta \le \pi$.

### Projections and components

#### Projections

One of the major uses of the dot product is to let us *project* one vector in
the direction of another. Conceptually, we are looking at the “shadow” of
one vector projected onto another, sort of like in the case of a sundial.

While this is a good starting point for understanding orthogonal projections, we now need a precise definition.

The **orthogonal projection** of a vector $\vec{u}$ in the direction of a vector $\vec{v}$ is a new vector denoted $\text{proj}_{\vec{v}}\vec{u}$.

To compute the projection of one vector along another, we use the dot product. Since $\vec{u}\cdot\vec{v} = |\vec{u}||\vec{v}|\cos(\theta)$, the signed length of the shadow of $\vec{u}$ along $\vec{v}$ is $|\vec{u}|\cos(\theta) = \frac{\vec{u}\cdot\vec{v}}{|\vec{v}|}$, and scaling the unit vector $\frac{\vec{v}}{|\vec{v}|}$ by this length gives
$$\text{proj}_{\vec{v}}\vec{u} = \left(\frac{\vec{u}\cdot\vec{v}}{|\vec{v}|^2}\right)\vec{v}.$$

Notice that the sign of this length is the sign of the cosine, so we simply remove the absolute value from the cosine: when $\cos(\theta)$ is negative, the projection points opposite $\vec{v}$.
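The projection formula $\text{proj}_{\vec{v}}\vec{u} = \big(\tfrac{\vec{u}\cdot\vec{v}}{|\vec{v}|^2}\big)\vec{v}$ can be sketched in a few lines of Python (our illustration; the name `proj` is ours):

```python
def proj(u, v):
    """Orthogonal projection of u onto the direction of v (v nonzero)."""
    dot_uv = sum(ui * vi for ui, vi in zip(u, v))
    dot_vv = sum(vi * vi for vi in v)
    scale = dot_uv / dot_vv  # (u . v) / |v|^2
    return [scale * vi for vi in v]

print(proj([3, 4], [1, 0]))  # [3.0, 0.0]: the shadow of (3, 4) on the x-axis
```

Note that using $\vec{v}\cdot\vec{v}$ in the denominator avoids taking a square root, since $|\vec{v}|^2 = \vec{v}\cdot\vec{v}$.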

#### Components

Scalar components compute “how much” of a vector is pointing in a particular direction.

The **scalar component** of a vector $\vec{u}$ in the direction of $\vec{v}$ is denoted $\text{comp}_{\vec{v}}\vec{u}$ and is given by
$$\text{comp}_{\vec{v}}\vec{u} = \frac{\vec{u}\cdot\vec{v}}{|\vec{v}|}.$$
It is the signed length of the projection of $\vec{u}$ onto $\vec{v}$.
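As a short numerical sketch (ours, not from the text), the scalar component $\text{comp}_{\vec{v}}\vec{u} = \tfrac{\vec{u}\cdot\vec{v}}{|\vec{v}|}$ is a single signed number, and its sign records direction; the helper name `comp` is our own:

```python
import math

def comp(u, v):
    """Scalar component of u in the direction of v (v nonzero)."""
    dot_uv = sum(ui * vi for ui, vi in zip(u, v))
    norm_v = math.sqrt(sum(vi * vi for vi in v))
    return dot_uv / norm_v

print(comp([3, 4], [1, 0]))   # 3.0: (3, 4) extends 3 units in the x-direction
print(comp([3, 4], [-1, 0]))  # -3.0: negative, since (3, 4) points against (-1, 0)
```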

#### Orthogonal decomposition

Given any vector $\vec{u}$ in $\mathbb{R}^2$, we can always write it as $\vec{u} = a\hat{i} + b\hat{j}$ for some real numbers $a$ and $b$. Here
we’ve broken $\vec{u}$ into the sum of two orthogonal vectors, in particular, vectors
parallel to $\hat{i}$ and $\hat{j}$. In fact, given a vector $\vec{u}$ and another vector $\vec{v}$ you can always
break $\vec{u}$ into a sum of two vectors, one of which is parallel to $\vec{v}$ and another
that is perpendicular to $\vec{v}$. Such a sum is called an *orthogonal decomposition*.

The **orthogonal decomposition** of $\vec{u}$ in terms of $\vec{v}$ is the sum
$$\vec{u} = \vec{u}^{\parallel} + \vec{u}^{\perp},$$
where the notation $\vec{u}^{\parallel}$ means that “$\vec{u}^{\parallel}$ is parallel to $\vec{v}$” and $\vec{u}^{\perp}$ means that “$\vec{u}^{\perp}$ is perpendicular to $\vec{v}$.” Concretely, $\vec{u}^{\parallel} = \text{proj}_{\vec{v}}\vec{u}$ and $\vec{u}^{\perp} = \vec{u} - \text{proj}_{\vec{v}}\vec{u}$.
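The decomposition $\vec{u} = \vec{u}^{\parallel} + \vec{u}^{\perp}$, with $\vec{u}^{\parallel} = \text{proj}_{\vec{v}}\vec{u}$ and $\vec{u}^{\perp} = \vec{u} - \text{proj}_{\vec{v}}\vec{u}$, can be sketched and checked numerically (our illustration; the names `dot` and `decompose` are ours):

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def decompose(u, v):
    """Split u into a part parallel to v and a part perpendicular to v."""
    scale = dot(u, v) / dot(v, v)
    u_par = [scale * vi for vi in v]                 # proj_v(u)
    u_perp = [ui - pi for ui, pi in zip(u, u_par)]   # the remainder
    return u_par, u_perp

u_par, u_perp = decompose([3, 4], [1, 1])
print(u_par, u_perp)        # [3.5, 3.5] [-0.5, 0.5]
print(dot(u_perp, [1, 1]))  # 0.0: the remainder really is orthogonal to v
```

The last line is the key sanity check: subtracting the projection leaves a vector whose dot product with $\vec{v}$ is zero.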

As an example of where this decomposition is useful, one often breaks a force vector into a component along a surface and a component perpendicular to it.

### The algebra of the dot product

We summarize the arithmetic and algebraic properties of the dot product below.

- Commutativity: $\vec{u}\cdot\vec{v} = \vec{v}\cdot\vec{u}$.
- Linear in first argument: $(a\vec{u})\cdot\vec{v} = a(\vec{u}\cdot\vec{v})$ and $(\vec{u}+\vec{w})\cdot\vec{v} = \vec{u}\cdot\vec{v} + \vec{w}\cdot\vec{v}$.
- Linear in second argument: $\vec{u}\cdot(a\vec{v}) = a(\vec{u}\cdot\vec{v})$ and $\vec{u}\cdot(\vec{v}+\vec{w}) = \vec{u}\cdot\vec{v} + \vec{u}\cdot\vec{w}$.
- Relation to magnitude: $\vec{u}\cdot\vec{u} = |\vec{u}|^2$.
- Relation to orthogonality: If $\vec{u}$ is orthogonal to $\vec{v}$, then $\vec{u}\cdot\vec{v} = 0$.
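These properties are easy to spot-check numerically. The sketch below (ours, not from the text) verifies commutativity, linearity in the first argument, and the relation to magnitude on sample vectors; a single numerical check is of course not a proof:

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

u, v, w, a = [1, 2, 3], [4, 5, 6], [7, 8, 9], 2.5

# Commutativity: u . v = v . u
assert dot(u, v) == dot(v, u)
# Linearity in the first argument
assert dot([a * ui for ui in u], v) == a * dot(u, v)
assert dot([ui + wi for ui, wi in zip(u, w)], v) == dot(u, v) + dot(w, v)
# Relation to magnitude: u . u = |u|^2
assert math.isclose(dot(u, u), math.sqrt(dot(u, u)) ** 2)
print("all properties check out")
```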

Instead of defining the dot product by a formula, we could have defined it by the properties above! While this is common practice in mathematics, the process is a bit abstract and is perhaps beyond the scope of this course. Nevertheless, we know that you are an intrepid young mathematician, and we will not hold back. We will now show that there is only one formula which gives us all of these properties, and it will be our formula for the dot product.