We introduce partial derivatives and the gradient vector.

The **partial derivative** of $f$ with respect to the $i$th variable $x_i$ is denoted $\frac{\partial f}{\partial x_i}$. This means that one should take the single-variable derivative of $f$ with respect to $x_i$ while treating all other variables as constants.
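To make the definition concrete, here is a short sketch using Python's `sympy` library; the function $f(x, y) = x^2 y + \sin y$ is our own example, not one from the text:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 * y + sp.sin(y)  # a hypothetical example function

# Partial with respect to x: treat y as a constant.
f_x = sp.diff(f, x)
# Partial with respect to y: treat x as a constant.
f_y = sp.diff(f, y)

print(f_x)  # 2*x*y
print(f_y)  # x**2 + cos(y)
```

Note that in `f_y`, the term $x^2 y$ contributes $x^2$, exactly as if $x^2$ were a constant coefficient.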

The following interactive lets you see what's going on with partial derivatives:

We have shown *how* to compute a partial derivative, but it may still not be clear
what a partial derivative *means*. Given $f(x, y)$, $f_x(x, y)$ measures the rate at which $f$ changes as only
$x$ varies: $y$ is held constant.

Imagine standing in a rolling meadow, then beginning to walk due east. Depending on your location, you might walk up, sharply down, or perhaps not change elevation at all. This is similar to measuring $f_x$: you are moving only east (in the $x$-direction) and not north/south at all. Going back to your original location, imagine now walking due north (in the $y$-direction). Perhaps walking due north does not change your elevation at all. This is analogous to $f_y = 0$: $f$ does not change with respect to $y$. We can see that $f_x$ and $f_y$ do not have to be the same, or even similar, as it is easy to imagine circumstances where walking east means you walk downhill, though walking north makes you walk uphill. The next example helps us visualize this.

Whenever we do a computation in mathematics, we should ask ourselves, “What does this mean?”

If $f_x(a,b) = m$, this means that if one “stands” on the surface at the point $(a, b, f(a,b))$ and moves parallel to the $x$-axis (so only the $x$-value changes, not the $y$-value), then the instantaneous rate of change in $z$ is $m$. When $m$ is positive, increasing the $x$-value will increase the $z$-value, and decreasing the $x$-value will decrease the $z$-value.
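As a numerical illustration of this interpretation (the surface $f(x,y) = 9 - x^2 - y^2$ and the point $(1,2)$ are our own choices, not from the text), we can check that $f_x$ really predicts how $z$ changes for a small step parallel to the $x$-axis:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = 9 - x**2 - y**2  # a hypothetical example surface

a, b = 1.0, 2.0
# f_x = -2x, so f_x(1, 2) = -2: standing at (1, 2) and walking in the
# +x direction, we head downhill at 2 vertical units per horizontal unit.
slope = float(sp.diff(f, x).subs({x: a, y: b}))

# Take a small step parallel to the x-axis and compare.
step = 0.01
dz = float(f.subs({x: a + step, y: b}) - f.subs({x: a, y: b}))
print(slope)       # -2.0
print(dz / step)   # roughly -2.01, close to the slope
```

The difference quotient approaches $f_x(1,2)$ as the step shrinks, which is exactly what “instantaneous rate of change in $z$” promises.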

### Estimating partial derivatives

Functions of several variables, especially ones that map $\mathbb{R}^2 \to \mathbb{R}$, can be described by a table of values or by level curves. In either case we can estimate partial derivatives by looking at how nearby values change. Let’s do an example to make this clearer.

We can also estimate partial derivatives by looking at level curves.

### Combining partial derivatives

A function of a single variable only has one second derivative. However, functions of two variables have four *second
partial derivatives*, and functions of three variables have nine *second partial derivatives*! Don’t run off yet,
things get better.

- The **second pure partial derivative** of $f$ with respect to $x$ then $x$ is $\frac{\partial}{\partial x}\!\left(\frac{\partial f}{\partial x}\right) = \frac{\partial^2 f}{\partial x^2} = f_{xx}$.
- The **second pure partial derivative** of $f$ with respect to $y$ then $y$ is $\frac{\partial}{\partial y}\!\left(\frac{\partial f}{\partial y}\right) = \frac{\partial^2 f}{\partial y^2} = f_{yy}$.

Moreover, there is also the notion of a **mixed partial derivative**, $f_{xy} = \frac{\partial}{\partial y}\!\left(\frac{\partial f}{\partial x}\right)$ and $f_{yx} = \frac{\partial}{\partial x}\!\left(\frac{\partial f}{\partial y}\right)$. The
notation $\frac{\partial^2 f}{\partial x\,\partial y}$ is ambiguous: it does not state which derivative should be taken first. As
we will see, in practice this is not too much of a problem.

Finding $f_{xy}$ and $f_{yx}$ independently and comparing the results provides a convenient way of checking our work.
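This check can be sketched symbolically; the function $f(x,y) = x^3 y^2 + y e^x$ is a hypothetical example of our choosing, and for smooth functions like it the two mixed partials agree.

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x**3 * y**2 + y * sp.exp(x)  # a hypothetical example function

f_xx = sp.diff(f, x, x)           # second pure partial in x
f_yy = sp.diff(f, y, y)           # second pure partial in y
f_xy = sp.diff(sp.diff(f, x), y)  # differentiate in x first, then y
f_yx = sp.diff(sp.diff(f, y), x)  # differentiate in y first, then x

# The two mixed partials agree, a handy check on hand computations.
print(sp.simplify(f_xy - f_yx))  # 0
```

If the two orders of differentiation had produced different expressions, that would signal an arithmetic mistake somewhere.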

### The gradient vector

Given a function $f : \mathbb{R}^n \to \mathbb{R}$, we often want to work with all of $f$’s first partial derivatives
simultaneously. In this case, we will work with the vector: $$\nabla f = \left(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \dots, \frac{\partial f}{\partial x_n}\right).$$ As we will see, for
functions of several variables, this vector will play the role that the derivative did for
functions of a single variable. This vector is called the *gradient vector*.

The **gradient** is a vector-valued function of $n$ variables.

The upside-down triangle $\nabla$ in the notation for the gradient is sometimes called a *del*. It
is also known as a *nabla*. You can think of $\nabla$ as the vector of operators $$\nabla = \left(\frac{\partial}{\partial x_1}, \frac{\partial}{\partial x_2}, \dots, \frac{\partial}{\partial x_n}\right),$$ and hence when one
writes $\nabla f$, you are literally distributing $f$ across the vector, just as a scalar acts on a
vector.
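A sketch of the gradient as “the vector of all first partials,” with a three-variable example of our own choosing:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 + 3*x*y + z**2  # a hypothetical example function

# The gradient collects every first partial derivative into one vector.
grad_f = [sp.diff(f, v) for v in (x, y, z)]
print(grad_f)  # [2*x + 3*y, 3*x, 2*z]
```

Each entry is exactly the result of “distributing” $f$ across one slot of the operator vector $\nabla$.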

The gradient of a function $f(x, y)$ at each point is a vector pointing in the $(x, y)$-plane.

Try your hand at some computations.

And now in three variables.

This is just your first taste of the gradient vector. Much more will be coming soon.