In single variable calculus, you learned many differentiation rules and properties in order to more easily compute derivatives without the limit definition. These rules included the product rule, the quotient rule, the chain rule, and more, along with memorizing the derivatives of common functions.

Since partial derivatives obey the same rules as single variable derivatives, and the derivative matrix and gradient are built from partial derivatives, many of the rules listed above have analogous results in multivariable calculus. We’ll now start to explore some differentiation results for multivariable functions.

Linearity of the derivative

As you may recall from single variable calculus, “the derivative of the sum is the sum of the derivatives.” That is, if we have differentiable functions $f$ and $g$, we compute the derivative of the sum $f + g$ by taking the sum of the derivatives $f'$ and $g'$. For example,

$$\frac{d}{dx}\left(x^2 + \sin x\right) = \frac{d}{dx}\left(x^2\right) + \frac{d}{dx}\left(\sin x\right) = 2x + \cos x.$$

An analogous result holds in multivariable calculus, for the derivative matrix.

Theorem (Linearity of the derivative). Suppose $f$ and $g$ are functions from $\mathbb{R}^n$ to $\mathbb{R}^m$, both differentiable on an open set $U$ contained in their domains. Then $f + g$ is differentiable on $U$, and

$$D(f + g) = Df + Dg \quad \text{on } U.$$

Proof
We will begin by showing that $D(f + g) = Df + Dg$ on $U$. Since $U$ is contained in the domains of both $f$ and $g$, and both functions are differentiable on $U$, $Df$ and $Dg$ both exist on $U$.

Write $f = (f_1, \dots, f_m)$ and $g = (g_1, \dots, g_m)$ in terms of their component functions. Then, we have

$$(f + g)(x) = f(x) + g(x) = \big(f_1(x) + g_1(x), \, \dots, \, f_m(x) + g_m(x)\big),$$

giving us the component functions of $f + g$.

Using the component functions found above, we take the derivative matrix of $f + g$, using the linearity of partial derivatives:

$$D(f + g) = \begin{bmatrix} \dfrac{\partial (f_1 + g_1)}{\partial x_1} & \cdots & \dfrac{\partial (f_1 + g_1)}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial (f_m + g_m)}{\partial x_1} & \cdots & \dfrac{\partial (f_m + g_m)}{\partial x_n} \end{bmatrix} = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{bmatrix} + \begin{bmatrix} \dfrac{\partial g_1}{\partial x_1} & \cdots & \dfrac{\partial g_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial g_m}{\partial x_1} & \cdots & \dfrac{\partial g_m}{\partial x_n} \end{bmatrix} = Df + Dg.$$

Thus, we have $D(f + g) = Df + Dg$ on $U$.

Next, we need to show that $f + g$ is differentiable on $U$. In order to show this, we will show that the following limit evaluates to $0$, for each $a \in U$:

$$\lim_{x \to a} \frac{\left\| (f+g)(x) - (f+g)(a) - \big(Df(a) + Dg(a)\big)(x - a) \right\|}{\left\| x - a \right\|}.$$

This is done by separating the $f$ and $g$ terms, using the triangle inequality, and using the fact that $f$ and $g$ are both differentiable on $U$:

$$\frac{\left\| (f+g)(x) - (f+g)(a) - \big(Df(a) + Dg(a)\big)(x - a) \right\|}{\|x - a\|} \leq \frac{\left\| f(x) - f(a) - Df(a)(x - a) \right\|}{\|x - a\|} + \frac{\left\| g(x) - g(a) - Dg(a)(x - a) \right\|}{\|x - a\|}.$$

Since $f$ and $g$ are differentiable on $U$, both terms on the right tend to $0$ as $x \to a$, so the limit above is $0$. Thus $f + g$ is differentiable on $U$, with derivative matrix $Df + Dg$. $\blacksquare$

You may also recall the constant multiple rule from single variable calculus. For example,

$$\frac{d}{dx}\left(5x^3\right) = 5 \cdot \frac{d}{dx}\left(x^3\right) = 15x^2.$$

We have an analogous result in multivariable calculus.

Proposition. Suppose $f$ is a function from $\mathbb{R}^n$ to $\mathbb{R}^m$ that is differentiable on an open set $U$ contained in its domain, and let $c \in \mathbb{R}$ be a constant. Then $cf$ is differentiable on $U$, and

$$D(cf) = c\,Df \quad \text{on } U.$$

The proof of this proposition is somewhat similar to that of the previous theorem, and it is left as an exercise.
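
As a quick sanity check, consider $c = 3$ and $f(x, y) = (xy, x^2)$ (a function chosen here just for illustration). Then

$$Df(x, y) = \begin{bmatrix} y & x \\ 2x & 0 \end{bmatrix}, \qquad (3f)(x, y) = (3xy, 3x^2), \qquad D(3f)(x, y) = \begin{bmatrix} 3y & 3x \\ 6x & 0 \end{bmatrix} = 3\,Df(x, y),$$

just as the proposition predicts.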

Although these are important results, they actually aren’t particularly useful for computing derivative matrices. In practice, you’d compute the derivative matrix of a sum of functions by first adding the components, then differentiating. We include an example to demonstrate this.
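
For instance, suppose $f(x, y) = (x^2, xy)$ and $g(x, y) = (\sin x, y^2)$ (functions chosen here just for illustration). Adding the components first gives

$$(f + g)(x, y) = \left( x^2 + \sin x, \; xy + y^2 \right),$$

and differentiating this directly yields

$$D(f + g)(x, y) = \begin{bmatrix} 2x + \cos x & 0 \\ y & x + 2y \end{bmatrix},$$

without ever computing $Df$ and $Dg$ separately.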

Product and quotient rules

We will now try to find multivariable analogs of the product and quotient rules from single variable calculus. Let’s start by considering when these rules might make sense.

Suppose we have functions $f, g : \mathbb{R}^n \to \mathbb{R}^m$. If we wanted to define the product of these functions, what would that mean? The outputs of $f$ and $g$ are vectors, so we’d be trying to multiply two vectors, but there isn’t really one clear multiplication on vectors! We could try multiplying component-wise, taking the dot product, or taking the cross product, and in different settings, these all might be reasonable things to do. We could work on finding product rules for all of the different ways we could “multiply” two vectors (these can be found in the exercises), but we’ll save some time, and focus on a case where we do have one clear choice for multiplication: scalar-valued functions, where $m = 1$.

Proposition (Product rule). Suppose $f$ and $g$ are scalar-valued functions from $\mathbb{R}^n$ to $\mathbb{R}$, both differentiable on an open set $U$ contained in their domains. Then $fg$ is differentiable on $U$, and

$$D(fg) = f\,Dg + g\,Df \quad \text{on } U.$$
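
For example, take $f(x, y) = x^2$ and $g(x, y) = \sin y$ (chosen here for illustration). Then $(fg)(x, y) = x^2 \sin y$, and

$$D(fg) = \begin{bmatrix} 2x \sin y & x^2 \cos y \end{bmatrix} = x^2 \begin{bmatrix} 0 & \cos y \end{bmatrix} + \sin y \begin{bmatrix} 2x & 0 \end{bmatrix} = f\,Dg + g\,Df,$$

exactly as the proposition predicts.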

Similarly, we have a multivariable quotient rule for scalar-valued functions.

Proposition (Quotient rule). Suppose $f$ and $g$ are scalar-valued functions from $\mathbb{R}^n$ to $\mathbb{R}$, both differentiable on an open set $U$ contained in their domains, with $g(x) \neq 0$ for all $x \in U$. Then $f/g$ is differentiable on $U$, and

$$D\!\left(\frac{f}{g}\right) = \frac{g\,Df - f\,Dg}{g^2} \quad \text{on } U.$$
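
For example, with $f(x, y) = x$ and $g(x, y) = x^2 + y^2$ on any open set avoiding the origin (again, functions chosen for illustration), the quotient rule gives

$$D\!\left(\frac{f}{g}\right) = \frac{(x^2 + y^2) \begin{bmatrix} 1 & 0 \end{bmatrix} - x \begin{bmatrix} 2x & 2y \end{bmatrix}}{(x^2 + y^2)^2} = \begin{bmatrix} \dfrac{y^2 - x^2}{(x^2 + y^2)^2} & \dfrac{-2xy}{(x^2 + y^2)^2} \end{bmatrix},$$

which matches what you would find by differentiating $\dfrac{x}{x^2 + y^2}$ directly.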

We’ll leave the proofs of these results as exercises, as they are similar to the single variable proofs.