We’ve been working towards defining some sort of “second derivative” for multivariable functions, which can tell us about the second-order behavior of functions. It will also enable us to define degree two Taylor polynomials, and we’ll later see how it can be used to classify critical points in optimization.

The Hessian Matrix

Suppose we have a differentiable function $f : \mathbb{R}^n \to \mathbb{R}$. We can take the gradient of this function, and we can think of the gradient as a function $\nabla f : \mathbb{R}^n \to \mathbb{R}^n$. Assuming all of the second-order partial derivatives exist, we can then take the derivative matrix of $\nabla f$. This gives us an $n \times n$ square matrix, which we call the Hessian of $f$, written $Hf$.
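Written out in the standard notation (the original notes' formula is not shown here, so this is the usual convention), the Hessian collects all second-order partial derivatives:

```latex
Hf(\mathbf{x}) =
\begin{pmatrix}
\dfrac{\partial^2 f}{\partial x_1^2} & \cdots & \dfrac{\partial^2 f}{\partial x_1 \,\partial x_n} \\
\vdots & \ddots & \vdots \\
\dfrac{\partial^2 f}{\partial x_n \,\partial x_1} & \cdots & \dfrac{\partial^2 f}{\partial x_n^2}
\end{pmatrix},
\qquad
\big(Hf\big)_{ij} = \frac{\partial^2 f}{\partial x_i \,\partial x_j}.
```

That is, row $i$ of the Hessian is the gradient of the $i$-th component of $\nabla f$, which is exactly the derivative matrix of $\nabla f$.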

Notice that if $f$ has continuous first- and second-order partial derivatives, then the Hessian matrix will be symmetric by Clairaut's Theorem, since the mixed partials $\frac{\partial^2 f}{\partial x_i \,\partial x_j}$ and $\frac{\partial^2 f}{\partial x_j \,\partial x_i}$ agree.
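We can check this symmetry on a concrete example. Below is a small sketch using SymPy's built-in `hessian` function; the sample function $f(x, y) = x^2 y + \sin(xy)$ is chosen for illustration only.

```python
from sympy import symbols, sin, hessian, simplify

x, y = symbols('x y')

# Sample function (illustrative choice, not from the notes)
f = x**2 * y + sin(x * y)

# 2x2 matrix of all second-order partial derivatives
H = hessian(f, (x, y))

# Clairaut's Theorem: the mixed partials f_xy and f_yx agree,
# so the off-diagonal entries of the Hessian are equal
print(simplify(H[0, 1] - H[1, 0]))  # difference of mixed partials
```

Since $f$ here has continuous partials of all orders, the printed difference simplifies to zero and `H` is symmetric.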

Taylor Polynomials

We’re now in a position to define the second-order Taylor polynomial of a function $f : \mathbb{R}^n \to \mathbb{R}$, using the Hessian matrix to supply the degree two terms.
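In the usual convention (assumed here, since the notes' own formula is not shown), the second-order Taylor polynomial of $f$ centered at $\mathbf{a}$ is:

```latex
P_2(\mathbf{x})
= f(\mathbf{a})
+ \nabla f(\mathbf{a}) \cdot (\mathbf{x} - \mathbf{a})
+ \tfrac{1}{2}\,(\mathbf{x} - \mathbf{a})^{T}\, Hf(\mathbf{a})\,(\mathbf{x} - \mathbf{a}).
```

The first two terms are the familiar linear approximation of $f$ at $\mathbf{a}$; the Hessian contributes the new degree two term.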

Notice that the order two part of the Taylor polynomial, $\tfrac{1}{2}(\mathbf{x} - \mathbf{a})^{T} Hf(\mathbf{a})(\mathbf{x} - \mathbf{a})$, will be a quadratic form in the displacement $\mathbf{x} - \mathbf{a}$. We will soon make use of our classification of quadratic forms in order to use the Hessian matrix to determine the order two behavior of a function, which will be useful for optimizing multivariable functions.
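To see numerically that the degree two polynomial really does capture second-order behavior, here is a sketch with an illustrative choice of $f(x, y) = e^x \cos y$ centered at $\mathbf{a} = (0, 0)$, with the gradient and Hessian computed by hand; near $\mathbf{a}$, the error $|f - P_2|$ shrinks like $|\mathbf{h}|^3$.

```python
import numpy as np

# Illustrative example: f(x, y) = exp(x) * cos(y), centered at a = (0, 0)
def f(v):
    x, y = v
    return np.exp(x) * np.cos(y)

a = np.array([0.0, 0.0])
grad_a = np.array([1.0, 0.0])        # gradient of f at a, computed by hand
hess_a = np.array([[1.0,  0.0],      # Hessian of f at a, computed by hand
                   [0.0, -1.0]])

def p2(v):
    """Second-order Taylor polynomial of f centered at a."""
    h = v - a
    return f(a) + grad_a @ h + 0.5 * h @ hess_a @ h

# Shrinking the displacement by 10 should shrink the error by about 1000
for t in (0.1, 0.01):
    h = t * np.array([1.0, 1.0])
    print(t, abs(f(a + h) - p2(a + h)))
```

The third-order decay of the error is what lets the quadratic form term dominate near the center point, which is why classifying that quadratic form will classify the local behavior of $f$.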