
We study Taylor and Maclaurin series.

We’ve seen that we can approximate functions with polynomials, given that enough derivative information is available. We have also seen that certain functions can be represented by a power series. In this section we combine these concepts: if a function $f(x)$ is infinitely differentiable, we show how to represent it with a power series.
Quick: Write down the Taylor series for $f(x) = x^3-6x^2+1$ centered at $x=0$.
Write down the Taylor series for $f(x) = x^3-6x^2+1$ centered at $x=1$.
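One way to check your answers (a sketch, not the only approach): since $f$ is a polynomial, every derivative past the third is zero, so each Taylor series has only finitely many nonzero terms. Centered at $x=0$, the Taylor series of $f(x)=x^3-6x^2+1$ is just the polynomial itself. Centered at $x=1$, computing
$$f(1)=-4,\qquad f'(1)=-9,\qquad f''(1)=-6,\qquad f'''(1)=6$$
gives
$$f(x) = -4 - 9(x-1) - 3(x-1)^2 + (x-1)^3,$$
which expands back to $x^3-6x^2+1$, as it must.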

The difference between a Taylor polynomial and a Taylor series is that the former is a polynomial, containing only a finite number of terms, whereas the latter is a series, a summation of an infinite set of terms, any number of which (including an infinite number) may be zero. When creating the Taylor polynomial of degree $n$ for a function $f(x)$ at $x=c$, we needed to evaluate $f$, and the first $n$ derivatives of $f$, at $x=c$. When creating the Taylor series of $f$, it helps to find a pattern that describes the $n$th derivative of $f$ at $x=c$. Time for examples!
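A standard first example of this pattern-finding (which may differ from the example worked here) is $f(x)=e^x$ centered at $x=0$. Every derivative of $e^x$ is $e^x$, so $f^{(n)}(0)=1$ for all $n$, and the pattern for the $n$th derivative is immediate:
$$e^x = \sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!}\,x^n = \sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots.$$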

Let’s see an example that is not centered at $x=0$:
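For instance (a sketch of the kind of computation meant here), take $f(x)=\ln x$ centered at $x=1$. The derivatives are $f'(x)=x^{-1}$, $f''(x)=-x^{-2}$, $f'''(x)=2x^{-3}$, and in general $f^{(n)}(x)=(-1)^{n-1}(n-1)!\,x^{-n}$ for $n\ge 1$, so $f^{(n)}(1)=(-1)^{n-1}(n-1)!$. Since $f(1)=0$,
$$\ln x = \sum_{n=1}^{\infty} \frac{(-1)^{n-1}(n-1)!}{n!}\,(x-1)^n = \sum_{n=1}^{\infty} \frac{(-1)^{n-1}}{n}\,(x-1)^n.$$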

Finally, sometimes Taylor’s formula may not be the best way to compute the Taylor series.
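To illustrate (an example chosen here, not necessarily the one that follows): to find the Maclaurin series of $f(x)=e^{-x^2}$, computing $f^{(n)}(0)$ directly quickly becomes unpleasant, but substituting $-x^2$ into the known series for $e^x$ gives
$$e^{-x^2} = \sum_{n=0}^{\infty}\frac{(-x^2)^n}{n!} = \sum_{n=0}^{\infty}\frac{(-1)^n x^{2n}}{n!} = 1 - x^2 + \frac{x^4}{2!} - \frac{x^6}{3!}+\cdots,$$
valid for all $x$.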

Above we implicitly used the following theorem:

This is just saying that if you know a power series for a function, then using Taylor’s formula will do nothing but give you back that same power series.
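As a quick illustration: the geometric series identity $\frac{1}{1-x} = \sum_{n=0}^{\infty} x^n$ for $|x|<1$ tells us, by this uniqueness, that $\sum_{n=0}^{\infty} x^n$ must be the Maclaurin series of $\frac{1}{1-x}$. Indeed, Taylor’s formula agrees: with $f(x)=\frac{1}{1-x}$ we have $f^{(n)}(x) = \frac{n!}{(1-x)^{n+1}}$, so $\frac{f^{(n)}(0)}{n!} = 1$ for every $n$.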

Here is a somewhat unsatisfying example:

A more satisfying example is the following:

We will find that “most of the time” a function and its Taylor series are equal, but we need to consider the conditions that allow us to conclude this. Taylor’s theorem states that the error between a function $f(x)$ and its $n$th degree Taylor polynomial $p_n(x)$ is $R_n(x) = f(x) - p_n(x)$, and that
$$|R_n(x)| \le \frac{M}{(n+1)!}\,|x-c|^{n+1},$$
where $M$ is the maximum value of $|f^{(n+1)}|$ on $[c,x]$. If $R_n(x)$ goes to $0$ for each $x$ in an interval $I$ as $n$ approaches infinity, we conclude that the function is equal to its Taylor series expansion on $I$. This leads us to our next theorem:

We’ll work a representative example of this theorem to see what is going on; the general case is much the same.
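A typical example of this kind (sketched here; the worked example may differ): for $f(x)=\sin x$ centered at $x=0$, every derivative of $\sin x$ is $\pm\sin x$ or $\pm\cos x$, so $|f^{(n+1)}(t)|\le 1$ for all $t$ and we may take $M=1$. Then
$$|R_n(x)| \le \frac{|x|^{n+1}}{(n+1)!} \to 0 \quad\text{as } n\to\infty$$
for every fixed $x$, since the factorial in the denominator eventually dominates the exponential in the numerator. Hence $\sin x$ equals its Maclaurin series for all real $x$.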

There is good news. A function $f$ that is equal to its Taylor series, centered at any point of the domain of $f(x)$, is said to be an analytic function, and most, if not all, functions that we encounter within this course are analytic functions. Generally speaking, any function that one creates with elementary functions (polynomials, exponentials, trigonometric functions, etc.) that is not piecewise defined is probably analytic. For most functions, we assume the function is equal to its Taylor series on the series’ interval of convergence and only check the remainder (as above) when we suspect something may not work as expected.
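For contrast, the standard example of an infinitely differentiable function that is not analytic is indeed piecewise defined: let $g(x)=e^{-1/x^2}$ for $x\ne 0$ and $g(0)=0$. One can show that $g^{(n)}(0)=0$ for every $n$, so the Maclaurin series of $g$ is identically $0$. That series converges everywhere, yet it equals $g$ only at $x=0$.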