We introduce antiderivatives.
Computing derivatives is not too difficult. At this point, you should be able to take
the derivative of almost any function you can write down. However, undoing
derivatives is much harder. This process of undoing a derivative is called taking an
antiderivative.
A function \(F\) is called an antiderivative of \(f\) on an interval if
\[ F'(x) = f(x) \]
for all \(x\) in the interval.
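For instance, \(F(x)=\sin {x}\) is an antiderivative of \(f(x)=\cos {x}\) on any interval, since \(F'(x)=\cos {x}=f(x)\) for all \(x\).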
How many antiderivatives does \(f(x) = 2x\) have?
none / one / infinitely many
The functions \(x^2\), \(x^2-1.5\), \(x^2+2\), \(x^2+5\) and
so on, are all antiderivatives of \(2x\).
It is clear that any function \(x^2+C\), where \(C\) is a constant, is an antiderivative of \(f\). Why? Because \((x^2+C)'=2x=f(x)\).
Could there exist any other antiderivative of \(f\), call it \(G\), such that \(G(x)\) is not a sum of \(x^2\) and
a constant? In that case, for all \(x\) we would have \(G'(x)=f(x)=2x\), and therefore
\[ \left (G(x)-x^2\right )'=G'(x)-2x=0. \]
Recall: Any function whose derivative is equal to \(0\) on some interval is equal to a constant on that interval. This means that \(G(x)-x^2=C\), for some constant \(C\). This implies that \(G(x)=x^2+C\), for all \(x\).
The Family of Antiderivatives
If \(F\) is an antiderivative of \(f\), then the function \(f\) has a whole family of antiderivatives.
Each antiderivative of \(f\) is the sum of \(F\) and some constant \(C\).
So, when we write \(F(x)+C\), we denote the entire family of antiderivatives of \(f\). Alternative
notation for this family of antiderivatives was introduced by G.W. Leibniz
(1646-1716):
Let \(f\) be a function. The family of all antiderivatives of \(f\) is denoted by
\[ \int f(x) dx. \]
This is called the indefinite integral of \(f\).
It follows that
\[ \int f(x) dx =F(x)+C, \]
where \(F\) is any
antiderivative of \(f\) and \(C\) is an arbitrary constant.
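For example, in this notation our earlier computation reads
\[ \int 2x dx = x^2+C. \]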
Now we are ready to “integrate” some famous functions.
Fill out these basic antiderivatives. Note each of these examples comes directly from
our knowledge of basic derivatives.
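The table itself is an interactive element; for reference, a standard set of entries, each read directly off a basic derivative, looks like this:
\begin{align*} \int x^n dx &= \frac {x^{n+1}}{n+1}+C \quad (n\neq -1), & \int \frac {1}{x} dx &= \ln |x|+C,\\ \int \cos {x} dx &= \sin {x}+C, & \int \sin {x} dx &= -\cos {x}+C,\\ \int e^x dx &= e^x+C, & \int \sec ^{2}{x} dx &= \tan {x}+C. \end{align*}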
It may seem that one could simply memorize these antiderivatives and antidifferentiating
would be as easy as differentiating. This is not the case. The issue comes up when
trying to combine these functions. When taking derivatives we have the
product rule and the chain rule. The analogues of these two rules are much
more difficult to deal with when taking antiderivatives. However, not all is
lost.
Consider the following example.
Find an antiderivative of the function \(h\) defined by the expression \(h(x)=\cos {x}+\frac {1}{x}\), for all \(x\) in some interval \(I\).
Differentiate each choice. In the last choice we get
\(H'(x)=\cos {x}+\frac {1}{x}=h(x)\).
It is easy to recognize an antiderivative: we just have to differentiate it, and check
whether \(H'(x)=h(x)\), for all \(x\) in \(I\).
Notice that the function \(h\) is the sum of two functions, \(f\) and \(g\), where \(f(x)=\cos {x}\) and \(g(x)=\frac {1}{x}\), for \(x\) in \(I\).
We know antiderivatives of both functions: \(F(x)=\sin {x}\) and \(G(x)=\ln {x}\), for \(x\) in \(I\), are antiderivatives of \(f\) and \(g\),
respectively. So, in this example we see that the function \(F+G\) is an antiderivative
of \(f+g\). In other words, “the sum of antiderivatives is an antiderivative of a sum.”
Is this true in general?
Let’s check whether this rule holds for any eligible pair of functions \(f\) and \(g\)
defined on some interval \(I\).
Let \(f\), \(g\), \(F\) and \(G\) be four functions defined on some
interval \(I\) such that \(F\) is an antiderivative of \(f\) and \(G\) is an antiderivative of \(g\),
i.e.
\(F'(x)=f(x)\) and \(G'(x)=g(x)\) for all \(x\) in some interval \(I\).
Find an antiderivative of the function \(f+g\).
Differentiate the function \(F+G\).
Since \(\left (F(x)+G(x)\right )'=F'(x)+G'(x)=f(x)+g(x)\), it follows
that \(F+G\) is an antiderivative of \(f+g\).
To summarize: The sum of antiderivatives is an antiderivative of a sum.
We have
proved the following theorem.
The Sum Rule for Antiderivatives If \(F\) is an antiderivative of \(f\) and \(G\) is an antiderivative
of \(g\), then \(F+G\) is an antiderivative of \(f+g\).
We can write equivalently, using indefinite integrals, \(\int \left (f(x)+g(x)\right ) dx= \int f(x) dx+\int g(x) dx\).
Consider another example. Find an antiderivative of the function \(h\) defined by \(h(x)=5\sec ^{2}{x}\), for all \(x\) in some interval \(I\).
Differentiate each choice for \(H(x)\). In the first choice we get \(H'(x)=5\sec ^{2}{x}=h(x)\).
It is easy to recognize an antiderivative: we just have to differentiate it, and
check whether \(H'(x)=h(x)\), for all \(x\) in \(I\).
Notice that in this example the function \(h\) is a constant multiple of \(f\), where \(f(x)=\sec ^{2}{x}\). On the other hand, we know that the function \(F\), defined by \(F(x)=\tan {x}\), is an antiderivative of \(f\).
If we differentiate the function \(5F\), we get that
\(\left (5F(x)\right )'=\left (5\tan {x}\right )'=5\left (\tan {x}\right )'=5\sec ^{2}{x}=5f(x)\), for \(x\) in \(I\).
In other words, “a constant multiple of an antiderivative is an antiderivative of a
constant multiple of a function.” Is this always true?
Let’s check whether this rule holds for any constant \(k\) and any eligible function \(f\)
defined on some interval \(I\).
Let \(k\) be a constant, let \(f\) be a function defined on some
interval \(I\) and let \(F\) be an antiderivative of \(f\), i.e.
\(F'(x)=f(x)\), for all \(x\) in some interval \(I\).
Find an antiderivative of the function \(k f\).
Differentiate the function \(kF\).
Since \(\left (kF(x)\right )'=kF'(x)=kf(x)\), it follows
that \(kF\) is an antiderivative of \(kf\). To summarize: The constant multiple of an
antiderivative is an antiderivative of a constant multiple of a function.
We have
proved the following theorem.
The Constant Multiple Rule for Antiderivatives If
\(F\) is an antiderivative of \(f\), and \(k\) is a constant, then \(kF\) is an antiderivative of
\(kf\).
We can write equivalently, using indefinite integrals, \(\int kf(x) dx= k\int f(x) dx\).
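For instance, applied to the example above,
\[ \int 5\sec ^{2}{x} dx = 5\int \sec ^{2}{x} dx = 5\tan {x}+C. \]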
Let’s put these rules and our
knowledge of basic derivatives to work.
Find the antiderivative of \(3 x^7\).
By the theorems above, we see that \begin{align*} \int 3 x^7 dx &= 3 \int x^7 dx\\ &= 3 \cdot \answer [given]{\frac {x^8}{8}}+C. \end{align*}
The sum rule for antiderivatives allows us to integrate term-by-term. Let’s see an
example of this.
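The worked example at this point is an interactive element; here is a sketch in the same spirit (the integrand is an illustrative choice, not necessarily the original one):
\begin{align*} \int \left (x^2+\cos {x}\right ) dx &= \int x^2 dx+\int \cos {x} dx\\ &= \frac {x^3}{3}+\sin {x}+C. \end{align*}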
Unfortunately, we cannot tell you how to compute every antiderivative; we view them
as a sort of puzzle. Later we will learn a handful of techniques for computing
antiderivatives. However, a robust and simple way to compute antiderivatives is
guess-and-check.
Tips for guessing antiderivatives
(a) If possible, express the function that you are integrating in a form that is convenient for integration.
(b) Make a guess for the antiderivative.
(c) Take the derivative of your guess.
(d) Note how the above derivative is different from the function whose antiderivative you want to find.
(e) Change your original guess by multiplying by constants or by adding in new functions, as in the worked sketch below.
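To see steps (b) through (e) in action, consider \(\int \cos (2x) dx\) (an illustrative choice, not from the original list). A first guess is \(H(x)=\sin (2x)\); differentiating gives \(H'(x)=2\cos (2x)\), which is off by a factor of \(2\). Adjusting the guess to \(H(x)=\frac {1}{2}\sin (2x)\) gives \(H'(x)=\cos (2x)\) exactly, so
\[ \int \cos (2x) dx = \frac {1}{2}\sin (2x)+C. \]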
Compute:
\[ \int \frac {\sqrt {x}+1+x}{x} dx \]
Before guessing the solution, let’s express the function in a form convenient for integration. Due to the Sum Rule, it is more convenient to have a sum of
functions, instead of a single, complicated term. \begin{align*} \int \frac {\sqrt {x}+1+x}{x} dx &=\int \left (\frac {\sqrt {x}}{x}+\frac {1}{x}+\frac {x}{x}\right ) dx\\ &=\int \left (\frac {1}{\sqrt {x}}+\frac {1}{x}+1\right ) dx\\ &=\int \left (x^{-\frac {1}{2}}+\frac {1}{x}+1\right ) dx \end{align*}
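One way to finish (assuming \(x>0\) on the interval of integration, so that \(\sqrt {x}\) is defined): integrate term by term using the Sum Rule,
\begin{align*} \int \left (x^{-\frac {1}{2}}+\frac {1}{x}+1\right ) dx &= \int x^{-\frac {1}{2}} dx+\int \frac {1}{x} dx+\int 1 dx\\ &= 2\sqrt {x}+\ln {x}+x+C. \end{align*}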
Computing antiderivatives is a place where insight and rote computation meet.
We cannot teach you a method that will always work. Moreover, merely
understanding the examples above will probably not be enough for you to
become proficient in computing antiderivatives. You must practice, practice,
practice!
Differential equations
Differential equations describe relationships between functions and their rates of change.
A differential equation is simply an equation with a derivative in it. Here is an
example:
\[ a\cdot f'(x)+ b\cdot f(x) = g(x). \]
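For instance (choosing concrete values purely for illustration), taking \(a=1\), \(b=-1\), and \(g(x)=0\) gives the equation \(f'(x)-f(x)=0\), that is, \(f'(x)=f(x)\), which we will study below.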
Which one is a differential equation?
\(x^2+3x-1=0\) / \(f'(x)=x^2+3x-1\) / \(f(x)=x^2+3x-1\)
When a mathematician solves a differential equation, they are finding functions
satisfying the equation.
Which of the following functions solve the differential equation \(f'(x)=f(x)\)?
We can directly check that any function \(f(x)=Ce^x\) is a solution to our differential equation \(f'(x)=f(x)\), since \(\left (Ce^x\right )'=Ce^x\). Could there be any others? It turns out that these are the only solutions. But showing that we didn’t miss any is a bit tricky.
Well, suppose we have some mysterious function \(f\) and all we know is that \(f'(x)=f(x)\). Let’s define a new function \(g(x)=f(x)/e^x\). Since our denominator is never 0, the quotient rule tells us that
\[ g'(x)=\frac {f'(x)e^x-f(x)e^x}{\left (e^x\right )^2}=\frac {f'(x)-f(x)}{e^x}=0. \]
But we know that a function with a 0 derivative is constant, so \(g(x)=C\).
Plugging this back into our formula for \(g(x)\) tells us that \(C=f(x)/e^x\). Now we rearrange to get
\(f(x)=Ce^x\).
This shows us that any solution to our differential equation must be one of the
functions we’ve already found. Our argument relies on some features that are special
to this problem; proving that we’ve found all the possible solutions to an arbitrary
differential equation is a very difficult task!
The family of functions \(Ce^x\) is called the general solution of the differential equation. Since there are
infinitely many solutions to a differential equation, we can impose an additional
condition (say \(f(0)=1\)), called an initial condition. When we are asked to find a function \(f\)
that satisfies both the differential equation (DE) and the initial condition (IC), this is
called an initial value problem (IVP). Let’s try one out.
Solve the initial value
problem (IVP): \begin{align*} f'(x) & = f(x) && \text {(DE)} \\ f(0) & = 1 && \text {(IC)} \end{align*}
The figure below shows several solutions of the differential equation.
The figure suggests that
the only solution to the initial value problem is the function \(f(x)=e^x\). We can verify this
result.
Since all solutions of the differential equation have the form \(f(x)=Ce^x\) and \(f(0)=1\), it follows that
\[ Ce^0 = 1. \]
Therefore,
\[ C = \answer [given]{1}. \]
So, the unique solution to the given initial value problem is the function \(f(x)=e^x\).
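As a quick check: if \(f(x)=e^x\), then \(f'(x)=e^x=f(x)\), so the DE holds, and \(f(0)=e^0=1\), so the IC holds as well.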
Solve the initial value problem (IVP): \begin{align*} f'(x) & = \sin {x} \\ f(0) & = -1 \end{align*}
First, we have to solve the differential equation. The solution is clearly an
antiderivative of \(\sin {x}\).
\[ f(x)=-\answer [given]{\cos {x}}+C, \]
where \(C\) is an arbitrary constant. We have found the general
solution of the DE and this family of solutions is illustrated by the figure below.
Now we must find the
solution that also satisfies the initial condition \(f(0)=-1\).
Since
\[ f(0)=\answer [given]{-1}+C, \]
it follows that
\[ -1=\answer [given]{-1}+C. \]
Therefore,
\[ C=\answer [given]{0}. \]
The function
\[ f(x)=\answer [given]{-\cos {x}} \]
is the solution of the initial value problem and it is shown in the figure above.
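As a final check: if \(f(x)=-\cos {x}\), then \(f'(x)=\sin {x}\), so the DE holds, and \(f(0)=-\cos {0}=-1\), so the IC holds as well.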