
There are two ways to establish whether a sequence has a limit.

In the previous section, we defined a sequence as a function defined on a subset of the natural numbers, and we discussed how we can represent this by an ordered list. We chose the notation $\{a_n\}_{n=1}^{\infty}$ to denote the list $a_1, a_2, a_3, \dots$

In the previous section, we found many ways to generate this list. Regardless of how we obtain it, there are two fundamental questions we can ask.

• Do the numbers in the list approach a finite value?
• Can I add all of the numbers in the list together and obtain a finite result?

As it turns out, the second question will be more important for us. However, as we will see in a future section, we can reduce the second question to the first one. As such, we should examine the first question in detail. We begin by giving an intuitive definition.

This intuitive definition of a limit can be made more precise as follows.

This precise definition captures the same idea as the intuitive one. The quantity $\epsilon$ measures how close we require the terms of the sequence to be to the limit $L$. We say that the limit exists and equals $L$ if, no matter how small a tolerance $\epsilon$ we choose, the terms of the sequence eventually become and stay within $\epsilon$ of $L$.
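The definition can be explored numerically. The sketch below is a finite check only, not a proof, and uses the illustrative sequence $a_n = 1/n$ with $L = 0$: it searches for an index $N$ past which every computed term stays within a chosen $\epsilon$ of $L$.

```python
def find_N(a, L, eps, n_max=10**6):
    """Search for an index N such that |a(n) - L| < eps for all n from N to n_max.

    Only finitely many terms are checked, so this illustrates the
    definition rather than proving convergence.
    """
    N = None
    for n in range(1, n_max + 1):
        if abs(a(n) - L) >= eps:
            N = None          # a term escaped the tolerance, so the run restarts
        elif N is None:
            N = n             # first index of a run that stays within eps
    return N

# a_n = 1/n converges to 0: for eps = 0.01, every term past n = 100 stays within eps.
print(find_N(lambda n: 1 / n, 0, 0.01))  # 101
```

For a sequence with no limit, such as $(-1)^n$, no such $N$ is ever found for small $\epsilon$, and the function returns `None`.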

Suppose that $\{a_n\}_{n=1}^{\infty}$ is a sequence and that $\lim _{n \to \infty } a_n = L$. Intuitively, what can we say about $\lim _{n \to \infty } a_{n+1}$?

• $\lim _{n \to \infty } a_{n+1}$ exists, but we do not know what its value is.
• $\lim _{n \to \infty } a_{n+1}$ exists, and $\lim _{n \to \infty } a_{n+1} = L$.
• $\lim _{n \to \infty } a_{n+1}$ may or may not exist.

One way to think about this is by noting that the sequence $\{a_n\}_{n=1}^{\infty}$ is represented by the list $a_1, a_2, a_3, a_4, \dots$

while the sequence $\{a_{n+1}\}_{n=1}^{\infty}$ is represented by the list $a_2, a_3, a_4, a_5, \dots$

If the first sequence tends to $L$, the second sequence must also tend to $L$.
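As a quick sanity check, with an illustrative sequence $a_n = 1 + 1/n$ (an assumption for the demo, not the text's example), the shifted sequence is just the original list with its first entry dropped, so both tails approach the same limit:

```python
def a(n):
    return 1 + 1 / n  # illustrative sequence converging to 1

original = [a(n) for n in range(1, 11)]      # a_1, a_2, ..., a_10
shifted = [a(n + 1) for n in range(1, 11)]   # a_2, a_3, ..., a_11

# The shifted list agrees with the original list minus its first entry,
# so the two sequences share the same tail, and hence the same limit.
print(shifted[:9] == original[1:10])  # True
```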

### Connections to real-valued functions

Since sequences are functions defined on a subset of the natural numbers, the notion of a “limit at a specific $n$” is not very interesting, since we can explicitly find $a_n$ for any given $n$. Limits at infinity, however, are a different story. An important question can now be asked: given a sequence, how do we determine whether it has a limit?

There are several techniques for finding limits of real-valued functions, and we have seen that, given a sequence, we can often find a real-valued function that agrees with it on their common domain. Suppose we have found a real-valued function $f(x)$ with $f(n)=a_n$ for every natural number $n$ in the sequence's domain. If we know $\lim _{x\to \infty } f(x)$, can we use this to conclude something about $\lim _{n \to \infty } a_n$?

Before answering this question, consider the following cautionary example.

What can we conclude from the above example?

• If $\lim _{n \to \infty } a_n$ exists, then $\lim _{x \to \infty } f(x)$ exists.
• If $\lim _{x \to \infty } f(x)$ does not exist, then $\lim _{n \to \infty } a_n$ does not exist.
• If $\lim _{x \to \infty } f(x)$ does not exist, then $\lim _{n \to \infty } a_n$ may still exist.
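A standard cautionary example of this kind (possibly not the text's exact one) is $f(x) = \sin(\pi x)$: the function has no limit as $x \to \infty$ because it oscillates between $-1$ and $1$, yet the sequence $a_n = f(n) = \sin(\pi n)$ is identically $0$ and so converges.

```python
import math

# f(x) = sin(pi x) oscillates forever as x -> infinity, so lim f(x) does
# not exist; but at every natural number n, sin(pi n) = 0, so the
# sequence a_n = f(n) converges to 0.
f = lambda x: math.sin(math.pi * x)

print(all(abs(f(n)) < 1e-9 for n in range(1, 1000)))  # True: a_n is (numerically) 0
print(f(0.5), f(1.5))  # values near 1 and -1: f itself keeps oscillating
```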

This might lead us to believe that we need to develop a whole new arsenal of techniques in order to determine if limits of sequences exist, but there is good news.

### Calculating limits of sequences

If we think about the theorem a bit further, the conclusion of the theorem and the content of the preceding example should seem reasonable. If the values of $f(x)$ become arbitrarily close to a number $L$ for all sufficiently large $x$-values, then the same must be true when we only consider some of those values, namely the natural numbers. However, if we only know what happens along some arbitrarily large $x$-values, we cannot say what happens for all of them!

In practice, we use the above theorem to compute limits without explicitly exhibiting the function of a real variable from which the limit is derived.
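For instance, with a hypothetical sequence (the section's own examples are not reproduced here), treating $n$ as a real variable in $a_n = \frac{3n^2+n}{5n^2-4}$ gives $\lim _{x\to \infty } f(x) = \frac{3}{5}$, and by the theorem the sequence inherits that limit. A quick numeric check:

```python
def a(n):
    # hypothetical sequence; its real-variable counterpart is
    # f(x) = (3x^2 + x) / (5x^2 - 4), whose limit at infinity is 3/5
    return (3 * n**2 + n) / (5 * n**2 - 4)

# The terms approach 3/5 = 0.6 as n grows.
for n in (10, 10**3, 10**6):
    print(n, a(n))
print(abs(a(10**6) - 3 / 5) < 1e-6)  # True
```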

### Computing limits of sequences using dominant term analysis

The last example shows that, for many sequences, we can employ the same techniques we used previously to compute limits. While algebraic techniques and L’Hôpital’s rule are useful, in many of the following sections being able to determine limits quickly is an important skill.

In the preceding example, we say that the dominant term in the numerator is $n^3$ and that the dominant term in the denominator is $-4n^4$ because these terms are the only ones that are relevant when finding the limit.
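Since the example itself is not reproduced here, the sketch below builds a hypothetical sequence consistent with the dominant terms named above ($n^3$ in the numerator, $-4n^4$ in the denominator) and compares it with the ratio of those dominant terms, $\frac{n^3}{-4n^4} = -\frac{1}{4n}$:

```python
def a(n):
    # hypothetical sequence matching the dominant terms named in the text:
    # numerator dominated by n^3, denominator dominated by -4n^4
    return (n**3 + 2 * n) / (-4 * n**4 + 7 * n)

def dominant(n):
    # ratio of the dominant terms alone: n^3 / (-4n^4) = -1/(4n)
    return n**3 / (-4 * n**4)

# Both columns shrink toward 0 together, which is why only the
# dominant terms matter when finding the limit.
for n in (10, 100, 1000):
    print(n, a(n), dominant(n))
```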

Sometimes, this technique can be used to find limits where L’Hôpital’s rule or an algebraic approach would be complicated.

#### Growth rates

The preceding examples illustrate that higher positive powers of $n$ grow more quickly than lower positive powers of $n$. We can introduce a little notation that succinctly captures the rate at which the terms of a sequence grow.

In essence, writing $a_n \ll b_n$ says that the sequence $(b_n)$ grows much faster than $(a_n)$.

Many sequences of interest involve terms other than powers of $n$. It is often useful to understand how different types of functions grow relative to each other.

The first inequality in this theorem essentially guarantees that any power of $\ln (n)$ grows more slowly than any power of $n$. For example, $(\ln n)^2 \ll \sqrt {n}$, even though $(\ln n)^2$ is larger for moderately sized $n$.
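A numeric sketch of the comparison $(\ln n)^2 \ll \sqrt{n}$ (an illustrative instance, chosen because its crossover happens at computable sizes): the ratio $(\ln n)^2 / \sqrt{n}$ starts above $1$ for moderate $n$ but tends to $0$.

```python
import math

# (ln n)^2 is larger than sqrt(n) for moderate n, but the ratio
# (ln n)^2 / sqrt(n) eventually decreases toward 0.
ratio = lambda n: math.log(n) ** 2 / math.sqrt(n)

for n in (100, 10**4, 10**8, 10**12):
    print(n, ratio(n))
```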

This allows us to extend the dominant term idea to more complicated expressions.

#### The squeeze theorem

Previously, when considering limits, one of our techniques was to replace complicated functions by simpler functions. The Squeeze Theorem tells us one situation where this is possible.

Let’s see an example.

The squeeze theorem is helpful in establishing a more general result about geometric sequences.

Of course, when $r$ is positive, the squeeze theorem is not necessary, but it is useful when establishing the convergence results for $-1 < r < 0$, as in the preceding example.
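For a negative ratio, say $r = -\tfrac{1}{2}$ (an illustrative value), the terms $r^n$ are squeezed between $-|r|^n$ and $|r|^n$, and both bounds tend to $0$:

```python
# For -1 < r < 0, every term satisfies -|r|^n <= r^n <= |r|^n,
# and both bounding sequences tend to 0, so r^n -> 0 by the squeeze theorem.
r = -0.5  # illustrative value

for n in (1, 5, 10, 20):
    lower, term, upper = -abs(r) ** n, r ** n, abs(r) ** n
    print(lower <= term <= upper, term)
```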

### Existence results for limits

Perhaps the best way to determine whether the limit of a sequence exists is to compute it. Even though we’ve been working with sequences that are generated by an explicit formula in this section thus far, not all sequences are defined this way. Sometimes, we’ll only have a recursive description of a sequence rather than an explicit one, and sometimes we will have neither. In many of the coming sections, we will only have a recursive description of a sequence, so we want to determine a good approach for determining whether a limit exists without having to compute it directly. To do this, we introduce some terminology focused on the relationships between the terms of a sequence.

Many facts are true for sequences that are either increasing or decreasing; to talk about this situation without constantly saying “either increasing or decreasing,” we can introduce a single word to cover both cases.

For many instances, we will only need that certain sequences are eventually monotonic, and we will see this in the sections that follow.

If an arithmetic sequence $a_n = m\cdot n + b$ is monotonic, what must be true about $m$ and $b$?

The sign of $m$ (is positive / is negative / does not matter), and the sign of $b$ (is positive / is negative / does not matter).

If a geometric sequence $a_n = a_1 \cdot r^{n-1}$ is monotonic, what must be true about $a_1$ and $r$?

The sign of $a_1$ (is positive / is negative / does not matter), and the sign of $r$ (is positive / is negative / does not matter).

Sometimes we want to classify sequences for which the terms do not get too big or too small.

So what does this definition actually say? Essentially, a sequence is bounded above if its terms cannot become arbitrarily large and positive, bounded below if its terms cannot become arbitrarily large and negative, and bounded if neither can happen.

True or False: If a sequence $(a_n)_{n=0}^\infty$ is increasing, then it is bounded below by $a_0$. (True / False)

True or False: If a sequence $(a_n)_{n=0}^\infty$ is decreasing, then it is bounded above by $a_0$. (True / False)

So, what do these previous definitions have to do with the idea of a limit? Essentially, there are three reasons that a sequence may diverge:

• the terms eventually are either always positive or always negative but become arbitrarily large in magnitude.
• the terms are never eventually monotonic.
• the terms are never eventually monotonic and become arbitrarily large in magnitude.
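Three small sequences, one per mode of divergence, make the list concrete (these are illustrative choices, not the text's examples):

```python
# Three illustrative divergent sequences, one for each mode of divergence:
unbounded_monotonic = lambda n: n                  # grows without bound
bounded_oscillating = lambda n: (-1) ** n          # never eventually monotonic
unbounded_oscillating = lambda n: (-1) ** n * n    # both failures at once

for a in (unbounded_monotonic, bounded_oscillating, unbounded_oscillating):
    print([a(n) for n in range(1, 7)])
```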

Let’s think about the terminology we introduced.

Think about the following statements and choose the correct option.
• If we know that a sequence is monotonic and its limit does not exist, then (the terms become arbitrarily large in magnitude / the terms are never eventually monotonic / the terms are never eventually monotonic and become arbitrarily large in magnitude).
• If we know that a sequence is bounded and its limit does not exist, then (the terms become arbitrarily large in magnitude / the terms are never eventually monotonic / the terms are never eventually monotonic and become arbitrarily large in magnitude).

We can now state an important theorem:

To think about the statement of the theorem: if we have a sequence that is bounded, the only way it could diverge is if its terms are never eventually monotonic. However, if we know the sequence is also monotonic, this cannot happen! Thus, the sequence cannot diverge, so it must have a limit.

In short, bounded monotonic sequences always converge, though we can’t necessarily describe the number to which they converge. Let’s try some examples.
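A classic recursively defined instance (an illustrative choice, not necessarily one of the text's examples) is $a_1 = 0$, $a_{n+1} = \sqrt{2 + a_n}$: the terms are increasing and bounded above by $2$, so the theorem guarantees convergence. Here the limit happens to be computable, since $L = \sqrt{2 + L}$ forces $L = 2$.

```python
import math

# Recursive sequence: a_1 = 0, a_{n+1} = sqrt(2 + a_n).
# Increasing and bounded above by 2, so it converges (to 2 in this case).
a = 0.0
terms = [a]
for _ in range(20):
    a = math.sqrt(2 + a)
    terms.append(a)

print(all(x < y for x, y in zip(terms, terms[1:])))  # True: increasing
print(all(x <= 2 for x in terms))                    # True: bounded above by 2
print(terms[-1])                                     # close to 2
```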

In the previous examples, we could write down a function $f(x)$ corresponding to each sequence and apply the theorem from earlier in the section. However, this is not always possible.