## Intro

In the third century B.C., Archimedes used limits to approximate the area of curved figures and the volume of spheres.

This work of Archimedes was lost and remained unknown to the public until 1906, long after limits had been rediscovered by others.

Despite this posthumous recognition, Archimedes is today better known for running naked through his home town of Syracuse.

As the story goes, Archimedes jumped out of the bath and didn't even bother getting dressed before he started running home, enthusiastically screaming *eureka, eureka!* (I've got it, I've got it!).

The reason for his excitement was a genius idea that came to him out of the blue, and it is after this incident that we now call such a sudden insight a *eureka moment*.

## Concept

For some functions, there may be certain values of $x$ where the function is not defined.

As an example, we cannot divide by $0$, so a function such as:

$$f(x) = \frac{1}{x}$$

will not have a value at $x = 0$.

With this in mind, we are often interested in studying how the function behaves as we close in on such points.

This is where the notion of limits comes in, which looks at the function's value in the vicinity of a point.
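To get a feel for this, we can tabulate a function like $f(x) = \frac{1}{x}$ at points closer and closer to $x = 0$, where it is undefined (a small illustrative sketch in Python):

```python
# Probe f(x) = 1/x at points stepping closer and closer to x = 0,
# where the function itself has no value.
def f(x):
    return 1 / x

for x in [0.1, 0.01, 0.001, 0.0001]:
    print(f"f({x}) = {f(x)}")
# The values grow without bound as x closes in on 0 from the right.
```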

## Math

Consider again the function from the previous section:

$$f(x) = \frac{1}{x}$$

As $x$ tends to $0$ from the positive side, $\frac{1}{x}$ grows larger and larger. Consequently:

$$\lim_{x \to 0^+} \frac{1}{x} = \infty$$

Generally, we denote the limit of a function $f$ at any point $a$ as:

$$\lim_{x \to a} f(x)$$

Similarly, we can study the limit as the value of $x$ tends to some large positive number:

$$\lim_{x \to \infty} f(x)$$

or some very large negative number:

$$\lim_{x \to -\infty} f(x)$$

## Limit at a point

The limit tells us how the function behaves as we keep taking steps closer and closer to a particular value.

For example, have a look at the function $f(x) = x^2$.

As $x$ approaches $2$, the function value tends to $4$. This is written as

$$\lim_{x \to 2} x^2 = 4$$

In this case, plugging in $x = 2$ will give you the limit as $x$ approaches $2$. Easy, right?

However, this case is a luxury. Consider the function:

$$f(x) = \frac{\sin(x)}{x}$$

The function is undefined for $x = 0$, since we can't divide by $0$. Judging by the function graph, it seems as if:

$$\lim_{x \to 0} \frac{\sin(x)}{x} = 1$$

What do we *actually* mean when we're saying "as $x$ approaches $0$, the function value tends to $1$"? In symbols, this is:

$$\lim_{x \to 0} f(x) = 1$$

Well, this means that the function value can get very close to $1$ if we take enough steps towards $0$ on the $x$-axis.

And by "very close to $1$", we mean that $f(x)$ falls within the narrow corridor pictured below.

Now assume that

$$\lim_{x \to 0} f(x) = 1$$

You give me a tiny corridor around $1$, I'll find an $x$ value close enough to $0$ so that $f(x)$ fits. It's just a matter of moving closer to $0$.

Ta-da. That's essentially the definition of a limit.
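As a numerical sketch of this corridor game, take $f(x) = \frac{\sin(x)}{x}$, which tends to $1$ as $x$ approaches $0$ (a standard limit): whatever corridor half-width you pick, stepping toward $0$ eventually lands $f(x)$ inside it.

```python
import math

def f(x):
    # sin(x)/x tends to 1 as x approaches 0 (undefined at x = 0 itself)
    return math.sin(x) / x

def x_inside_corridor(eps):
    """Find an x near 0 such that f(x) lies in the corridor (1 - eps, 1 + eps)."""
    x = 1.0
    while abs(f(x) - 1) >= eps:
        x /= 2  # just keep moving closer to 0
    return x

for eps in [0.1, 0.001, 0.00001]:
    x = x_inside_corridor(eps)
    print(f"corridor half-width {eps}: x = {x}, f(x) = {f(x)}")
```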

You give me a tiny corridor, I'll find an $x$ value close enough to $0$ so that $f(x)$ fits

### Rules for calculating limits

Even for simple functions, calculating limits directly from the definition is quite laborious.

Here are a few rules for computing limits.

Let $f$ and $g$ be functions such that $f(x) \to A$ and $g(x) \to B$, when $x \to a$. Then:

a) $f(x) + g(x) \to A + B$ when $x \to a$,

b) $f(x) \cdot g(x) \to A \cdot B$, when $x \to a$,

c) if $B \neq 0$, then $\dfrac{f(x)}{g(x)} \to \dfrac{A}{B}$ when $x \to a$,

d) if $f(x) \leq g(x)$ for all $x$, then $A \leq B$.
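These rules can be spot-checked numerically. Here is a sketch with the illustrative choice $f(x) = x^2$ and $g(x) = 3x$ at the point $a = 2$, so $A = 4$ and $B = 6$:

```python
# Spot-check of limit rules a)-c) for f(x) = x**2 and g(x) = 3x near a = 2.
def f(x):
    return x ** 2

def g(x):
    return 3 * x

A, B = 4, 6        # the limits of f and g at a = 2
x = 2 + 1e-8       # a point very close to a

assert abs((f(x) + g(x)) - (A + B)) < 1e-6   # rule a): sum of limits
assert abs((f(x) * g(x)) - (A * B)) < 1e-6   # rule b): product of limits
assert abs((f(x) / g(x)) - (A / B)) < 1e-6   # rule c): quotient (B != 0)
print("rules a)-c) hold numerically near x = 2")
```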

Furthermore, there's a rule for the composition of limits.

Let $f$ and $g$ be functions such that:

$$\lim_{x \to a} g(x) = b, \qquad \lim_{u \to b} f(u) = f(b)$$

Then:

$$\lim_{x \to a} f(g(x)) = f(b)$$

In the last theorem, all we have is:

$$g(x) \to b \quad \text{as} \quad x \to a$$

The value we feed $f$ approaches $b$, so we're effectively saying:

$$\lim_{x \to a} f(g(x)) = \lim_{u \to b} f(u) = f(b)$$

## One-sided limits

The limit of a function $f(x)$, for some value $a$, is a number that lets us know how the function behaves as we approach $a$.

With that in mind, we can talk about the limit of $f(x)$ as we move toward $a$ from the right or from the left. This is what is meant by *one-sided limits*.

Moving towards $a$ from the right is also called approaching $a$ from *above*, which we denote by:

$$\lim_{x \to a^+} f(x)$$

Similarly, as we move toward $a$ from the left we say that we approach $a$ from *below*:

$$\lim_{x \to a^-} f(x)$$

In some cases, the two limits are the same, and at other times they are not.

Limits often reveal useful information about a function at a point where it is undefined.

Look at the following function:

$$f(x) = \frac{x^2 - 1}{x - 1}$$

We may be tempted to simplify the expression as:

$$\frac{x^2 - 1}{x - 1} = \frac{(x + 1)(x - 1)}{x - 1} = x + 1$$

When doing so, we must be careful not to include the point $x = 1$, where the denominator would be zero and the function undefined.

Although the function has no value right at $x = 1$, we can see that as $x$ approaches $1$ from either side, the value of the function approaches $2$, and so:

$$\lim_{x \to 1^+} f(x) = \lim_{x \to 1^-} f(x) = 2$$

or simply:

$$\lim_{x \to 1} f(x) = 2$$

Now consider this piecewise defined function instead:

$$f(x) = \begin{cases} 1, & x \geq 0 \\ -1, & x < 0 \end{cases}$$

Here we see that:

$$\lim_{x \to 0^+} f(x) = 1$$

while

$$\lim_{x \to 0^-} f(x) = -1$$

Since:

$$\lim_{x \to 0^+} f(x) \neq \lim_{x \to 0^-} f(x)$$

then we can conclude that the limit of $f(x)$ does not exist at $x = 0$.
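The disagreement between the two one-sided limits can be seen by sampling a step function numerically (a sketch using the illustrative sign-style function $f(x) = 1$ for $x \geq 0$ and $f(x) = -1$ for $x < 0$):

```python
# Approach 0 from above and from below for a step function whose
# one-sided limits differ, so the two-sided limit does not exist.
def f(x):
    return 1 if x >= 0 else -1

from_above = [f(h) for h in (0.1, 0.01, 0.001)]    # x -> 0 from the right
from_below = [f(-h) for h in (0.1, 0.01, 0.001)]   # x -> 0 from the left

print(from_above)  # [1, 1, 1]    -> limit from above is 1
print(from_below)  # [-1, -1, -1] -> limit from below is -1
```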

## Limits at infinity

### Intuition

Some functions have the property that, as $x$ approaches infinity, $f(x)$ tends to some value $L$.

We say that *the limit of $f(x)$ when $x$ tends to infinity is $L$*. It's written as:

$$\lim_{x \to \infty} f(x) = L$$

What we mean by this can be illustrated as a game. The game is between you and me, and I am destined to lose.

Say we have a function $f$ which has the limit $L$ as $x$ tends to $\infty$. I'll start the game by giving you a quite narrow corridor, parallel to the $x$-axis and centered around $L$.

Then, as you walk along the $x$-axis, you'll find a value $x_0$ for which $f(x)$ stays within the corridor for all values of $x$ greater than this $x_0$.

My turn again. I'll try to make it hard, making my corridor even narrower. However, you continue your trip along the $x$-axis and have no problem finding a new $x_0$ so that again, for all $x$ further to the right, $f(x)$ stays nicely within my corridor.

The function is said to converge if, no matter how narrow the corridor, there is a value $x_0$ for which $f(x)$ stays inside the corridor for all $x$ greater than $x_0$.

The game can go on forever, but you can always counter my corridor width with a bigger $x_0$-value and win the round.
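The game can be sketched in code. Assume, for illustration, $f(x) = \frac{x + 1}{x}$, which tends to $L = 1$: for any corridor half-width you hand me, walking right along the $x$-axis eventually finds a winning $x_0$.

```python
def f(x):
    # (x + 1)/x = 1 + 1/x tends to L = 1 as x grows
    return (x + 1) / x

def winning_x0(eps):
    """Find an x0 after which f stays inside the corridor (1 - eps, 1 + eps)."""
    x0 = 1.0
    while abs(f(x0) - 1) >= eps:
        x0 *= 2  # keep walking right along the x-axis
    return x0

for eps in [0.1, 0.01, 0.0001]:
    print(f"corridor half-width {eps}: x0 = {winning_x0(eps)}")
# Since 1/x only shrinks as x grows, f stays inside for every x > x0.
```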

### Rules for calculating limits at infinity

As for limits at a point, there are some helpful rules for calculating limits at infinity. They are given by the following theorem:

Let $f$ and $g$ be functions such that $f(x) \to A$ and $g(x) \to B$, when $x \to \infty$. Then:

a) $f(x) + g(x) \to A + B$ when $x \to \infty$

b) $f(x) \cdot g(x) \to A \cdot B$, when $x \to \infty$

c) if $B \neq 0$, then $\dfrac{f(x)}{g(x)} \to \dfrac{A}{B}$ when $x \to \infty$

d) if $f(x) \leq g(x)$ for all $x$, then $A \leq B$

In rule c), the point $x_0$ after which we consider $x$ needs to be chosen so that $g(x) \neq 0$ for every $x > x_0$.

Finally, there's a practical rule for the limit of $f(g(x))$.

Let $f$ and $g$ be functions such that:

$$\lim_{x \to \infty} g(x) = b, \qquad \lim_{u \to b} f(u) = f(b)$$

Then:

$$\lim_{x \to \infty} f(g(x)) = f(b)$$

This last theorem may look messy at first, so let's break it down. Starting from the inside, we notice that we are basically plugging an argument which tends to infinity into $g$:

$$g(x) \to b \quad \text{as} \quad x \to \infty$$

Thus, what we actually end up with is just:

$$\lim_{x \to \infty} f(g(x)) = \lim_{u \to b} f(u) = f(b)$$

## Infinite limits

Given a function:

$$f(x) = \frac{1}{x^2}$$

As $x$ approaches $0$, $f(x)$ becomes bigger and bigger - infinitely big. Its function graph shoots off to infinity, so we write:

$$\lim_{x \to 0} \frac{1}{x^2} = \infty$$

But wait, $\infty$ isn't a number. How should we deal with the expression above?

Let's play a game. I give you a number $M$, you give me an $x$ value such that $f(x) > M$. Ready?

As a warm-up, I'll give you $M = 100$. You counter by giving me $x = 0.01$, since $f(0.01) = 10000$. Easy peasy.

How about $M = 10^6$? Just use $x = 0.0001$.

Let's try another number. I say $M = 10^{100}$. Well, that's also easy. You can just use $x = 10^{-51}$. At this point, I give up.

This game illustrates what we mean by expressions like

$$\lim_{x \to 0} f(x) = \infty$$

or

$$f(x) \to \infty \quad \text{as} \quad x \to 0$$

It means that whatever number $M$ I give you, you can always find an $x$ such that $f(x) > M$.

Whichever value $M$ I give you, you can always find a value $x$ such that $f(x) > M$

It means I'll lose, and you'll win.
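Your winning strategy can even be written down explicitly. As a sketch for $f(x) = \frac{1}{x^2}$: given any bound $M$, the choice $x = \frac{1}{2\sqrt{M}}$ gives $f(x) = 4M > M$.

```python
import math

def f(x):
    return 1 / x ** 2

def winning_x(M):
    """Return an x close to 0 with f(x) > M (here f(x) = 4*M exactly)."""
    return 1 / (2 * math.sqrt(M))

for M in [100, 10 ** 6]:
    x = winning_x(M)
    print(f"M = {M}: x = {x}, f(x) = {f(x)}")
```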

## The squeeze theorem

Whenever possible, mathematicians like finding descriptive names for theorems. For instance, there's a *Hairy Ball theorem*, and a *Law of the Unconscious Statistician*.

However, it's hard to beat the *Two Officers and a Drunk theorem*. (Yes, this really is a name for a theorem!) But it's more commonly known by a shorter alias, the *Squeeze theorem*.

So what does the theorem say?

Imagine two police officers escorting a drunk prisoner between them. They all drive at the same speed.

The prisoner may wobble about between the officers, but to no avail: since both police officers drive to the police station, the prisoner ends up there as well.

This idea is at the heart of the Squeeze theorem.

In the Squeeze theorem, $f$ is analogous to the prisoner, whereas the two officers correspond to two bounding functions $g$ and $h$.

If the functions $g$ and $h$ have the same limit $L$ at the point $a$, and:

$$g(x) \leq f(x) \leq h(x)$$

then:

$$\lim_{x \to a} f(x) = L$$

The Squeeze theorem can also be applied to limits at infinity, like

$$\lim_{x \to \infty} \frac{\sin(x)}{x} = 0$$

which follows since $-\frac{1}{x} \leq \frac{\sin(x)}{x} \leq \frac{1}{x}$ for $x > 0$, and both bounds tend to $0$.
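A classic prisoner for the theorem (an illustrative example, not from the text) is $f(x) = x^2 \sin(1/x)$, which wobbles wildly near $0$ but is escorted by $g(x) = -x^2$ and $h(x) = x^2$, both tending to $0$:

```python
import math

def f(x):
    return x ** 2 * math.sin(1 / x)   # the wobbling prisoner

def g(x):
    return -x ** 2                    # lower officer

def h(x):
    return x ** 2                     # upper officer

for x in [0.1, 0.01, 0.001]:
    assert g(x) <= f(x) <= h(x)       # the prisoner stays between the officers
    print(f"x = {x}: f(x) = {f(x):.3e}")
# Both bounds tend to 0 at x = 0, so f is squeezed to the limit 0 as well.
```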

## Standard limits

Let $f(x)$ and $g(x)$ be two functions that tend to the same value as $x$ approaches some point $a$.

What then is the following limit:

$$\lim_{x \to a} \frac{f(x)}{g(x)}$$

For expressions consisting of a function divided by another function, where the two tend toward the same value as $x$ approaches some point, it may not be obvious what value the whole expression tends toward.

Both functions could for example grow larger and larger as $x$ tends to infinity, but which one of them grows faster?

This question is often interesting to computer scientists studying time complexity, who aim to compare the speed of different algorithms.

To help solve this problem, there are a handful of *standard limits* with known values we can use.

### Standard limit 1

$$\lim_{x \to \infty} \frac{a^x}{x^n} = \infty, \qquad a > 1$$

In the graph, the exponential and the power function start off at comparable values, but for any exponential function with a base greater than $1$, the exponential will eventually grow faster than the power function in the denominator.

### Standard limit 2

$$\lim_{x \to \infty} \frac{\log_a(x)}{x^n} = 0, \qquad a > 1, \; n > 0$$

Like in the last example, the choice of base for the logarithm and exponent for the polynomial does not matter. The logarithmic curve will always flatten out and be outrun by the polynomial.

### Standard limit 3

$$\lim_{x \to 0} \frac{\sin(x)}{x} = 1$$

Notice how the graphs of $\sin(x)$ and $x$ follow each other around $x = 0$.
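All three standard limits can be sampled numerically as a quick sanity check (with the illustrative choices $a = e$ and $n = 2$ for the first, and base $e$ with $n = 1$ for the second):

```python
import math

# 1) Exponential beats power: e**x / x**2 blows up as x grows.
print([math.exp(x) / x ** 2 for x in (10, 20, 40)])

# 2) Power beats logarithm: ln(x) / x shrinks toward 0 as x grows.
print([math.log(x) / x for x in (10, 1000, 10 ** 6)])

# 3) sin(x) / x approaches 1 as x approaches 0.
print([math.sin(x) / x for x in (0.1, 0.01, 0.001)])
```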