We want to prove that f(x) = 2x − 1 goes to infinity as x goes to infinity. To this end we have to show that we can win the game from the definition. We will use the general version with neighborhoods.
Assume we are given some ε > 0. We need to find a δ > 0 so that if x ∈ Uδ(∞), then also f(x) ∈ Uε(∞).
We start by translating this into inequalities. We need to make sure that f(x) > 1/ε, and we do it by restricting x to the neighborhood of infinity given by the inequality x > 1/δ.
We start by exploring the desired inequality:
f(x) > 1/ε
2x − 1 > 1/ε
2x > 1/ε + 1
x > (1/ε + 1)/2.
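As a quick numerical spot-check of this chain (a sketch in Python with our own helper names, not part of the proof), the first and last inequalities should agree for any x and any ε > 0:

```python
import random

# Spot-check (not a proof): for random samples, the inequality
# f(x) > 1/eps and its rearranged form x > (1/eps + 1)/2 must agree.

def f(x):
    return 2 * x - 1

random.seed(0)
for _ in range(10_000):
    eps = random.uniform(0.001, 10)
    x = random.uniform(-100, 100)
    assert (f(x) > 1 / eps) == (x > (1 / eps + 1) / 2)
print("equivalence holds on all samples")
```

Of course, random sampling only illustrates the equivalence; the proof itself rests on the algebra above.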
Note that all the operations were equivalent, so the first and last inequalities are equivalent. Now let's review the situation. We want to make the first inequality true, and what we can do is make x as large as we wish by choosing the right δ, since x ∈ Uδ(∞) means precisely x > 1/δ. Now it seems clear what we should do: we choose δ so that 1/δ = (1/ε + 1)/2, that is,
δ = 2/(1/ε + 1).
Now we check that this choice fulfills the requirement from the definition. Somebody gave us an arbitrary ε > 0 (that "arbitrary" is the key here: we did it for all epsilons, not just a certain nice one). We then decided to choose δ = 2/(1/ε + 1). Is this the right delta?
Let's check. Any x that satisfies x ∈ Uδ(∞) by the definition of this neighborhood satisfies
x > 1/δ = (1/ε + 1)/2.
By the equivalence established above, such an x therefore also satisfies
f(x) = 2x − 1 > 1/ε,
which means that f(x) ∈ Uε(∞), exactly as needed: we won the game.
The proof is complete.
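For readers who like to see the game played out, here is a small numeric illustration (a sketch of our own, not part of the proof), with membership in the neighborhoods written out as the inequalities used above:

```python
# Illustration of the game (not a proof): the opponent picks eps,
# we answer with delta = 2/(1/eps + 1), and every x in U_delta(inf),
# i.e. every x > 1/delta, must land f(x) in U_eps(inf), i.e. f(x) > 1/eps.

def f(x):
    return 2 * x - 1

def our_delta(eps):
    # Our choice from the proof.
    return 2 / (1 / eps + 1)

for eps in [2.0, 0.5, 0.01, 1e-6]:
    d = our_delta(eps)
    # Sample a few x in U_delta(inf): anything strictly beyond 1/d.
    for x in (1 / d + 1e-9, 1 / d + 1, 10 / d):
        assert x > 1 / d and f(x) > 1 / eps
print("we win for every sampled eps")
```

The samples only probe a few points of each neighborhood; the proof shows the inequality holds for all of them at once.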