Revision as of 01:15, 12 May 2014
Limit of a Function at a Point
by: Michael Yeh, proud Member of the Math Squad.
keyword: tutorial, limit, function, sequence
INTRODUCTION
Provided here is a brief introduction to the concept of "limit," which features prominently in calculus. We discuss the limit of a function at a point; to help motivate the definition, we begin with continuity at a point. Unless otherwise mentioned, all functions here will have domain and range $ \mathbb{R} $, the real numbers. Words such as "all," "every," "each," "some," and "there is/are" are quite important here; read carefully!
Continuity at a point
Let's consider the following three functions along with their graphs (in blue). The red dots in each correspond to $ x=0 $, e.g. for $ f $, the red dot is the point $ (0,f(0))=(0,0) $. Ignore the red dashed lines for now; we will explain them later.
- $ \displaystyle f(x)=x^3 $
- $ g(x)=\begin{cases}-x^2-\frac{1}{2} &\text{if}~x<0\\ x^2+\frac{1}{2} &\text{if}~x\geq 0\end{cases} $
- $ h(x)=\begin{cases} \sin\left(\frac{1}{x}\right) &\text{if}~x\neq 0\\ 0 &\text{if}~x=0\end{cases} $
We can see from the graphs that $ f $ is "continuous" at $ 0 $, and that $ g $ and $ h $ are "discontinuous" at $ 0 $. But, what exactly do we mean? Intuitively, $ f $ seems to be continuous at $ 0 $ because $ f(x) $ is close to $ f(0) $ whenever $ x $ is close to $ 0 $. On the other hand, $ g $ appears to be discontinuous at $ 0 $ because there are points $ x $ which are close to $ 0 $ but such that $ g(x) $ is far away from $ g(0) $. The same observation applies to $ h $.
Let's make these observations more precise. First, we will try to estimate $ f(0) $ with error at most $ 0.25 $, say. In the graph of $ f $, we have marked off a band of width $ 0.5 $ about $ f(0) $. So, any point in the band will provide a good approximation here. As a first try, we might think that if $ x $ is close enough to $ 0 $, then $ f(x) $ will be a good estimate of $ f(0) $. Indeed, we see from the graph that for any $ x $ in the interval $ (-\sqrt[3]{0.25},\sqrt[3]{0.25}) $, $ f(x) $ lies in the band (or if we wish to be more pedantic, we would say that $ (x,f(x)) $ lies in the band). So, "close enough to $ 0 $" here means in the interval $ (-\sqrt[3]{0.25},\sqrt[3]{0.25}) $; note that any point which is close enough to $ 0 $ provides a good approximation of $ f(0) $.
There is nothing special about our error bound $ 0.25 $. Choose a positive number $ \varepsilon $, and suppose we would like to estimate $ f(0) $ with error at most $ \varepsilon $. Then, as above, we can find some interval $ \displaystyle(-\delta,\delta) $ about $ 0 $ (if you like to be concrete, any $ \displaystyle\delta $ such that $ 0<\delta<\sqrt[3]{\varepsilon} $ will do) such that $ f(x) $ is a good estimate for $ f(0) $ for any $ x $ in $ \displaystyle(-\delta,\delta) $. In other words, for any $ x $ which is close enough to $ 0 $, $ f(x) $ will be no more than $ \varepsilon $ away from $ f(0) $.
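The argument above can be spot-checked numerically. The sketch below (function names are my own; this is a sanity check by sampling, not a proof, since a proof must cover *all* $ x $ in the interval) verifies that choosing $ \delta=\sqrt[3]{\varepsilon} $ keeps $ f(x)=x^3 $ within $ \varepsilon $ of $ f(0)=0 $ on $ (-\delta,\delta) $:

```python
# Numerical spot-check (not a proof) that delta = epsilon ** (1/3)
# works for f(x) = x^3 at 0: if |x| < delta, then |f(x) - f(0)| < epsilon.
def f(x):
    return x ** 3

def within_band(eps, n_samples=1000):
    delta = eps ** (1 / 3)
    # Sample points strictly inside (-delta, delta).
    xs = [-delta + 2 * delta * k / (n_samples + 1) for k in range(1, n_samples + 1)]
    return all(abs(f(x) - f(0)) < eps for x in xs)

print(within_band(0.25))  # True
print(within_band(1e-6))  # True
```

Since $ |x|<\delta=\sqrt[3]{\varepsilon} $ forces $ |x^3|<\varepsilon $, every sampled point lands in the band, whatever error bound we pick.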
Can we do the same for $ g $? That is, if $ x $ is close enough to $ 0 $, then will $ g(x) $ be a good estimate of $ g(0) $? Well, we see from the graph that $ g(0.25) $ provides a good approximation to $ g(0) $. But if $ 0.25 $ is close enough to $ 0 $, then certainly $ -0.25 $ should be too; however, the graph shows that $ g(-0.25) $ is not a good estimate of $ g(0) $. In fact, for any $ x>0 $, $ g(-x) $ will never be a good approximation for $ g(0) $, even though $ x $ and $ -x $ are the same distance from $ 0 $.
In contrast to $ f $, we see that for any interval $ \displaystyle(-\delta,\delta) $ about $ 0 $, we can find an $ x $ in $ \displaystyle(-\delta,\delta) $ such that $ g(x) $ is more than $ 0.25 $ away from $ g(0) $.
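A short sketch makes this failure concrete (again, illustrative code with names of my choosing): no matter how small $ \delta $ is, the point $ x=-\delta/2 $ lies in $ (-\delta,\delta) $, yet $ g(x) $ is at least $ 1 $ away from $ g(0)=\tfrac{1}{2} $, which is far more than our error bound of $ 0.25 $.

```python
# For g, no delta works with eps = 0.25: the point x = -delta/2 lies inside
# (-delta, delta), yet |g(x) - g(0)| = x^2 + 1 >= 1, which is never < 0.25.
def g(x):
    return -x**2 - 0.5 if x < 0 else x**2 + 0.5

def counterexample(delta):
    x = -delta / 2           # inside (-delta, delta)
    return abs(g(x) - g(0))  # always >= 1

for delta in (1.0, 0.1, 1e-8):
    print(counterexample(delta) >= 0.25)  # True each time
```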
The same is true for $ h $. Whenever we find an $ x_1 $ such that $ h(x_1) $ lies in the band, we can always find a point $ x_2 $ such that 1) $ x_2 $ is just as close or closer to $ 0 $ and 2) $ h(x_2) $ lies outside the band. So, it is not true that if $ x $ is close enough to $ 0 $, then $ h(x) $ will be a good estimate for $ h(0) $.
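We can exhibit such points explicitly. Since $ \sin\left(\frac{1}{x}\right)=1 $ whenever $ \frac{1}{x}=\left(2n+\frac{1}{2}\right)\pi $, every interval $ (-\delta,\delta) $ contains a point where $ h $ equals $ 1 $, far outside the $ 0.25 $ band around $ h(0)=0 $. A hypothetical helper (the function name is mine) that produces such a point:

```python
import math

# For h(x) = sin(1/x), every interval (-delta, delta) contains a point
# where h equals 1, which is much more than 0.25 away from h(0) = 0.
def point_with_h_equal_one(delta):
    # Choose n so that x = 1 / ((2n + 0.5) * pi) lands inside (0, delta);
    # then 1/x = (2n + 0.5) * pi and sin(1/x) = 1.
    n = math.ceil(1 / (2 * math.pi * delta))  # any larger n works too
    return 1 / ((2 * n + 0.5) * math.pi)

for delta in (1.0, 0.01, 1e-6):
    x = point_with_h_equal_one(delta)
    assert 0 < x < delta
    print(abs(math.sin(1 / x)))  # ~1.0 (up to floating-point error)
```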
Let's summarize what we have found. For $ f $, we saw that for each $ \varepsilon>0 $, we can find an interval $ \displaystyle(-\delta,\delta) $ about $ 0 $ ($ \displaystyle\delta $ depends on $ \varepsilon $) so that for every $ x $ in $ \displaystyle(-\delta,\delta) $, $ |f(x)-f(0)|<\varepsilon $. However, $ g $ does not satisfy this property. More specifically, there is an $ \varepsilon>0 $, namely $ \varepsilon=0.25 $, so that for any interval $ \displaystyle(-\delta,\delta) $ about $ 0 $, we can find an $ x $ in $ \displaystyle(-\delta,\delta) $ such that $ |g(x)-g(0)|\geq\varepsilon $. The same is true of $ h $.
Now we state the formal definition of continuity at a point. Compare this carefully with the previous paragraph.
DEFINITION. Let $ f $ be a function from $ \displaystyle A $ to $ \mathbb{R} $, where $ A\subset\mathbb{R} $. Then $ f $ is continuous at a point $ c\in A $ if for every $ \varepsilon>0 $, there is a $ \displaystyle\delta>0 $ such that $ |f(x)-f(c)|<\varepsilon $ for any $ x $ that satisfies $ \displaystyle|x-c|<\delta $. $ f $ is said to be continuous if it is continuous at every point of $ A $.
In our language above, $ \varepsilon $ is the error bound, and $ \displaystyle\delta $ is our measure of "close enough (to $ c $)." Note that continuity is defined only for points in a function's domain. So, the function $ k(x)=1/x $ is technically continuous because $ 0 $ is not in the domain of $ k $. If, however, we defined $ k(0)=0 $, then $ k $ would no longer be continuous.
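To see why the extended function fails to be continuous at $ 0 $, note that $ 1/x $ blows up near $ 0 $. A quick sketch (illustrative, assuming the extension $ k(0)=0 $ described above):

```python
# Extend k(x) = 1/x by setting k(0) = 0.  Continuity at 0 then fails:
# take eps = 1; for any delta, the point x = min(delta/2, 0.5) satisfies
# |x - 0| < delta, but |k(x) - k(0)| = 1/x >= 2 > eps.
def k(x):
    return 0.0 if x == 0 else 1 / x

for delta in (1.0, 1e-3, 1e-9):
    x = min(delta / 2, 0.5)
    print(abs(k(x) - k(0)) >= 1)  # True: k is far from k(0) arbitrarily close to 0
```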
The Limit of a Function at a Point
Now, let's consider the two functions $ F $ and $ G $ below. Note that $ F $ is left undefined at $ 0 $.
- $ F(x)=\begin{cases}-x^2-x &\text{if}~x<0\\ x&\text{if}~x>0\end{cases} $
- $ G(x)=\begin{cases}0 &\text{if}~x\neq 0\\ \frac{1}{2}&\text{if}~x=0\end{cases} $
Recall the function $ f $ from the previous section. We found that it was continuous at $ 0 $ because $ f(x) $ is close to $ f(0) $ if $ x $ is close enough to $ 0 $. We can do something similar with $ F $ and $ G $ here. From the graph, we can see that $ F(x) $ is close to $ 0 $ whenever $ x $ is close enough, but not equal to, $ 0 $. Similarly, we see that $ G(x) $ is close to $ 0 $ whenever $ x $ is close enough, but not equal to, $ 0 $. The "not equal" part is important for both $ F $ and $ G $ because $ F $ is undefined at $ x=0 $ while $ G $ has a discontinuity there. The idea here is similar to that of continuity, but we ignore whatever happens at $ x=0 $. We are concerned more with how $ F $ and $ G $ behave near $ 0 $ rather than at $ 0 $. This leads to the following definition.
DEFINITION. Let $ f $ be a function defined for all real numbers, with possibly one exception, and with range $ \mathbb{R} $. Let $ c $ be any real number. We say that the limit of $ f $ at $ c $ is $ b $ and write $ \lim_{x\to c}f(x)=b $ if for every $ \varepsilon>0 $, there is a $ \displaystyle\delta>0 $ such that $ |f(x)-b|<\varepsilon $ whenever $ \displaystyle 0<|x-c|<\delta $.
This is the same as the definition for continuity, except we ignore what happens at $ c $. We can see this in two places in the above definition. The first is the use of $ b $ instead of $ f(c) $, and the second is the condition $ \displaystyle 0<|x-c|<\delta $, which says that $ x $ is close enough, but not equal to, $ c $. This definition allows us to restate the definition of continuity at a point more succinctly: if $ c $ is in the domain of $ f $, then $ f $ is continuous at $ c $ if and only if $ \displaystyle\lim_{x\to c}f(x)=f(c) $.
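The definition can be exercised numerically on $ F $ (a sampling sanity check, not a proof; the function names are mine). For $ \varepsilon>0 $, the choice $ \delta=\min(\varepsilon/2,\tfrac{1}{2}) $ works: for $ 0<|x|<\delta $ we have $ |F(x)|\leq|x|+|x|^2<2\delta\leq\varepsilon $. Note that the condition $ 0<|x| $ means $ x=0 $, where $ F $ is undefined, is never tested.

```python
# Spot-check (not a proof) that lim_{x->0} F(x) = 0 using the definition:
# with delta = min(eps/2, 0.5), every sampled x with 0 < |x| < delta
# gives |F(x) - 0| < eps.  F is undefined at 0, so x = 0 is never sampled.
def F(x):
    assert x != 0, "F is undefined at 0"
    return -x**2 - x if x < 0 else x

def limit_check(eps, n_samples=1000):
    delta = min(eps / 2, 0.5)
    xs = [delta * k / (n_samples + 1) for k in range(1, n_samples + 1)]
    xs += [-x for x in xs]  # sample both sides of 0, never 0 itself
    return all(abs(F(x)) < eps for x in xs)

print(limit_check(0.25))  # True
print(limit_check(1e-4))  # True
```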
Questions and comments
If you have any questions, comments, etc., please, please, please post them below:
- Comment / question 1
- Comment / question 2
The Spring 2014 Math Squad was supported by an anonymous gift to Project Rhea. If you enjoyed reading these tutorials, please help Rhea "help students learn" with a donation to this project. Your contribution is greatly appreciated.