We will start from the beginning with the finite geometric series:

$ 1+r+r^2+r^3+...+r^N=\frac{1}{1-r}-\frac{r^{N+1}}{1-r} $

From here we substitute $ r=-x^2 $ to get

$ 1-x^2+x^4-x^6+...+(-1)^Nx^{2N}=\frac{1}{1+x^2}-\frac{(-1)^{N+1}x^{2(N+1)}}{1+x^2} $

Now integrate both sides from 0 to 1 with respect to x.

$ \int_0^1\left(1-x^2+x^4-x^6+...+(-1)^Nx^{2N}\right)dx=\int_0^1\frac{1}{1+x^2}dx-\int_0^1\frac{(-1)^{N+1}x^{2(N+1)}}{1+x^2}dx $

$ 1-1/3+1/5-1/7+...+\frac{(-1)^N}{2N+1}=\frac{\pi}{4}+\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx $
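As a quick sanity check (a worked case added here for illustration, not part of the original derivation), take N = 1. The identity then reads

$ 1-\frac{1}{3}=\frac{\pi}{4}-\int_0^1\frac{x^4}{1+x^2}dx $

and since $ \int_0^1\frac{x^4}{1+x^2}dx=\int_0^1\left(x^2-1+\frac{1}{1+x^2}\right)dx=\frac{1}{3}-1+\frac{\pi}{4} $, both sides equal $ \frac{2}{3} $.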

Notice that $ \lim_{N\to\infty}\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx=0 $ so the more terms we use on the left side of the equation the closer we get to $ \frac{\pi}{4} $. This integral can therefore be called the error function.

For the sake of space we will give labels for several parts.

$ S(N)=1-1/3+1/5-1/7+...+\frac{(-1)^N}{2N+1} $

$ E(N)=\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx $

$ S(N)=\frac{\pi}{4}+E(N) $

We can call S(N) the sum and E(N) the error, since as you add more and more terms to the sum, the error becomes closer and closer to zero.
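The rest of this page leans on these two objects, so here is a minimal numerical sketch of them (Python; the function names and the crude midpoint-rule integrator are our own illustration, not part of the argument). It checks that the partial sum S(N) really is $ \frac{\pi}{4} $ plus the remainder integral E(N):

from math import pi

def S(N):
    # partial sum 1 - 1/3 + 1/5 - ... + (-1)^N / (2N + 1)
    return sum((-1) ** k / (2 * k + 1) for k in range(N + 1))

def E(N, steps=200000):
    # crude midpoint-rule approximation of
    # E(N) = integral_0^1 (-1)^N * x^(2(N+1)) / (1 + x^2) dx
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        total += (-1) ** N * x ** (2 * (N + 1)) / (1 + x * x)
    return total * h

for N in (3, 10, 50):
    print(N, S(N), pi / 4 + E(N))   # the two columns agree to high precision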

Just how quickly does this sum approach $ \frac{\pi}{4} $? To find out, we can estimate the error for any N by the following comparison:

for $ 0\le x\le 1 $

$ 0\le x^2\le 1 $

$ 1\le 1+x^2 \le 2 $

$ \frac{1}{2}\le \frac{1}{1+x^2}\le 1 $

so

$ \frac{x^{2(N+1)}}{1+x^2}\le x^{2(N+1)} $

And finally

$ |E(N)|=|\int_0^1\frac{(-1)^{N}x^{2(N+1)}}{1+x^2}dx|=\int_0^1\frac{x^{2(N+1)}}{1+x^2}dx\le\int_0^1x^{2(N+1)}dx $

Integrating the estimate gives us:

$ |E(N)|\le\frac{1}{2(N+1)+1}\le\frac{1}{2N} $

Wow, the sum of N terms can really only be guaranteed to be within $ \frac{1}{2N} $ of $ \frac{\pi}{4} $. That's a lot of terms we have to take to guarantee even two digits of accuracy. And it's even worse when we multiply everything by four to estimate pi.
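To get a feel for how slow this is, here is a small sketch (Python, illustrative names only) comparing the actual error $ |S(N)-\frac{\pi}{4}| $ with the $ \frac{1}{2N} $ guarantee, and finding how large N must be before that guarantee alone promises two decimal places (roughly, an error below 0.005):

from math import pi

def S(N):
    return sum((-1) ** k / (2 * k + 1) for k in range(N + 1))

for N in (10, 100, 1000):
    print(N, abs(S(N) - pi / 4), 1 / (2 * N))   # actual error vs. guaranteed bound

N = 1
while 1 / (2 * N) >= 0.005:   # two decimal places taken as an error below 0.005
    N += 1
print("the 1/(2N) guarantee reaches two decimals only at N =", N)   # prints 101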

We need a faster way. There are lots of faster ways. But let's say we're too lazy to find another method. We just want to work with this series. What can we do to make it converge to $ \frac{\pi}{4} $ faster?

First, let's notice that the sum alternates between being too great and too little: $ \frac{\pi}{4} $ is always between two consecutive sums. Dr. Bell noticed this and posed the challenge to exploit this property in order to build a new estimate that converges more quickly. He said we should try taking the average of two consecutive sums. Since the target lies between the two sums, their average should yield a new estimate that is closer than either one. The challenge was to prove this true.

So we notice that:

$ \text{For N odd, }E(N)<0 $

$ \text{For N even, }E(N)>0 $
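A quick numerical check of this sign pattern (a sketch; E(N) is computed here through the identity $ E(N)=S(N)-\frac{\pi}{4} $ from above):

from math import pi

def S(N):
    return sum((-1) ** k / (2 * k + 1) for k in range(N + 1))

for N in range(6):
    E = S(N) - pi / 4
    print(N, "even" if N % 2 == 0 else "odd", "E(N) > 0" if E > 0 else "E(N) < 0")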

Now we add two consecutive sums together and divide by two to get the average:

$ \frac{S(N) + S(N+1)}{2} = \frac{2\frac{\pi}{4}+E(N)+E(N+1)}{2}=\frac{\pi}{4}+\frac{E(N)+E(N+1)}{2} $

So now we have a new error term, and it still goes to zero:

$ \lim_{N\to\infty}\frac{E(N)+E(N+1)}{2}=\lim_{N\to\infty}\frac{E(N)}{2}+\lim_{N\to\infty}\frac{E(N+1)}{2}=0+0=0 $

And so the average still goes to $ \frac{\pi}{4} $ as N goes to infinity because:

$ \lim_{N\to\infty}\frac{S(N) + S(N+1)}{2}=\lim_{N\to\infty}\frac{\pi}{4}+\frac{E(N)+E(N+1)}{2}=\frac{\pi}{4} + 0 = \frac{\pi}{4} $

Okay, so the new averaged estimate still approaches $ \frac{\pi}{4} $, but does it approach it more quickly? Again, we can use error estimation to predict how fast it converges.

$ |E(N)|\le\frac{1}{2(N+1)+1}\le\frac{1}{2N} $

$ |E(N+1)|\le\frac{1}{2(N+2)+1}\le\frac{1}{2N+2} $

Also remember that exactly one of E(N) and E(N+1) is negative and that $ |E(N)|>|E(N+1)| $ for all N. So if we sum them, the absolute value of the sum will be the difference of their absolute values.

$ |E(N)+E(N+1)|=|E(N)|-|E(N+1)|\le\frac{1}{2N}-\frac{1}{2N+2}=\frac{2}{(2N)(2N+2)} $

(The middle inequality does not come from simply subtracting the two estimates above, since subtracting upper bounds does not bound a difference. It comes from combining the two integrals into one: $ |E(N)+E(N+1)|=\int_0^1\frac{x^{2(N+1)}(1-x^2)}{1+x^2}dx\le\int_0^1x^{2(N+1)}(1-x^2)dx=\frac{1}{2N+3}-\frac{1}{2N+5}\le\frac{1}{2N}-\frac{1}{2N+2} $.)

Finally, the new error function is this sum divided by two.

$ |E'(N)|=\frac{|E(N)+E(N+1)|}{2}\le\frac{\frac{2}{(2N)(2N+2)}}{2}=\frac{1}{(2N)(2N+2)} $

Wow! By taking the average of two consecutive sums we can guarantee that the new error will be less than $ \frac{1}{(2N)^2} $.
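Here is a sketch (Python, names our own) comparing the old error, the new averaged error, and the $ \frac{1}{(2N)(2N+2)} $ guarantee:

from math import pi

def S(N):
    return sum((-1) ** k / (2 * k + 1) for k in range(N + 1))

for N in (10, 100, 1000):
    old_err = abs(S(N) - pi / 4)                    # |E(N)|
    new_err = abs((S(N) + S(N + 1)) / 2 - pi / 4)   # |E'(N)|
    bound = 1 / ((2 * N) * (2 * N + 2))
    print(N, old_err, new_err, bound, new_err <= bound)   # last column prints True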

So taking the mean of two consecutive sums is in fact much more accurate than using the plain partial sums, especially for large N. Oh, but what if we were to take the mean of two consecutive means? Does the same principle apply? Let's review.

The first average worked because, for any two consecutive sums, one would be too high and one too low. Furthermore, the absolute value of the second error would be less than the absolute value of the first. Because of these two important characteristics, when we take the average of two consecutive sums, the magnitude of the sum of their errors is the difference of their absolute values. Upon further investigation, we see that since the absolute value of the first error is greater than the absolute value of the second, the new error of the average has the same sign as the first error. Or, in other words:

$ \text{For N even, }E'(N)>0 $

$ \text{For N odd, }E'(N)<0 $

And writing the new error as a single integral, $ |E'(N)|=\frac{1}{2}\int_0^1\frac{x^{2(N+1)}(1-x^2)}{1+x^2}dx $, whose integrand shrinks as N grows, we see that $ |E'(N)|>|E'(N+1)| $.
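These two properties can also be checked numerically with a short sketch (again using $ E'(N)=\frac{S(N)+S(N+1)}{2}-\frac{\pi}{4} $; the names are ours):

from math import pi

def S(N):
    return sum((-1) ** k / (2 * k + 1) for k in range(N + 1))

def E2(N):
    # E'(N), the error of the averaged estimate
    return (S(N) + S(N + 1)) / 2 - pi / 4

for N in range(6):
    print(N, E2(N) > 0, abs(E2(N)) > abs(E2(N + 1)))   # sign alternates with N, magnitude shrinks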

Hooray! The same characteristics that allowed us to take averages of the original series and find a new estimate that was more accurate than both of the previous estimates are now present in this new method. So let's try to take the average of two consecutive averages from the beginning and see what happens.

$ \frac{\frac{S(N)+S(N+1)}{2}+\frac{S(N+1)+S(N+2)}{2}}{2}=\frac{S(N)+2S(N+1)+S(N+2)}{4}=\frac{\pi+E(N)+2E(N+1)+E(N+2)}{4}=\frac{\pi}{4}+\frac{E(N)+2E(N+1)+E(N+2)}{4} $
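And a last sketch to see numerically what this average of averages buys us (it just evaluates the weighted combination above; the sample values of N are illustrative):

from math import pi

def S(N):
    return sum((-1) ** k / (2 * k + 1) for k in range(N + 1))

for N in (10, 100):
    plain = abs(S(N) - pi / 4)
    one_avg = abs((S(N) + S(N + 1)) / 2 - pi / 4)
    two_avg = abs((S(N) + 2 * S(N + 1) + S(N + 2)) / 4 - pi / 4)
    print(N, plain, one_avg, two_avg)   # each column is smaller than the one before it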
