
Revision as of 15:35, 21 October 2008

WARNING!

This page has some serious errors that I need to fix before it can be consulted as anything. However, I don't have time to fix them right now and I don't want to delete everything (even though I could just use the history and restore it at any time...). Just don't look to this for anything helpful, yet.

End Warning

(Yes, I made up the word Multimeans)

It was pointed out to me by Professor Bell that if averaging two consecutive sums significantly increases the accuracy of the sum, then wouldn't averaging two consecutive averages (built from three consecutive sums) be even more accurate? And then we could add in another layer of averages. In fact, from the sums S(1) through S(N) we could build N-1 layers of averages (which would mean computing N(N-1)/2 averages along the way). This makes sense, since the errors of two consecutive averages have opposite signs: an average always favors its less accurate side (in the case of the average of S(N) and S(N+1), the error of the average has the same sign as the error of S(N), the less accurate of the two). Therefore, with just a very few sums and some averages (which are much easier to compute than the sums themselves), we could get very accurate very fast, no? I've decided to show it works.

For the sake of space and organization let $ S(N)=\int_0^1S_N(x)dx=\frac{\pi}{4}+E(N) $

We will denote the averaging functions with subscripts, where:

$ A_0(N)=\frac{S(N)+S(N+1)}{2} $

$ A_1(N)=\frac{A_0(N)+A_0(N+1)}{2} $

$ A_K(N)=\frac{A_{K-1}(N)+A_{K-1}(N+1)}{2} $ Where $ 1\le K $
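To make the scheme concrete, here is a short Python sketch (my addition, not part of the original derivation). It assumes $ S(N)=\sum_{k=0}^{N}\frac{(-1)^k}{2k+1} $, which is consistent with the integral definition above and with the error bound $ |E(N)|\le\frac{1}{2(N+1)+1} $ used below:

```python
from math import pi

def S(n):
    """Partial sum S(N) = sum_{k=0}^{N} (-1)^k / (2k+1).
    Assumed form of the document's S(N); it converges to pi/4."""
    return sum((-1) ** k / (2 * k + 1) for k in range(n + 1))

def A(k, n):
    """Layered average A_K(N): A_0 averages two consecutive sums,
    and A_K averages two consecutive A_{K-1} values."""
    if k == 0:
        return (S(n) + S(n + 1)) / 2
    return (A(k - 1, n) + A(k - 1, n + 1)) / 2
```

For instance, using only the sums up through S(14), A(3, 10) already lands well within 10^-5 of pi/4, while S(10) itself is still off by roughly 0.02.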

Okay, now to make this work I'm gonna have to clarify certain things. We're concerned not with finding the actual averages (that can be done later), but with the errors of these averages. As I showed in ChallengeProblem_MA181Fall2008bell, when one of the two quantities being averaged errs too high and the other errs too low, the error of the average is the average of the errors, whose magnitude is the difference of the absolute values of the errors divided by two. Because if:

$ X<0 $

$ Y>0 $

$ \frac{X + Y}{2}=\frac{|Y|-|X|}{2} $

So remember from the past proof that

$ A_0(N)=\frac{S(N)+S(N+1)}{2}=\frac{\pi}{4}+\frac{E(N)+E(N+1)}{2} $

Where $ \frac{E(N)+E(N+1)}{2} $ is our error for $ A_0(N) $, and this error turned out to satisfy: $ |ERROR|\le\frac{1}{4N^2+16N+15} $ (note that $ 4N^2+16N+15=(2N+3)(2N+5) $)
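A quick numerical spot-check of this bound (my own sketch, with the same assumed form of S(N) as the Nth Leibniz partial sum):

```python
from math import pi

def S(n):
    # Assumed form of S(N): the Nth Leibniz partial sum, converging to pi/4
    return sum((-1) ** k / (2 * k + 1) for k in range(n + 1))

# Verify |A_0(N) - pi/4| <= 1/(4N^2 + 16N + 15) for a range of N
for n in range(1, 101):
    a0 = (S(n) + S(n + 1)) / 2
    assert abs(a0 - pi / 4) <= 1 / (4 * n * n + 16 * n + 15)
print("bound holds for N = 1..100")
```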

Now we want to find the error of taking the average of two consecutive averages.

$ A_1(N)=\frac{A_0(N)+A_0(N+1)}{2} $

We know what $ A_0(N) $ is, but what is $ A_0(N+1) $?

$ A_0(N+1)=\frac{S(N+1)+S(N+2)}{2}=\frac{2\frac{\pi}{4}+E(N+1)+E(N+2)}{2}=\frac{\pi}{4}+\frac{E(N+1)+E(N+2)}{2} $

so

$ A_1(N)=\frac{\frac{\pi}{4}+\frac{E(N)+E(N+1)}{2}+\frac{\pi}{4}+\frac{E(N+1)+E(N+2)}{2}}{2}=\frac{\pi}{4}+\frac{\frac{E(N+1)+E(N+2)}{2}+\frac{E(N)+E(N+1)}{2}}{2} $

Once again, the error is the average of the errors of the two previous averages.

Okay, we already know that

$ |E(N)|\le\frac{1}{2(N+1)+1} $

It immediately follows then that

$ |E(N+1)|\le\frac{1}{2(N+2)+1} $

$ |E(N+2)|\le\frac{1}{2(N+3)+1} $

So we can find that

$ |\frac{E(N)+E(N+1)}{2}|=\frac{|E(N)|-|E(N+1)|}{2}\le|\frac{\frac{1}{2(N+1)+1}-\frac{1}{2(N+2)+1}}{2}|=\frac{1}{(2N+3)(2N+5)} $

and

$ |\frac{E(N+1)+E(N+2)}{2}|=\frac{|E(N+1)|-|E(N+2)|}{2}\le|\frac{\frac{1}{2(N+2)+1}-\frac{1}{2(N+3)+1}}{2}|=\frac{1}{(2N+5)(2N+7)} $

So now the error of $ A_1(N) $ satisfies:

$ \left|\frac{\frac{E(N+1)+E(N+2)}{2}+\frac{E(N)+E(N+1)}{2}}{2}\right|\le\frac{\frac{1}{(2N+3)(2N+5)}-\frac{1}{(2N+5)(2N+7)}}{2}=\frac{\frac{(2N+7)-(2N+3)}{(2N+3)(2N+5)(2N+7)}}{2}=\frac{2}{(2N+3)(2N+5)(2N+7)} $
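The telescoping arithmetic in this last step can be checked in exact rational arithmetic; here is a quick sketch (my addition) using Python's fractions module:

```python
from fractions import Fraction

# Check [1/((2N+3)(2N+5)) - 1/((2N+5)(2N+7))]/2 == 2/((2N+3)(2N+5)(2N+7))
# exactly, for a range of N
for n in range(1, 50):
    a, b, c = 2 * n + 3, 2 * n + 5, 2 * n + 7
    lhs = (Fraction(1, a * b) - Fraction(1, b * c)) / 2
    assert lhs == Fraction(2, a * b * c)
```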

Which now has third order on the bottom and zeroth order on the top! Excellent! So now the error shrinks even faster. If we really generalize, taking averages of consecutive terms after N, then averages of those averages, and so on, we get this formula:

$ |\text{Error}(A_K(N))|\le\frac{(K+1)!}{(2N+3)(2N+5)\cdots(2N+2(K+2)+1)} $ where $ 1\le K $

Which means that we can get really accurate estimates with very few sums.
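For what it's worth, iterating the halving step suggests a numerator of $ (K+1)! $ (2, 6, 24, ...): at each new layer the outer terms of the two products differ by $ 2(K+1) $, so the previous numerator gets multiplied by $ \frac{2(K+1)}{2}=K+1 $. The sketch below (my addition, with the same assumed Leibniz form of S(N)) spot-checks that version of the bound numerically:

```python
from math import pi, factorial

def S(n):
    # Assumed form of S(N): the Nth Leibniz partial sum, converging to pi/4
    return sum((-1) ** k / (2 * k + 1) for k in range(n + 1))

def A(k, n):
    # Layered averages: A_0 averages consecutive sums, A_K averages A_{K-1}'s
    if k == 0:
        return (S(n) + S(n + 1)) / 2
    return (A(k - 1, n) + A(k - 1, n + 1)) / 2

def bound(k, n):
    # Candidate bound: (K+1)! / [(2N+3)(2N+5)...(2N+2(K+2)+1)]
    denom = 1
    for j in range(1, k + 3):
        denom *= 2 * n + 2 * j + 1
    return factorial(k + 1) / denom

# Spot-check: the actual error stays under the candidate bound
for k in range(1, 5):
    for n in range(1, 26):
        assert abs(A(k, n) - pi / 4) <= bound(k, n)
print("candidate bound holds for K = 1..4, N = 1..25")
```

This is only an empirical check over a finite range, not a proof; the formula still deserves a proper induction.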

NOTE: I still have to test my error estimate formula at the end; I'm not entirely sure whether it's correct or not, and I don't have time right now. Anyone else, feel welcome to jump in at this time and add/subtract stuff.
