

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2015


Solution 1

$ E(S_n)=E\left(\frac{1}{n}\sum_{i=1}^n X_i\right) =\frac{1}{n}\sum_{i=1}^n E(X_i)=0 $

$ E(X_i-S_n)=E\left(X_i-\frac{1}{n}\sum_{k=1}^n X_k\right) =E(X_i)-E\left(\frac{1}{n}\sum_{k=1}^n X_k\right)=0 $

$ E((X_i-S_n)S_n)=E(X_iS_n-S_n^2) $

As the $ X_i $ are independent and zero-mean, for any $ i,j\in \{1,2,...,n\} $ with $ i\neq j $ we have $ E(X_i\cdot X_j) = E(X_i)E(X_j)=0 $, while $ E(X_i^2)=\sigma^2 $.

$ E(X_iS_n-S_n^2) = E(X_iS_n)-E(S_n^2)\\ =E\left(\frac{1}{n}\sum_{k=1}^{n}X_iX_k\right) - E\left(\frac{1}{n^2}\sum_{j=1}^{n}\sum_{k=1}^{n}X_jX_k\right)\\ =\frac{1}{n}\sum_{k=1}^{n}E(X_iX_k) - \frac{1}{n^2}\sum_{j=1}^{n}\sum_{k=1}^{n}E(X_jX_k)\\ =\frac{\sigma^2}{n}-\frac{n\sigma^2}{n^2} =0 $

Thus $ E((X_i-S_n)S_n)=E(X_i-S_n)E(S_n) $, so $ S_n $ and $ X_i-S_n $ are uncorrelated.
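
As a quick numerical sanity check (not part of the original solution), the sketch below estimates this covariance by simulation; NumPy and the particular choices of $ n $, $ i $, and trial count are assumptions made purely for illustration.

<pre>
# Monte Carlo sanity check: the sample covariance of S_n and X_i - S_n
# should be near 0 for i.i.d. zero-mean X_j. n, i, and the trial count
# are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, i, trials = 5, 2, 200_000

X = rng.normal(loc=0.0, scale=1.0, size=(trials, n))  # zero-mean, sigma^2 = 1
S = X.mean(axis=1)                                    # S_n, one value per trial
D = X[:, i] - S                                       # X_i - S_n, per trial

print(np.cov(S, D)[0, 1])  # converges to 0 as the trial count grows
</pre>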

Solution 2

$ S_n=\frac{1}{n}\sum_{j=1}^{n}X_j $. Note: in the problem statement the factor should be $ \frac{1}{n} $, because $ S_n $ is the sample mean.

$ E[S_n]=E\left[\frac{1}{n}\sum_{j=1}^{n}X_j\right] = \frac{1}{n}\sum_{j=1}^{n}E[X_j] = \frac{1}{n}\sum_{j=1}^{n} \mu = 0\\ E[(X_i-\mu)^2]=E[X_i^2]=\sigma^2 $

For $ i\neq j $, independence gives $ E[X_iX_j]=\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}x_ix_jf_{X_iX_j}(x_i,x_j)\,dx_i\,dx_j=\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}x_if_{X_i}(x_i)\,x_jf_{X_j}(x_j)\,dx_i\,dx_j=E[X_i]E[X_j]=\mu\cdot\mu=0 $

$ E[X_i-S_n]=E[X_i]-E[S_n]=0-0=0 $

$ E[X_i\cdot S_n]=E\left[\frac{1}{n}\sum_{j=1}^{n}X_j\cdot X_i\right]=\frac{1}{n}\sum_{j=1}^{n}E[X_j\cdot X_i]=\frac{1}{n}E[X_i^2]=\frac{\sigma^2}{n} $, since only the $ j=i $ term is nonzero.

$ E[S_n^2]=E\left[\frac{1}{n^2}\sum_{j=1}^{n}\sum_{i=1}^{n}X_j\cdot X_i\right]=\frac{1}{n^2}\sum_{j=1}^{n}E[X_j^2]+\frac{1}{n^2}\sum_{j=1}^{n}\sum_{i\neq j}E[X_i\cdot X_j]=\frac{1}{n^2}\cdot (n\cdot \sigma^2) + \frac{1}{n^2}\cdot 0 = \frac{\sigma^2}{n} $

Therefore,

$ E[(S_n-0)(X_i-S_n-0)]=E[S_nX_i-S_n^2]= E[S_nX_i]-E[S_n^2]=\frac{\sigma^2}{n}-\frac{\sigma^2}{n}=0 $

So $ r = \frac{\text{cov}(S_n,X_i-S_n)}{\sigma_{S_n}\sigma_{X_i-S_n}}=0 $

Thus $ S_n $ and $ X_i-S_n $ are uncorrelated.
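
The two moments computed above can also be checked by simulation; the following sketch (again assuming NumPy, with $ n $, $ i $, and $ \sigma $ chosen arbitrarily) estimates $ E[X_iS_n] $ and $ E[S_n^2] $ and compares them with $ \sigma^2/n $.

<pre>
# Estimate E[X_i * S_n] and E[S_n^2] for i.i.d. zero-mean X_j with variance sigma^2.
# n, i, sigma, and the trial count are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
n, i, sigma, trials = 8, 3, 2.0, 500_000

X = rng.normal(0.0, sigma, size=(trials, n))
S = X.mean(axis=1)

print(np.mean(X[:, i] * S))  # ~ sigma^2 / n = 0.5
print(np.mean(S ** 2))       # ~ sigma^2 / n = 0.5
print(sigma ** 2 / n)        # exact value, for comparison
</pre>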

Solution 3

Our goal is to show that $ S_n $ and $ X_i - S_n $ are uncorrelated for all $ i \in \{1, 2, ..., n\} $. If we can show that the covariance between $ S_n $ and $ X_i - S_n $ is equal to 0 for every $ i $, then we will have shown this property. Recalling that

$ \text{cov}(X,Y) = E[XY] - E[X]E[Y], $

we aim to show that

$ E[(S_n)(X_i -S_n)] - E[S_n]E[X_i-S_n] = 0. $

Let us consider the LHS of the above equation. This can be written as

$ E[(S_n)(X_i -S_n)] - (E[S_n]E[X_i]-E[S_n]E[S_n]). $

We are given that $ E[X_i] = 0 $ for all $ i $, and since $ S_n \triangleq \frac{1}{n}\sum^n_{j=1}X_j $, linearity of expectation gives $ E[S_n] = \frac{1}{n}\sum^n_{j=1}E[X_j] = 0 $ as well. Thus the above becomes

$ E[(S_n)(X_i -S_n)] $

or

$ E[S_n X_i] - E[S_n^2]. $

Recalling that the above expression must equal 0, our problem reduces to showing that

$ E[S_n X_i] = E[S_n^2]. $

Let us first examine $ E[S_n X_i] $. This can be rewritten as

$ E[S_n X_i] = E\left[\left(\frac{1}{n}\sum^n_{j = 1}X_j \right)\cdot X_i\right] = \frac{1}{n}E[X_1 X_i + X_2 X_i + ... + X_i X_i + ... + X_n X_i]. $

Since the sequence $ X_1, X_2, ..., X_n $ is i.i.d. and zero-mean, $ E[X_i X_j] = E[X_i]E[X_j] = 0 $ whenever $ i\neq j $. Thus, the above becomes

$ \frac{1}{n}\left(0 + 0 + ... + E[X_i^2] + ... + 0 + 0\right). $

Since $ X_i $ is zero-mean for all $ i $, we know that $ E[X_i^2] = \text{var}(X_i) = \sigma^2 $ for any $ i $, so $ E[S_n X_i] = \frac{\sigma^2}{n} $. Now let us examine $ E[S_n^2] $. This can be rewritten as

$ E[S_n^2] = E\left[\frac{1}{n}\left(X_1 + X_2 + ... + X_n\right)\frac{1}{n}\left(X_1 + X_2 + ... + X_n\right)\right] = \frac{1}{n^2}E\left[\left(X_1 + X_2 + ... + X_n\right)^2\right] $.

Squaring out the expression inside the expectation operator results in an expression with $ n $ square terms and $ \sum^n_{j=1}(j-1) = \frac{n(n-1)}{2} $ distinct cross terms, each appearing twice. The expectations of the cross terms are all zero since the $ X_i $ are independent and zero-mean. Then we have

$ \frac{1}{n^2}E\left[\left(X_1 + X_2 + ... + X_n\right)^2\right] = \frac{1}{n^2}\left(E[X_1^2] + E[X_2^2] + ... + E[X_n^2]\right) = \frac{1}{n^2}(n\sigma^2) = \frac{\sigma^2}{n}. $
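
As a concrete instance of the term count, take $ n = 3 $: $ \left(X_1+X_2+X_3\right)^2 = X_1^2+X_2^2+X_3^2+2X_1X_2+2X_1X_3+2X_2X_3 $, i.e. three square terms and three distinct cross terms, each appearing twice and each with expectation zero.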

Thus we have shown that $ E[S_n X_i] = E[S_n^2] = \frac{\sigma^2}{n} $, and we are done.
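
For readers who want to double-check the algebra, here is a small symbolic sketch (assuming the SymPy library, with $ n = 4 $ chosen arbitrarily) that applies the moment rules used above and confirms $ E[S_n X_i] = E[S_n^2] = \frac{\sigma^2}{n} $ and $ \text{cov}(S_n, X_i - S_n) = 0 $.

<pre>
# Symbolic check of Solution 3 for a fixed n (n = 4, an arbitrary choice).
# Expectation is applied term by term via the i.i.d. zero-mean moment rules:
# E[X_j] = 0, E[X_j^2] = sigma^2, and E[X_j X_k] = 0 for j != k.
import sympy as sp

n = 4
X = sp.symbols(f'X1:{n + 1}')                # X1, X2, X3, X4
sigma2 = sp.Symbol('sigma2', positive=True)  # stands for sigma^2

S = sum(X) / n                               # S_n = (1/n) * sum_j X_j

def E(expr):
    """Expectation of a quadratic polynomial in the X_j under the rules above."""
    expr = sp.expand(expr)
    for j in range(n):
        for k in range(n):
            if j != k:
                expr = expr.subs(X[j] * X[k], 0)  # cross terms vanish
    for j in range(n):
        expr = expr.subs(X[j] ** 2, sigma2)       # square terms give sigma^2
        expr = expr.subs(X[j], 0)                 # remaining linear terms vanish
    return sp.simplify(expr)

i = 0                                        # check X_1; any index works by symmetry
print(E(S * X[i]))                           # sigma2/4, i.e. sigma^2/n
print(E(S ** 2))                             # sigma2/4
print(E(S * (X[i] - S)))                     # 0: S_n and X_i - S_n are uncorrelated
</pre>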


Back to QE CS question 1, August 2015

Back to ECE Qualifying Exams (QE) page
