

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2015


Solution 1

$ E(S_n)=E\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) =\frac{1}{n}\sum_{i=1}^{n} E(X_i)=0 $

$ E(X_i-S_n)=E\left(X_i-\frac{1}{n}\sum_{k=1}^{n} X_k\right) =E(X_i)-E\left(\frac{1}{n}\sum_{k=1}^{n} X_k\right)=0 $

$ E((X_i-S_n)S_n)=E(X_iS_n-S_n^2) $

Since, for any $ i,j\in \{1,2,\ldots,n\} $ with $ i\neq j $, independence gives $ E(X_i\cdot X_j) = E(X_i)E(X_j)=0 $, while $ E(X_i^2)=\sigma^2 $, we have

$ E(X_iS_n-S_n^2) = E(X_iS_n)-E(S_n^2)\\ =E\left(\frac{1}{n}\sum_{k=1}^{n}X_iX_k\right) - E\left(\frac{1}{n^2}\sum_{j=1}^{n}\sum_{k=1}^{n}X_jX_k\right)\\ =\frac{1}{n}\sum_{k=1}^{n}E(X_iX_k) - \frac{1}{n^2}\sum_{j=1}^{n}\sum_{k=1}^{n}E(X_jX_k)\\ =\frac{\sigma^2}{n}-\frac{n\sigma^2}{n^2} =0 $

Thus $ E((X_i-S_n)S_n)=0=E(X_i-S_n)E(S_n) $, so $ S_n $ and $ X_i-S_n $ are uncorrelated.
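As a quick numerical sanity check (not part of the original solution), the short Python sketch below estimates $ cov(S_n, X_i-S_n) $ by Monte Carlo; the choice of distribution (standard normal, so $ \mu=0 $ and $ \sigma^2=1 $), of $ n $, $ i $, and of the number of trials is arbitrary and only for illustration. The estimate should be close to the theoretical value $ 0 $.

<pre>
import numpy as np

# Monte Carlo sanity check that S_n and X_i - S_n are uncorrelated.
# Illustrative assumptions: X_1, ..., X_n i.i.d. standard normal (mu = 0, sigma^2 = 1);
# n, i, and the number of trials are arbitrary choices.
rng = np.random.default_rng(0)
n, i, trials = 5, 2, 200_000          # note: i is used as a 0-based column index below

X = rng.standard_normal((trials, n))  # each row is one realization of X_1, ..., X_n
S_n = X.mean(axis=1)                  # sample mean S_n of each realization
D = X[:, i] - S_n                     # X_i - S_n for each realization

# Off-diagonal entry of the 2x2 sample covariance matrix estimates cov(S_n, X_i - S_n);
# it should be near 0, matching the derivation above.
print(np.cov(S_n, D)[0, 1])
</pre>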

Solution 2

$ S_n=\frac{1}{n}\sum_{j=1}^{n}X_j $. Note: in the problem statement, it should be $ \frac{1}{n} $, because $ S_n $ is the sample mean.

$ E[S_n]=E\left[\frac{1}{n}\sum_{j=1}^{n}X_j\right] = \frac{1}{n}\sum_{j=1}^{n}E[X_j] = \frac{1}{n}\sum_{j=1}^{n} \mu = 0\\ E[(X_i-\mu)^2]=E[X_i^2]=\sigma^2 $

For $ i\neq j $, independence gives $ E[X_iX_j]=\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}x_ix_jf_{X_iX_j}(x_i,x_j)\,dx_i\,dx_j=\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}x_if_{X_i}(x_i)\,x_jf_{X_j}(x_j)\,dx_i\,dx_j=E[X_i]E[X_j]=\mu\cdot\mu=0 $

$ E[X_i-S_n]=E[X_i]-E[S_n]=0-0=0 $

$ E[X_i\cdot S_n]=E\left[\frac{1}{n}\sum_{j=1}^{n}X_j\cdot X_i\right]=\frac{1}{n}\sum_{j=1}^{n}E[X_j\cdot X_i]=\frac{1}{n}\cdot \sigma^2, $ since only the $ j=i $ term is nonzero.

$ E[S_n^2]=E\left[\frac{1}{n^2}\sum_{j=1}^{n}\sum_{i=1}^{n}X_j\cdot X_i\right]=\frac{1}{n^2}\sum_{j=1}^{n}E[X_j^2]+\frac{1}{n^2}\sum_{j=1}^{n}\sum_{i\neq j}E[X_i\cdot X_j]=\frac{1}{n^2}\cdot (n\cdot \sigma^2) + \frac{1}{n^2}\cdot 0 = \frac{\sigma^2}{n} $
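As a concrete check of this diagonal/off-diagonal split (added here only for illustration), take $ n=2 $:

$ E[S_2^2]=\frac{1}{4}\left(E[X_1^2]+E[X_2^2]+2E[X_1X_2]\right)=\frac{1}{4}\left(\sigma^2+\sigma^2+0\right)=\frac{\sigma^2}{2} $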

Therefore,

$ E[(S_n-0)(X_i-S_n-0)]=E[S_nX_i-S_n^2]= E[S_nX_i]-E[S_n^2]=\frac{\sigma^2}{n}-\frac{\sigma^2}{n}=0 $

So $ r = \frac{cov(S_n,X_i-S_n)}{\sigma_{S_n}\sigma_{X_i-S_n}} = \frac{E[(S_n-0)(X_i-S_n-0)]}{\sigma_{S_n}\sigma_{X_i-S_n}} = 0 $, and $ S_n $ and $ X_i-S_n $ are uncorrelated.
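For completeness (this step is not spelled out above), the denominator is nonzero whenever $ n>1 $, so $ r $ is well defined:

$ \sigma_{X_i-S_n}^2=E[(X_i-S_n)^2]=E[X_i^2]-2E[X_iS_n]+E[S_n^2]=\sigma^2-\frac{2\sigma^2}{n}+\frac{\sigma^2}{n}=\frac{(n-1)\sigma^2}{n}>0, $ while $ \sigma_{S_n}^2=E[S_n^2]=\frac{\sigma^2}{n}>0 $.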


