

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2015


Solution 1

$ E(S_n)=E\left(\frac{1}{n}\sum_{i=1}^n X_i\right) =\frac{1}{n}\sum_{i=1}^n E(X_i)=0 $

$ E(X_i-S_n)=E\left(X_i-\frac{1}{n}\sum_{k=1}^n X_k\right) =E(X_i)-E\left(\frac{1}{n}\sum_{k=1}^n X_k\right)=0 $

$ E((X_i-S_n)S_n)=E(X_iS_n-S_n^2) $

Since the $ X_i $ are independent with zero mean, for any $ i,j\in \{1,2,...,n\} $ with $ i\neq j $ we have $ E(X_iX_j) = E(X_i)E(X_j)=0 $, while $ E(X_i^2)=\sigma^2 $. Therefore

$ E(X_iS_n-S_n^2) = E(X_iS_n)-E(S_n^2)\\ =\frac{1}{n}\sum_{k=1}^n E(X_iX_k) - \frac{1}{n^2}\sum_{j=1}^n\sum_{k=1}^n E(X_jX_k)\\ =\frac{\sigma^2}{n} - \frac{n\sigma^2}{n^2} \\ =0 $

Thus $ E((X_i-S_n)S_n)=0=E(X_i-S_n)E(S_n) $, so $ S_n $ and $ X_i-S_n $ are uncorrelated.
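
This result can be spot-checked numerically. The following sketch (not part of the original solution; the Gaussian distribution, the values of $ n $, the index $ i $, and the trial count are arbitrary choices) estimates the covariance between $ S_n $ and $ X_i-S_n $ by Monte Carlo, and the printed value should be close to zero.

import numpy as np

# Monte Carlo check: S_n and X_i - S_n should be uncorrelated for
# i.i.d. zero-mean X_i.  Distribution, n, i, and trial count are
# arbitrary choices made for this illustration.
rng = np.random.default_rng(0)
n, trials, i = 10, 200_000, 3               # i indexes the fixed coordinate
X = rng.normal(loc=0.0, scale=1.0, size=(trials, n))   # zero mean, variance 1

S_n = X.mean(axis=1)                        # sample mean of each trial
D_i = X[:, i] - S_n                         # deviation of the i-th coordinate

# Sample covariance should be near 0 (it is exactly 0 in expectation).
print("cov(S_n, X_i - S_n) ≈", np.cov(S_n, D_i)[0, 1])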

Solution 2

$ S_n=\frac{1}{n}\sum_{j=1}^{n}X_j $. Note: in the problem statement the factor should be $ \frac{1}{n} $, because $ S_n $ is the sample mean.

$ E[S_n]=E\left[\frac{1}{n}\sum_{j=1}^{n}X_j\right] = \frac{1}{n}\sum_{j=1}^{n}E[X_j] = \frac{1}{n}\sum_{j=1}^{n} \mu = 0\\ E[(X_i-\mu)^2]=E[X_i^2]=\sigma^2 $

For $ i\neq j $, independence gives
$ E[X_iX_j]=\int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}x_ix_j f_{X_iX_j}(x_i,x_j)\,dx_i\,dx_j = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty}x_ix_j f_{X_i}(x_i)f_{X_j}(x_j)\,dx_i\,dx_j = E[X_i]E[X_j]=0 $
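
Solution 2 rests on the factorization $ E[X_iX_j]=E[X_i]E[X_j] $ for independent variables, together with $ E[X_i^2]=\sigma^2 $. As a quick sanity check (a sketch, not from the original solution; the choice $ \sigma=2 $ and the sample size are arbitrary), the snippet below estimates both moments from independent zero-mean samples.

import numpy as np

# Numerical check: for independent zero-mean variables,
# E[X_i X_j] should be ~0 for i != j and E[X_i^2] should be ~sigma^2.
rng = np.random.default_rng(1)
sigma, trials = 2.0, 500_000
Xi = rng.normal(0.0, sigma, size=trials)
Xj = rng.normal(0.0, sigma, size=trials)    # drawn independently of Xi

print("E[X_i X_j] ≈", np.mean(Xi * Xj))     # expect ≈ 0 (independence)
print("E[X_i^2]   ≈", np.mean(Xi ** 2))     # expect ≈ sigma^2 = 4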


Back to QE CS question 1, August 2015

Back to ECE Qualifying Exams (QE) page
