

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2013



Part 2

Let $ X_1,X_2,... $ be a sequence of jointly Gaussian random variables with covariance

$ Cov(X_i,X_j) = \left\{ \begin{array}{ll} {\sigma}^2, & i=j\\ \rho{\sigma}^2, & |i-j|=1\\ 0, & otherwise \end{array} \right. $

Suppose we take 2 consecutive samples from this sequence to form a vector $ X $, which is then linearly transformed to form a 2-dimensional random vector $ Y=AX $. Find a matrix $ A $ so that the components of $ Y $ are independent random variables. You must justify your answer.


Solution 1

Suppose

$ A=\left(\begin{array}{cc} a & b\\ c & d \end{array} \right) $.

Then the new 2-D random vector can be expressed as

$ Y=\left(\begin{array}{c}Y_1 \\ Y_2\end{array} \right)=A\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}aX_i+bX_j \\ cX_i+dX_j\end{array} \right) $


Therefore,

$ \begin{array}{l}Cov(Y_1,Y_2)=E[(aX_i+bX_j-E(aX_i+bX_j))(cX_i+dX_j-E(cX_i+dX_j))] \\ =E[(a(X_i-E(X_i))+b(X_j-E(X_j)))(c(X_i-E(X_i))+d(X_j-E(X_j)))] \\ =E[ac(X_i-E(X_i))^2+(ad+bc)(X_i-E(X_i))(X_j-E(X_j))+bd(X_j-E(X_j))^2]\\ =(ac)Cov(X_i,X_i)+(ad+bc)Cov(X_i,X_j)+(bd)Cov(X_j,X_j)\\ =ac\sigma^2+(ad+bc)\rho\sigma^2+bd\sigma^2 \end{array} $

where we used $ Cov(X_i,X_j)=\rho\sigma^2 $, since the two samples are consecutive ($ |i-j|=1 $).

Setting the above expression equal to $ 0 $ and choosing $ a=b=d=1 $, we get $ (1+\rho)(1+c)=0 $, i.e. $ c=-1 $ (assuming $ \rho\neq -1 $). Since $ Y=AX $ is a linear transformation of jointly Gaussian random variables, $ Y_1 $ and $ Y_2 $ are jointly Gaussian, so zero covariance implies that they are independent.

Therefore, a solution is

$ A=\left(\begin{array}{cc} 1 & 1\\ -1 & 1 \end{array} \right) $.
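
As a quick numerical sanity check (not part of the original solution), the following Python sketch verifies that this $ A $ yields $ Cov(Y_1,Y_2)=0 $. It assumes NumPy, and the values $ \sigma^2=1 $, $ \rho=0.5 $ are illustrative choices, not given in the problem.

```python
import numpy as np

# Covariance matrix of two consecutive samples X = (X_i, X_{i+1})^T.
# sigma^2 = 1 and rho = 0.5 are illustrative choices (any |rho| < 1 works).
sigma2, rho = 1.0, 0.5
C_X = np.array([[sigma2,       rho * sigma2],
                [rho * sigma2, sigma2      ]])

# Candidate transform from Solution 1.
A = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]])

# Covariance of Y = AX is A C_X A^T; its off-diagonal entry is Cov(Y_1, Y_2).
C_Y = A @ C_X @ A.T
print(C_Y)  # diagonal: [[2(1+rho), 0], [0, 2(1-rho)]] = [[3, 0], [0, 1]]
assert abs(C_Y[0, 1]) < 1e-12
```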



Solution 2

Assume

$ Y=\left(\begin{array}{c}Y_i \\ Y_j\end{array} \right)=A\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}a_{11}X_i+a_{12}X_j \\ a_{21}X_i+a_{22}X_j\end{array} \right) $.

Then

$ \begin{array}{l}E(Y_iY_j)=E[(a_{11}X_i+a_{12}X_j)(a_{21}X_i+a_{22}X_j)]\\ =a_{11}a_{21}\sigma^2+a_{12}a_{22}\sigma^2+(a_{11}a_{22}+a_{12}a_{21})E(X_iX_j) \end{array} $

For $ |i-j|\geq1 $, $ E(X_iX_j)=0 $. Therefore, $ a_{11}a_{21}+a_{12}a_{22}=0 $.

One solution can be

$ A=\left(\begin{array}{cc} 1 & -1\\ 1 & 1 \end{array} \right) $.


Critique of Solution 2:

1. $ E(Y_iY_j)=0 $ is not the condition for the two random variables to be independent.

2. "For $ |i-j|\geq1 $, $ E(X_i,X_j)=0 $" is not supported by the given conditions.


Solution 3

$ Y=\left(\begin{array}{c}Y_1 \\ Y_2\end{array} \right)=AX=\left(\begin{array}{cc} a & b\\ c & d \end{array} \right)\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}aX_i+bX_j \\ cX_i+dX_j\end{array} \right) $

We know that any linear transformation of jointly Gaussian random variables is again jointly Gaussian.

Thus, $ Y_1,Y_2 $ are jointly Gaussian random variables. If they are uncorrelated, then they are also independent.

$ r = \frac{Cov(Y_1,Y_2)}{\sigma_{Y_1}\sigma_{Y_2}} = 0 $

which indicates that

$ E(Y_1Y_2) - E(Y_1)E(Y_2) = 0 $

$ \begin{array}{l}E(Y_1Y_2) = E[(aX_i+bX_j)(cX_i+dX_j)]\\ =ac\,E(X_i^2)+(ad+bc)E(X_iX_j)+bd\,E(X_j^2)\\ =ac({\sigma}^{2}+E(X_i)^2)+(ad+bc)(\rho{\sigma}^{2}+E(X_i)E(X_j))+bd({\sigma}^{2}+E(X_j)^2) \end{array} $

Since $ E(Y_1)E(Y_2)=(aE(X_i)+bE(X_j))(cE(X_i)+dE(X_j))=acE(X_i)^2+(ad+bc)E(X_i)E(X_j)+bdE(X_j)^2 $, the condition $ E(Y_1Y_2)-E(Y_1)E(Y_2)=0 $ reduces to

$ ac\sigma^2+(ad+bc)\rho\sigma^2+bd\sigma^2=0, $

which is the same condition as in Solution 1, so $ a=b=d=1 $, $ c=-1 $ again gives a valid choice of $ A $.
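
A Monte Carlo check of this conclusion (again not part of the original solution): the sketch below draws many pairs of consecutive samples and confirms that the sample correlation of the transformed components is near zero. Zero mean, $ \sigma^2=1 $, and $ \rho=0.5 $ are assumed illustrative values.

```python
import numpy as np

# Draw many pairs of consecutive samples (X_i, X_{i+1}) and check that the
# sample correlation of Y = AX is near zero. Zero mean, sigma^2 = 1, and
# rho = 0.5 are illustrative assumptions, not given in the problem.
rng = np.random.default_rng(0)
sigma2, rho = 1.0, 0.5
C_X = np.array([[sigma2,       rho * sigma2],
                [rho * sigma2, sigma2      ]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=C_X, size=200_000)

# Y_1 = X_i + X_{i+1}, Y_2 = -X_i + X_{i+1}  (the A found above).
A = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]])
Y = X @ A.T

r = np.corrcoef(Y[:, 0], Y[:, 1])[0, 1]
print(r)  # close to 0, up to Monte Carlo error
```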



Back to QE CS question 1, August 2013

Back to ECE Qualifying Exams (QE) page
