

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2013



Part 2

Let $ X_1,X_2,... $ be a sequence of jointly Gaussian random variables with covariance

$ Cov(X_i,X_j) = \left\{ \begin{array}{ll} {\sigma}^2, & i=j\\ \rho{\sigma}^2, & |i-j|=1\\ 0, & otherwise \end{array} \right. $

Suppose we take two consecutive samples from this sequence to form a vector $ X $, which is then linearly transformed to form a 2-dimensional random vector $ Y=AX $. Find a matrix $ A $ so that the components of $ Y $ are independent random variables. You must justify your answer.
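In terms of the covariance matrix of the two consecutive samples $ X=(X_i,X_{i+1})^T $, the given conditions amount to the following (a restatement for reference, not part of the original problem):

$ \Sigma_X=\left(\begin{array}{cc}\sigma^2 & \rho\sigma^2\\ \rho\sigma^2 & \sigma^2\end{array} \right),\qquad \Sigma_Y=A\Sigma_X A^T. $

Since $ Y $ is jointly Gaussian, its components are independent exactly when the off-diagonal entry of $ \Sigma_Y $ is zero.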


Solution 1

Suppose

$ A=\left(\begin{array}{cc} a & b\\ c & d \end{array} \right) $.

Then the new 2-D random vector can be expressed as

$ Y=\left(\begin{array}{c}Y_1 \\ Y_2\end{array} \right)=A\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}aX_i+bX_j \\ cX_i+dX_j\end{array} \right) $


Therefore,

$ \begin{array}{l}Cov(Y_1,Y_2)=E[(aX_i+bX_j-E(aX_i+bX_j))(cX_i+dX_j-E(cX_i+dX_j))] \\ =E[(aX_i+bX_j-aE(X_i)-bE(X_j))(cX_i+dX_j-cE(X_i)-dE(X_j))] \\ =E[acX_i^2+adX_iX_j-acX_iE(X_i)-adX_iE(X_j)+bcX_iX_j+bdX_j^2-bcX_jE(X_i)\\ -bdX_jE(X_j)-acX_iE(X_i)-adX_jE(X_i)+acE(X_i)^2+adE(X_i)E(X_j)\\ -bcX_iE(X_j)-bdX_jE(X_j)+bcE(X_i)E(X_j)+bdE(X_j)^2]\\ =E[ac(X_i-E(X_i))^2+(ad+bc)(X_i-E(X_i))(X_j-E(X_j))+bd(X_j-E(X_j))^2]\\ =ac\,Cov(X_i,X_i)+(ad+bc)Cov(X_i,X_j)+bd\,Cov(X_j,X_j)\\ =ac\sigma^2+(ad+bc)\rho\sigma^2+bd\sigma^2 \end{array} $

Here $ Cov(X_i,X_j)=\rho\sigma^2 $ because the two samples are consecutive, i.e. $ |i-j|=1 $. Since $ Y $ is a linear transform of jointly Gaussian random variables, $ Y_1 $ and $ Y_2 $ are jointly Gaussian, so they are independent if and only if $ Cov(Y_1,Y_2)=0 $. Setting the expression above equal to 0 and choosing $ a=b=d=1 $, we get $ c=-1 $.
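Explicitly (this intermediate step is spelled out here for clarity, assuming $ \rho\neq -1 $):

$ c\sigma^2+(1+c)\rho\sigma^2+\sigma^2=(1+c)(1+\rho)\sigma^2=0\;\Rightarrow\;c=-1. $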

Therefore, a solution is

$ A=\left(\begin{array}{cc} 1 & 1\\ -1 & 1 \end{array} \right) $.
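As a quick numerical sanity check (not part of the original solution; the values $ \sigma^2=2.25 $ and $ \rho=0.4 $ are arbitrary example choices), one can verify in Python that this $ A $ makes the covariance matrix of $ Y $ diagonal:

import numpy as np

# Example parameters (arbitrary illustrative values)
sigma2 = 2.25
rho = 0.4

# Covariance matrix of two consecutive samples (X_i, X_{i+1})
Sigma_X = sigma2 * np.array([[1.0, rho],
                             [rho, 1.0]])

# Matrix proposed in Solution 1
A = np.array([[ 1.0, 1.0],
              [-1.0, 1.0]])

# Covariance of Y = A X
Sigma_Y = A @ Sigma_X @ A.T
print(Sigma_Y)                      # off-diagonal entries are 0
assert abs(Sigma_Y[0, 1]) < 1e-12   # Y_1 and Y_2 are uncorrelated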



Solution 2

Assume

$ Y=\left(\begin{array}{c}Y_i \\ Y_j\end{array} \right)=A\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}a_{11}X_i+a_{12}X_j \\ a_{21}X_i+a_{22}X_j\end{array} \right) $.

Then

$ \begin{array}{l}E(Y_iY_j)=E[(a_{11}X_i+a_{12}X_j)(a_{21}X_i+a_{22}X_j)]\\ =a_{11}a_{21}\sigma^2+a_{12}a_{22}\sigma^2+(a_{11}a_{22}+a_{12}a_{21})E(X_iX_j) \end{array} $

For $ |i-j|\geq1 $, $ E(X_iX_j)=0 $. Therefore, $ a_{11}a_{21}+a_{12}a_{22}=0 $.

One such solution is

$ A=\left(\begin{array}{cc} 1 & -1\\ 1 & 1 \end{array} \right) $.


Critique on Solution 2:

1. $ E(Y_iY_j)=0 $ is not the condition for the two random variables to be independent; for jointly Gaussian components the condition is $ Cov(Y_i,Y_j)=E(Y_iY_j)-E(Y_i)E(Y_j)=0 $, and the means are not assumed to be zero.

2. "For $ |i-j|\geq1 $, $ E(X_iX_j)=0 $" is not supported by the given conditions: the problem only states that $ Cov(X_i,X_j)=\rho\sigma^2 $ for $ |i-j|=1 $ and $ 0 $ for $ |i-j|\geq 2 $, which says nothing about the raw moment $ E(X_iX_j) $ unless the means are zero.
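For comparison (an added note, not part of the original critique), redoing Solution 2's computation with covariances rather than raw moments, and using $ Cov(X_i,X_j)=\rho\sigma^2 $ for consecutive samples, gives

$ Cov(Y_i,Y_j)=a_{11}a_{21}\sigma^2+a_{12}a_{22}\sigma^2+(a_{11}a_{22}+a_{12}a_{21})\rho\sigma^2, $

so the correct requirement is $ a_{11}a_{21}+a_{12}a_{22}+(a_{11}a_{22}+a_{12}a_{21})\rho=0 $. The matrix proposed in Solution 2 happens to satisfy it, since $ 1\cdot 1+(-1)\cdot 1=0 $ and $ 1\cdot 1+(-1)\cdot 1=0 $.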


Back to QE CS question 1, August 2013

Back to ECE Qualifying Exams (QE) page
