<center>
<font size= 4>
ECE Ph.D. Qualifying Exam
</font>

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2013
</center>
----
=Part 2=
Let <math>X_1,X_2,...</math> be a sequence of jointly Gaussian random variables with covariance

<math>Cov(X_i,X_j) = \left\{ \begin{array}{ll}
{\sigma}^2, & i=j\\
\rho{\sigma}^2, & |i-j|=1\\
0, & \text{otherwise}
\end{array} \right.</math>

Suppose we take 2 consecutive samples from this sequence to form a vector <math>X</math>, which is then linearly transformed to form a 2-dimensional random vector <math>Y=AX</math>. Find a matrix <math>A</math> so that the components of <math>Y</math> are independent random variables. You must justify your answer.
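As a sanity check, the stated covariance structure can be simulated directly. The sketch below (parameter values are arbitrary assumptions; a small enough <math>|\rho|</math> keeps the tridiagonal matrix positive definite) builds the covariance matrix of <math>(X_1,...,X_n)</math> and draws one realization of the sequence.

```python
import numpy as np

# Example parameters (assumed for illustration; with |rho| <= 1/2 the
# tridiagonal covariance matrix below is positive definite).
n, sigma2, rho = 8, 1.0, 0.3

# Covariance: sigma^2 on the diagonal, rho*sigma^2 for |i-j| = 1, 0 elsewhere.
Sigma = (sigma2 * np.eye(n)
         + rho * sigma2 * (np.eye(n, k=1) + np.eye(n, k=-1)))

# One realization of (X_1, ..., X_n), taking zero means for simplicity.
rng = np.random.default_rng(1)
x = rng.multivariate_normal(np.zeros(n), Sigma)
print(x.shape)  # (8,)
```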
 
----
=Solution 1=

Suppose
  
<math>A=\left(\begin{array}{cc}
a & b\\
c & d
\end{array} \right)</math>.

Then the new 2-D random vector can be expressed as

<math>Y=\left(\begin{array}{c}Y_1 \\ Y_2\end{array} \right)=A\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}aX_i+bX_j \\ cX_i+dX_j\end{array} \right)</math>
  
Therefore,

<math>\begin{array}{l}Cov(Y_1,Y_2)=E[(aX_i+bX_j-E(aX_i+bX_j))(cX_i+dX_j-E(cX_i+dX_j))] \\
=E[(a(X_i-E(X_i))+b(X_j-E(X_j)))(c(X_i-E(X_i))+d(X_j-E(X_j)))] \\
=E[ac(X_i-E(X_i))^2+(ad+bc)(X_i-E(X_i))(X_j-E(X_j))+bd(X_j-E(X_j))^2]\\
=ac\,Cov(X_i,X_i)+(ad+bc)Cov(X_i,X_j)+bd\,Cov(X_j,X_j)\\
=ac\sigma^2+(ad+bc)\rho\sigma^2+bd\sigma^2
\end{array}</math>
  
Since <math>X</math> is jointly Gaussian, <math>Y=AX</math> is also jointly Gaussian, so <math>Y_1</math> and <math>Y_2</math> are independent exactly when <math>Cov(Y_1,Y_2)=0</math>. Setting the expression above to zero with <math>a=b=d=1</math> gives <math>(1+c)(1+\rho)\sigma^2=0</math>, so <math>c=-1</math> (assuming <math>\rho\neq -1</math>).

Therefore, a solution is

<math>A=\left(\begin{array}{cc}
1 & 1\\
-1 & 1
\end{array} \right)</math>.

<font color="red"><u>'''Comments on solution 1'''</u>

More detailed steps and explanations would make the argument easier to follow.
</font>
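Solution 1's matrix can be checked numerically: with <math>Cov(Y)=A\Sigma A^T</math>, the off-diagonal entries should vanish. This is a minimal sketch; the values of <math>\sigma^2</math> and <math>\rho</math> are arbitrary assumptions, not given in the problem.

```python
import numpy as np

# Example parameter values (assumed; the problem leaves sigma^2 and
# rho unspecified).
sigma2, rho = 2.0, 0.4

# Covariance of two consecutive samples X = (X_i, X_j), |i-j| = 1.
Sigma = np.array([[sigma2, rho * sigma2],
                  [rho * sigma2, sigma2]])

# Matrix from Solution 1.
A = np.array([[1.0, 1.0],
              [-1.0, 1.0]])

# Cov(Y) = A Sigma A^T; the off-diagonal entry is Cov(Y_1, Y_2).
cov_Y = A @ Sigma @ A.T
print(cov_Y[0, 1], cov_Y[1, 0])  # both are (numerically) zero
```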
  
 
----
==Solution 2==

Assume
  
<math>Y=\left(\begin{array}{c}Y_i \\ Y_j\end{array} \right)=A\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}a_{11}X_i+a_{12}X_j \\ a_{21}X_i+a_{22}X_j\end{array} \right)</math>.

Then

<math>\begin{array}{l}E(Y_iY_j)=E[(a_{11}X_i+a_{12}X_j)(a_{21}X_i+a_{22}X_j)]\\
=a_{11}a_{21}\sigma^2+a_{12}a_{22}\sigma^2+(a_{11}a_{22}+a_{12}a_{21})E(X_iX_j)
\end{array}</math>

For <math>|i-j|\geq 1</math>, <math>E(X_iX_j)=0</math>. Therefore, <math>a_{11}a_{21}+a_{12}a_{22}=0</math>.

One solution can be

<math>A=\left(\begin{array}{cc}
1 & -1\\
1 & 1
\end{array} \right)</math>.
  

<font color="red"><u>'''Critique on Solution 2:'''</u>

1. <math>E(Y_iY_j)=0</math> is not the condition for the two random variables to be independent.

2. "For <math>|i-j|\geq 1</math>, <math>E(X_iX_j)=0</math>" is not supported by the given conditions.
</font>
----
==Solution 3==
<math>Y=\left(\begin{array}{c}Y_1 \\ Y_2\end{array} \right)=AX=\left(\begin{array}{cc}
a & b\\
c & d
\end{array} \right)\left(\begin{array}{c}X_i \\ X_j\end{array} \right)=\left(\begin{array}{c}aX_i+bX_j \\ cX_i+dX_j\end{array} \right)</math>

We know that any linear transformation of a jointly Gaussian random vector is again jointly Gaussian.

Thus, <math>Y_1,Y_2</math> are jointly Gaussian random variables. If they are uncorrelated, then they are also independent.

We therefore require the correlation coefficient to be zero:

<math>r = \frac{Cov(Y_1,Y_2)}{\sigma_{Y_1}\sigma_{Y_2}} = 0,</math>

which indicates that

<math>E(Y_1Y_2) - E(Y_1)E(Y_2) = 0.</math>

Since the two samples are consecutive, <math>|i-j|=1</math>.

<math>\begin{array}{l}E(Y_1Y_2) = E[(aX_i+bX_j)(cX_i+dX_j)] = E[acX_i^2+adX_iX_j+bcX_jX_i+bdX_j^2] \\
= ac({\sigma}^{2}+{E(X_i)}^2)+ad(\rho{\sigma}^{2}+E(X_i)E(X_j))+bc(\rho{\sigma}^{2}+E(X_i)E(X_j))+bd({\sigma}^{2}+{E(X_j)}^2)\end{array}</math>

<math>\begin{array}{l}E(Y_1)E(Y_2) = E(aX_i+bX_j)E(cX_i+dX_j) = (aE(X_i)+bE(X_j))(cE(X_i)+dE(X_j)) \\
= ac{E(X_i)}^{2}+adE(X_i)E(X_j)+bcE(X_i)E(X_j)+bd{E(X_j)}^{2}\end{array}</math>

Therefore,

<math>E(Y_1Y_2) - E(Y_1)E(Y_2) = ac{\sigma}^{2}+(ad+bc)\rho{\sigma}^{2}+bd{\sigma}^{2} = 0</math>

A sufficient way to satisfy this for any <math>\rho</math> is

<math>\left\{ \begin{array}{ll}  ac = -bd\\  ad=-bc  \end{array} \right.</math>

Thus, one of the possible solutions is <math>a = 1, b = -1, c = 1, d = 1</math>.
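Solution 3's choice (<math>Y_1 = X_i - X_j</math>, <math>Y_2 = X_i + X_j</math>) can also be checked by Monte Carlo. This sketch assumes example values for <math>\sigma^2</math>, <math>\rho</math>, and a common mean (none are specified in the problem) and verifies that the empirical correlation of <math>Y_1</math> and <math>Y_2</math> is near zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Example parameters (assumed; not given in the problem statement).
sigma2, rho, mu = 1.5, -0.3, 2.0
Sigma = sigma2 * np.array([[1.0, rho],
                           [rho, 1.0]])

# Draw many consecutive-sample pairs (X_i, X_j) with |i-j| = 1.
X = rng.multivariate_normal([mu, mu], Sigma, size=200_000)

# Solution 3's transform: Y_1 = X_i - X_j, Y_2 = X_i + X_j.
Y1 = X[:, 0] - X[:, 1]
Y2 = X[:, 0] + X[:, 1]

# The empirical correlation should be close to zero.
r = np.corrcoef(Y1, Y2)[0, 1]
print(abs(r) < 0.02)  # True
```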

=Similar Question=
Let <math>X_1,X_2,...</math> be a sequence of jointly Gaussian random variables with the same mean <math>\mu</math> and with the covariance function below, where <math>|\rho|<1</math>:

<math>Cov(X_i,X_j) = \left\{ \begin{array}{ll}
{\sigma}^2, & i=j\\
\rho{\sigma}^2, & |i-j|=1\\
0, & \text{otherwise}
\end{array} \right.</math>

Find the mean and variance of <math>S_n = X_1 + ...+ X_n</math>.
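As a hint, <math>E(S_n)=n\mu</math> by linearity, and <math>Var(S_n)=\sum_{i,j}Cov(X_i,X_j)</math>, i.e. the sum of all entries of the tridiagonal covariance matrix. The sketch below (example parameter values assumed) checks the resulting closed form <math>n\sigma^2+2(n-1)\rho\sigma^2</math> numerically.

```python
import numpy as np

# Example parameters (assumed for illustration).
n, sigma2, rho, mu = 6, 1.2, 0.5, 3.0

# Tridiagonal covariance matrix of (X_1, ..., X_n).
Sigma = (sigma2 * np.eye(n)
         + rho * sigma2 * (np.eye(n, k=1) + np.eye(n, k=-1)))

# Var(S_n) = 1^T Sigma 1, the sum of all covariance entries,
# and E(S_n) = n * mu by linearity.
ones = np.ones(n)
var_Sn = ones @ Sigma @ ones
mean_Sn = n * mu
print(np.isclose(var_Sn, n * sigma2 + 2 * (n - 1) * rho * sigma2))  # True
```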
 
----
[[ECE-QE_CS1-2013|Back to QE CS question 1, August 2013]]
  
 
[[ECE_PhD_Qualifying_Exams|Back to ECE Qualifying Exams (QE) page]]
 
