7.4 QE 2002 January

1. (20 pts)

Given two coins: the first coin is fair and the second coin has two heads. One coin is picked at random and tossed two times. It shows heads both times. What is the probability that the coin picked is fair?
  
• <math class="inline">F=\left\{ \text{fair coin is selected}\right\}</math>

• <math class="inline">S=\left\{ \text{the coin that has two heads is selected}\right\}</math>

• <math class="inline">H2=\left\{ \text{heads are shown both times}\right\}</math>

• <math class="inline">P\left(H2|F\right)=\frac{1}{4},\; P\left(H2|S\right)=1,\; P\left(F\right)=P\left(S\right)=\frac{1}{2}.</math>

• By using Bayes' theorem, <math class="inline">P\left(F|H2\right)=\frac{P\left(H2|F\right)P\left(F\right)}{P\left(H2|F\right)P\left(F\right)+P\left(H2|S\right)P\left(S\right)}=\frac{P\left(H2|F\right)}{P\left(H2|F\right)+P\left(H2|S\right)}=\frac{\frac{1}{4}}{\frac{1}{4}+1}=\frac{1}{5}.</math>
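
As a sanity check on this answer (an addition, not part of the original solution), here is a minimal Monte Carlo sketch in Python, assuming NumPy is available; the seed and trial count are arbitrary choices. It picks a coin uniformly at random, tosses it twice, and estimates <math class="inline">P\left(F|H2\right)</math> as a conditional relative frequency, which should come out near 1/5.

<pre>
import numpy as np

rng = np.random.default_rng(0)
trials = 1_000_000

# 0 = fair coin, 1 = two-headed coin, picked uniformly at random
coin = rng.integers(0, 2, size=trials)

# Per-toss probability of heads: 1/2 for the fair coin, 1 for the two-headed one
p_heads = np.where(coin == 0, 0.5, 1.0)

# Two independent tosses; H2 = heads are shown both times
both_heads = (rng.random(trials) < p_heads) & (rng.random(trials) < p_heads)

# Conditional relative frequency P(F | H2); theory predicts 1/5
print((coin[both_heads] == 0).mean())  # approximately 0.2
</pre>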
  
 
2. (20 pts)
  
Let <math class="inline">\mathbf{X}_{t}</math> and <math class="inline">\mathbf{Y}_{t}</math> be jointly wide sense stationary continuous parameter random processes with <math class="inline">E\left[\left|\mathbf{X}\left(0\right)-\mathbf{Y}\left(0\right)\right|^{2}\right]=0</math>. Show that <math class="inline">R_{\mathbf{X}}\left(\tau\right)=R_{\mathbf{Y}}\left(\tau\right)=R_{\mathbf{XY}}\left(\tau\right)</math>.
  
<math class="inline">E\left[\mathbf{X}\left(t\right)\left(\mathbf{X}^{\star}\left(t+\tau\right)-\mathbf{Y}^{\star}\left(t+\tau\right)\right)\right]=E\left[\mathbf{X}\left(t\right)\mathbf{X}^{\star}\left(t+\tau\right)\right]-E\left[\mathbf{X}\left(t\right)\mathbf{Y}^{\star}\left(t+\tau\right)\right]=R_{\mathbf{X}}\left(\tau\right)-R_{\mathbf{XY}}\left(\tau\right).</math>

<math class="inline">E\left[\left|\mathbf{X}\left(t\right)\right|^{2}\right]=E\left[\mathbf{X}\left(t\right)\mathbf{X}^{\star}\left(t\right)\right]=R_{\mathbf{X}}\left(0\right).</math>

<math class="inline">E\left[\left|\mathbf{X}\left(t+\tau\right)-\mathbf{Y}\left(t+\tau\right)\right|^{2}\right]=E\left[\left(\mathbf{X}\left(t+\tau\right)-\mathbf{Y}\left(t+\tau\right)\right)\left(\mathbf{X}^{\star}\left(t+\tau\right)-\mathbf{Y}^{\star}\left(t+\tau\right)\right)\right]</math><math class="inline">=R_{\mathbf{X}}\left(0\right)-R_{\mathbf{YX}}\left(0\right)-R_{\mathbf{XY}}\left(0\right)+R_{\mathbf{Y}}\left(0\right)</math><math class="inline">=E\left[\mathbf{X}\left(0\right)\mathbf{X}^{\star}\left(0\right)\right]-E\left[\mathbf{Y}\left(0\right)\mathbf{X}^{\star}\left(0\right)\right]-E\left[\mathbf{X}\left(0\right)\mathbf{Y}^{\star}\left(0\right)\right]+E\left[\mathbf{Y}\left(0\right)\mathbf{Y}^{\star}\left(0\right)\right]</math><math class="inline">=E\left[\mathbf{X}\left(0\right)\mathbf{X}^{\star}\left(0\right)-\mathbf{Y}\left(0\right)\mathbf{X}^{\star}\left(0\right)-\mathbf{X}\left(0\right)\mathbf{Y}^{\star}\left(0\right)+\mathbf{Y}\left(0\right)\mathbf{Y}^{\star}\left(0\right)\right]</math><math class="inline">=E\left[\left(\mathbf{X}\left(0\right)-\mathbf{Y}\left(0\right)\right)\left(\mathbf{X}\left(0\right)-\mathbf{Y}\left(0\right)\right)^{\star}\right]=E\left[\left|\mathbf{X}\left(0\right)-\mathbf{Y}\left(0\right)\right|^{2}\right].</math>

By the Cauchy-Schwarz inequality, <math class="inline">\left|R_{\mathbf{X}}\left(\tau\right)-R_{\mathbf{XY}}\left(\tau\right)\right|^{2}\leq R_{\mathbf{X}}\left(0\right)E\left[\left|\mathbf{X}\left(0\right)-\mathbf{Y}\left(0\right)\right|^{2}\right]=0</math>.

<math class="inline">\therefore\; R_{\mathbf{X}}\left(\tau\right)=R_{\mathbf{XY}}\left(\tau\right).</math> Similarly,

<math class="inline">\left|E\left[\left(\mathbf{X}\left(t\right)-\mathbf{Y}\left(t\right)\right)\mathbf{Y}^{\star}\left(t+\tau\right)\right]\right|^{2}\leq E\left[\left|\mathbf{X}\left(t\right)-\mathbf{Y}\left(t\right)\right|^{2}\right]E\left[\left|\mathbf{Y}\left(t+\tau\right)\right|^{2}\right]</math>, so <math class="inline">\left|R_{\mathbf{XY}}\left(\tau\right)-R_{\mathbf{Y}}\left(\tau\right)\right|^{2}\leq E\left[\left|\mathbf{X}\left(0\right)-\mathbf{Y}\left(0\right)\right|^{2}\right]R_{\mathbf{Y}}\left(0\right)=0.</math>

<math class="inline">\therefore\; R_{\mathbf{XY}}\left(\tau\right)=R_{\mathbf{Y}}\left(\tau\right).</math>

Thus, <math class="inline">R_{\mathbf{X}}\left(\tau\right)=R_{\mathbf{Y}}\left(\tau\right)=R_{\mathbf{XY}}\left(\tau\right).</math>
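
The key tool in this proof is the Cauchy-Schwarz inequality <math class="inline">\left|E\left[\mathbf{U}\mathbf{V}^{\star}\right]\right|^{2}\leq E\left[\left|\mathbf{U}\right|^{2}\right]E\left[\left|\mathbf{V}\right|^{2}\right]</math>. The short Python sketch below (an added illustration, assuming NumPy; the particular construction of U and V is arbitrary) checks it numerically for complex samples. For sample averages the inequality holds exactly, since they define an inner product on the sample vectors.

<pre>
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Two dependent complex random variables U and V (arbitrary construction)
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)
u = z + 0.3 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
v = 2.0 * np.conj(z) - 1.0

lhs = np.abs(np.mean(u * np.conj(v))) ** 2             # |E[U V*]|^2
rhs = np.mean(np.abs(u) ** 2) * np.mean(np.abs(v) ** 2)  # E[|U|^2] E[|V|^2]
print(lhs <= rhs, lhs, rhs)  # always True for empirical means
</pre>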
  
 
3. (20 pts)
  
Let <math class="inline">\mathbf{X}_{t}</math> be a zero mean continuous parameter random process. Let <math class="inline">g(t)</math> and <math class="inline">w\left(t\right)</math> be measurable functions defined on the real numbers. Further, let <math class="inline">w\left(t\right)</math> be even. Let the autocorrelation function of <math class="inline">\mathbf{X}_{t}</math> be <math class="inline">\frac{g\left(t_{1}\right)g\left(t_{2}\right)}{w\left(t_{1}-t_{2}\right)}</math>. Form the new random process <math class="inline">\mathbf{Y}_{t}=\frac{\mathbf{X}\left(t\right)}{g\left(t\right)}</math>. Is <math class="inline">\mathbf{Y}_{t}</math> w.s.s.?
  
<math class="inline">E\left[\mathbf{Y}\left(t\right)\right]=E\left[\frac{\mathbf{X}\left(t\right)}{g\left(t\right)}\right]=\frac{1}{g\left(t\right)}E\left[\mathbf{X}\left(t\right)\right]=0.</math>
  
<math class="inline">E\left[\mathbf{Y}\left(t_{1}\right)\mathbf{Y}^{\star}\left(t_{2}\right)\right]=E\left[\frac{\mathbf{X}\left(t_{1}\right)\mathbf{X}^{\star}\left(t_{2}\right)}{g\left(t_{1}\right)g\left(t_{2}\right)}\right]=\frac{1}{g\left(t_{1}\right)g\left(t_{2}\right)}E\left[\mathbf{X}\left(t_{1}\right)\mathbf{X}^{\star}\left(t_{2}\right)\right]</math><math class="inline">=\frac{1}{g\left(t_{1}\right)g\left(t_{2}\right)}\times\frac{g\left(t_{1}\right)g\left(t_{2}\right)}{w\left(t_{1}-t_{2}\right)}=\frac{1}{w\left(t_{1}-t_{2}\right)},</math>
  
which depends on <math class="inline">t_{1}</math> and <math class="inline">t_{2}</math> only through the difference <math class="inline">t_{1}-t_{2}</math>.
 
<math class="inline">\therefore\;\mathbf{Y}_{t}\text{ is wide-sense stationary.}</math>
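
To see this conclusion numerically, here is a small discrete-time sketch (an added illustration, assuming NumPy; the AR(1) core process, the choice of g, and all parameters are arbitrary). Taking a stationary process Z with <math class="inline">R_{\mathbf{Z}}\left(k\right)=a^{\left|k\right|}</math> and setting <math class="inline">\mathbf{X}\left(t\right)=g\left(t\right)\mathbf{Z}\left(t\right)</math> gives an autocorrelation of exactly the form above with <math class="inline">w\left(k\right)=a^{-\left|k\right|}</math>, and normalizing by g recovers the stationary process.

<pre>
import numpy as np

rng = np.random.default_rng(2)
trials, T, a = 20_000, 200, 0.9

# Stationary AR(1) core process Z with R_Z(k) = a^|k|, i.e. w(k) = a^{-|k|}
z = np.empty((trials, T))
z[:, 0] = rng.standard_normal(trials)
for t in range(1, T):
    z[:, t] = a * z[:, t - 1] + np.sqrt(1 - a**2) * rng.standard_normal(trials)

g = 1.0 + 0.5 * np.sin(np.arange(T))  # arbitrary nonvanishing g(t)
x = g * z   # autocorrelation g(t1) g(t2) / w(t1 - t2): X is not WSS
y = x / g   # Y(t) = X(t) / g(t)

# Estimated E[Y(t) Y(t + 5)] at several different t: all close to a^5
for t in (10, 60, 120):
    print(t, np.mean(y[:, t] * y[:, t + 5]))
print("a^5 =", a**5)
</pre>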
 
 
4. (20 pts)
  
Let <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}</math> be i.i.d. random variables with absolutely continuous probability distribution function <math class="inline">F\left(x\right)</math>. Let the random variable <math class="inline">\mathbf{Y}_{j}</math> be the <math class="inline">j</math>-th order statistic of the <math class="inline">\mathbf{X}_{i}</math>'s; that is, <math class="inline">\mathbf{Y}_{j}=j\text{-th smallest of }\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\}</math>.
  
 
(a)

What is another name for the first order statistic?

minimum

(b)

What is another name for the n/2 order statistic?

sample median

(c)

Find the probability density function of the first order statistic. (You may assume n is odd.)
  
<math class="inline">F_{\mathbf{Y}_{1}}\left(y\right)=P\left(\left\{ \mathbf{Y}_{1}\leq y\right\} \right)=1-P\left(\left\{ \mathbf{Y}_{1}>y\right\} \right)</math><math class="inline">=1-P\left(\left\{ \mathbf{X}_{1}>y\right\} \cap\left\{ \mathbf{X}_{2}>y\right\} \cap\cdots\cap\left\{ \mathbf{X}_{n}>y\right\} \right)</math><math class="inline">=1-\prod_{i=1}^{n}P\left(\mathbf{X}_{i}>y\right)=1-\left(1-F_{\mathbf{X}}\left(y\right)\right)^{n}.</math>
  
<math class="inline">f_{\mathbf{Y}_{1}}\left(y\right)=\frac{d}{dy}F_{\mathbf{Y}_{1}}\left(y\right)=n\left(1-F_{\mathbf{X}}\left(y\right)\right)^{n-1}f_{\mathbf{X}}\left(y\right).</math>
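
As a numerical check of this density (an added illustration, assuming NumPy and taking <math class="inline">\mathbf{X}_{i}\sim\text{Exp}(1)</math>, an arbitrary choice): here <math class="inline">f_{\mathbf{Y}_{1}}\left(y\right)=n\left(e^{-y}\right)^{n-1}e^{-y}=ne^{-ny}</math>, so the minimum is itself exponential with rate n, with mean 1/n and variance 1/n².

<pre>
import numpy as np

rng = np.random.default_rng(3)
n, trials = 5, 200_000

# Minimum of n i.i.d. Exp(1) samples in each trial
y1 = rng.exponential(1.0, size=(trials, n)).min(axis=1)

# f_{Y_1}(y) = n(1 - F(y))^{n-1} f(y) = n e^{-n y}, i.e. Y_1 ~ Exp(n):
# compare empirical mean and variance against 1/n and 1/n^2
print(y1.mean(), 1 / n)     # ~0.2
print(y1.var(), 1 / n**2)   # ~0.04
</pre>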
  
 
5. (20 pts)
  
Let <math class="inline">\mathbf{X}</math> be a random variable with absolutely continuous probability distribution function. Show that for any <math class="inline">\alpha>0</math> and any real number <math class="inline">s</math>: <math class="inline">P\left(e^{s\mathbf{X}}\geq\alpha\right)\leq\frac{\phi\left(s\right)}{\alpha}</math>, where <math class="inline">\phi\left(s\right)</math> is the moment generating function, <math class="inline">\phi\left(s\right)=E\left[e^{s\mathbf{X}}\right]</math>. Note: <math class="inline">\phi\left(s\right)</math> can be related to the Laplace Transform of <math class="inline">f_{\mathbf{X}}\left(x\right)</math>.
  
 
Note
 
This is similar to the proof of the [[ECE 600 Chebyshev Inequality|Chebyshev Inequality]].
  
<math class="inline">g_{1}\left(x\right)=\mathbf{1}_{\left\{ x:\, e^{sx}\geq\alpha\right\} }\left(x\right),\; g_{2}\left(x\right)=\frac{e^{sx}}{\alpha}.</math>
  
 
[[Image:pasted19.png]]

Since <math class="inline">g_{1}\left(x\right)\leq g_{2}\left(x\right)</math> for every <math class="inline">x</math> (as the figure illustrates), taking expectations gives
  
<math class="inline">E\left[g_{2}\left(\mathbf{X}\right)-g_{1}\left(\mathbf{X}\right)\right]=E\left[g_{2}\left(\mathbf{X}\right)\right]-E\left[g_{1}\left(\mathbf{X}\right)\right]=\frac{\phi\left(s\right)}{\alpha}-P\left(\left\{ e^{s\mathbf{X}}\geq\alpha\right\} \right)\geq0.</math>
  
<math class="inline">\therefore\; P\left(\left\{ e^{s\mathbf{X}}\geq\alpha\right\} \right)\leq\frac{\phi\left(s\right)}{\alpha}.</math>
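
A quick numerical illustration of the bound (added here, assuming NumPy; <math class="inline">\mathbf{X}\sim N(0,1)</math> and the values of s and α are arbitrary choices, with <math class="inline">\phi\left(s\right)=e^{s^{2}/2}</math> for the standard normal):

<pre>
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)
s, alpha = 1.0, 3.0

empirical = np.mean(np.exp(s * x) >= alpha)  # P(e^{sX} >= alpha), ~0.136 here
bound = np.exp(s**2 / 2) / alpha             # phi(s)/alpha for X ~ N(0,1), ~0.55
print(empirical, "<=", bound)
</pre>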
  
 
----

Back to ECE600

Back to ECE 600 QE