==7.6 QE 2003 January==

'''Problem 1 (30 points)'''

'''i)'''

Let <math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  be jointly Gaussian (normal) distributed random variables with mean <math class="inline">0</math> , <math class="inline">E\left[\mathbf{X}^{2}\right]=E\left[\mathbf{Y}^{2}\right]=\sigma^{2}</math>  and <math class="inline">E\left[\mathbf{XY}\right]=\rho\sigma^{2}</math>  with <math class="inline">\left|\rho\right|<1</math> . Find the joint characteristic function <math class="inline">E\left[e^{i\left(h_{1}\mathbf{X}+h_{2}\mathbf{Y}\right)}\right]</math> .
  
• We can find the correlation coefficient using the covariance and variances <math class="inline">r=\frac{Cov\left(\mathbf{X},\mathbf{Y}\right)}{\sigma^{2}}=\frac{E\left[\mathbf{XY}\right]-E\left[\mathbf{X}\right]E\left[\mathbf{Y}\right]}{\sigma^{2}}=\frac{\rho\sigma^{2}-0\cdot0}{\sigma^{2}}=\rho.</math>  
  
• Now, we can get the joint characteristic function <math class="inline">\Phi_{\mathbf{X}\mathbf{Y}}\left(\omega_{1},\omega_{2}\right)=e^{i\left(0\cdot\omega_{1}+0\cdot\omega_{2}\right)}e^{-\frac{1}{2}\left(\sigma^{2}\omega_{1}^{2}+2r\sigma^{2}\omega_{1}\omega_{2}+\sigma^{2}\omega_{2}^{2}\right)}=e^{-\frac{1}{2}\sigma^{2}\left(\omega_{1}^{2}+2\rho\omega_{1}\omega_{2}+\omega_{2}^{2}\right)}.</math>  
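
• As a quick numerical sanity check (not part of the original solution), the closed form above can be compared against a Monte Carlo estimate of the characteristic function; the parameter values in the sketch below are arbitrary examples, not taken from the exam.

<pre>
# Monte Carlo sanity check of the joint characteristic function derived above.
# sigma, rho, w1, w2 are arbitrary example values (assumptions, not from the exam).
import numpy as np

rng = np.random.default_rng(0)
sigma, rho = 1.5, 0.6
n = 1_000_000

# Draw (X, Y) jointly Gaussian, zero mean, covariance sigma^2 * [[1, rho], [rho, 1]].
cov = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

w1, w2 = 0.7, -0.3
empirical = np.mean(np.exp(1j * (w1 * x + w2 * y)))
closed_form = np.exp(-0.5 * sigma**2 * (w1**2 + 2 * rho * w1 * w2 + w2**2))
print(empirical, closed_form)  # real parts agree to roughly 3 decimals
</pre>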
  
 
'''ii)'''
  
Let <math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  be two jointly Gaussian distributed r.v's with identical means and variances, but not necessarily independent. Show that the r.v. <math class="inline">\mathbf{V}=\mathbf{X}+\mathbf{Y}</math>  is independent of the r.v. <math class="inline">\mathbf{W}=\mathbf{X}-\mathbf{Y}</math> . Is the same answer true for <math class="inline">\mathbf{A}=f\left(\mathbf{V}\right)</math>  and <math class="inline">\mathbf{B}=g\left(\mathbf{W}\right)</math>  where <math class="inline">f\left(\cdot\right)</math>  and <math class="inline">g\left(\cdot\right)</math>  are suitable functions such that <math class="inline">E\left[f\left(\mathbf{V}\right)\right]<\infty</math>  and <math class="inline">E\left[g\left(\mathbf{W}\right)\right]<\infty</math>? Give reasons.
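
• One possible line of attack (a sketch, not a full graded solution): <math class="inline">\mathbf{V}</math>  and <math class="inline">\mathbf{W}</math>  are linear functions of the jointly Gaussian pair <math class="inline">\left(\mathbf{X},\mathbf{Y}\right)</math>, hence jointly Gaussian, and <math class="inline">Cov\left(\mathbf{V},\mathbf{W}\right)=E\left[\left(\mathbf{X}+\mathbf{Y}\right)\left(\mathbf{X}-\mathbf{Y}\right)\right]-E\left[\mathbf{X}+\mathbf{Y}\right]E\left[\mathbf{X}-\mathbf{Y}\right]=Var\left(\mathbf{X}\right)-Var\left(\mathbf{Y}\right)=0</math>  because the variances are identical. For jointly Gaussian random variables, zero covariance implies independence; and since measurable functions of independent random variables remain independent, the same answer holds for <math class="inline">\mathbf{A}=f\left(\mathbf{V}\right)</math>  and <math class="inline">\mathbf{B}=g\left(\mathbf{W}\right)</math> .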
  
 
'''iii)'''
  
Let <math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  be independent <math class="inline">N\left(m,1\right)</math>  random variables. Show that the sample mean <math class="inline">\mathbf{M}=\frac{\mathbf{X}+\mathbf{Y}}{2}</math>  is independent of the sample variance <math class="inline">\mathbf{V}=\left(\mathbf{X}-\mathbf{M}\right)^{2}+\left(\mathbf{Y}-\mathbf{M}\right)^{2}</math> . Note: <math class="inline">\mathbf{V}</math>  is not a Gaussian random variable.
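
• A sketch of one approach: note that <math class="inline">\mathbf{X}-\mathbf{M}=\frac{\mathbf{X}-\mathbf{Y}}{2}</math>  and <math class="inline">\mathbf{Y}-\mathbf{M}=-\frac{\mathbf{X}-\mathbf{Y}}{2}</math>, so <math class="inline">\mathbf{V}=\frac{\left(\mathbf{X}-\mathbf{Y}\right)^{2}}{2}</math>  is a function of <math class="inline">\mathbf{X}-\mathbf{Y}</math>  alone, while <math class="inline">\mathbf{M}</math>  is a function of <math class="inline">\mathbf{X}+\mathbf{Y}</math>  alone. Independent <math class="inline">N\left(m,1\right)</math>  variables are jointly Gaussian with identical means and variances, so part ii) gives independence of <math class="inline">\mathbf{X}+\mathbf{Y}</math>  and <math class="inline">\mathbf{X}-\mathbf{Y}</math>, hence of <math class="inline">\mathbf{M}</math>  and <math class="inline">\mathbf{V}</math> .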
  
 
'''Problem 2 (35 points)'''
  
Consider the stochastic process <math class="inline">\left\{ \mathbf{X}_{n}\right\}</math>  defined by <math class="inline">\mathbf{X}_{n+1}=a\mathbf{X}_{n}+b\mathbf{W}_{n}</math>, where <math class="inline">\mathbf{X}_{0}\sim N\left(0,\sigma^{2}\right)</math>  and <math class="inline">\left\{ \mathbf{W}_{n}\right\}</math>  is an i.i.d.  <math class="inline">N\left(0,1\right)</math>  sequence of r.v's independent of <math class="inline">\mathbf{X}_{0}</math> .
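
• Since <math class="inline">\mathbf{W}_{n}</math>  is independent of <math class="inline">\mathbf{X}_{n}</math>, the recursion implies <math class="inline">Var\left(\mathbf{X}_{n+1}\right)=a^{2}Var\left(\mathbf{X}_{n}\right)+b^{2}</math> . The simulation sketch below illustrates this variance recursion numerically; the values of <math class="inline">a</math>, <math class="inline">b</math>  and <math class="inline">\sigma</math>  are arbitrary examples, not taken from the exam.

<pre>
# Simulation sketch of the variance recursion Var(X_{k+1}) = a^2 Var(X_k) + b^2.
# a, b, sigma are arbitrary example values (assumptions, not from the exam).
import numpy as np

rng = np.random.default_rng(0)
a, b, sigma = 0.8, 1.0, 2.0
paths, steps = 200_000, 30

x = rng.normal(0.0, sigma, size=paths)      # X_0 ~ N(0, sigma^2)
r = sigma**2                                # R_0, tracked via the recursion
for _ in range(steps):
    x = a * x + b * rng.normal(size=paths)  # X_{n+1} = a X_n + b W_n
    r = a**2 * r + b**2                     # variance recursion
# With |a| < 1 both quantities approach the fixed point b^2 / (1 - a^2).
print(x.var(), r, b**2 / (1 - a**2))
</pre>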
  
 
'''i)'''
  
Show that if <math class="inline">R_{k}=cov\left(\mathbf{X}_{k},\mathbf{X}_{k}\right)</math>  converges as <math class="inline">k\rightarrow\infty</math> , then <math class="inline">\left\{ \mathbf{X}_{k}\right\}</math>  converges to a w.s.s. process.
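
• A useful observation (sketch): every <math class="inline">\mathbf{X}_{k}</math>  has mean zero, and for <math class="inline">n\geq0</math>, <math class="inline">cov\left(\mathbf{X}_{k+n},\mathbf{X}_{k}\right)=a^{n}R_{k}</math>  because the noise terms entering after time <math class="inline">k</math>  are independent of <math class="inline">\mathbf{X}_{k}</math> . So if <math class="inline">R_{k}\rightarrow R</math>, the second-order statistics depend only on the lag <math class="inline">n</math>  in the limit, which is exactly what wide-sense stationarity requires.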
  
 
'''ii)'''
  
Show that if <math class="inline">\sigma^{2}</math>  is chosen appropriately and <math class="inline">\left|a\right|<1</math> , then <math class="inline">\left\{ \mathbf{X}_{k}\right\}</math>  will be a stationary process for all <math class="inline">k</math> .
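
• Sketch of the idea: the fixed point of the recursion <math class="inline">R_{k+1}=a^{2}R_{k}+b^{2}</math>  is <math class="inline">R^{*}=\frac{b^{2}}{1-a^{2}}</math>, which is positive and finite when <math class="inline">\left|a\right|<1</math> . Choosing <math class="inline">\sigma^{2}=\frac{b^{2}}{1-a^{2}}</math>  makes <math class="inline">R_{k}=\sigma^{2}</math>  for every <math class="inline">k</math>, and since the process is zero-mean and Gaussian, its finite-dimensional distributions are then determined by the lags alone, giving stationarity.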
  
 
'''iii)'''
  
If <math class="inline">\left|a\right|>1</math> , show that the variance of the process <math class="inline">\left\{ \mathbf{X}_{k}\right\}</math>  diverges but <math class="inline">\frac{\mathbf{X}_{k}}{\left|a\right|^{k}}</math>  converges in the mean square.
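
• Sketch: iterating the recursion gives <math class="inline">\mathbf{X}_{k}=a^{k}\mathbf{X}_{0}+b\sum_{j=0}^{k-1}a^{k-1-j}\mathbf{W}_{j}</math>, so <math class="inline">Var\left(\mathbf{X}_{k}\right)=a^{2k}\sigma^{2}+b^{2}\frac{a^{2k}-1}{a^{2}-1}</math>, which diverges when <math class="inline">\left|a\right|>1</math> . On the other hand, <math class="inline">\frac{\mathbf{X}_{k}}{a^{k}}=\mathbf{X}_{0}+b\sum_{j=0}^{k-1}a^{-\left(j+1\right)}\mathbf{W}_{j}</math>, and since <math class="inline">\sum_{j}a^{-2\left(j+1\right)}<\infty</math>  the partial sums form a Cauchy sequence in mean square; the case <math class="inline">a<-1</math>  is handled analogously after tracking the sign.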
  
 
'''Problem 3 (35 points)'''
 
'''i)'''
  
Catastrophes occur at times <math class="inline">\mathbf{T}_{1},\mathbf{T}_{2},\cdots</math>,  where <math class="inline">\mathbf{T}_{i}=\sum_{k=1}^{i}\mathbf{X}_{k}</math>  and the <math class="inline">\mathbf{X}_{k}</math>'s are independent, identically distributed positive random variables. Let <math class="inline">\mathbf{N}_{t}=\max\left\{ n:\mathbf{T}_{n}\leq t\right\}</math>  be the number of catastrophes which have occurred by time <math class="inline">t</math> . Show that if <math class="inline">E\left[\mathbf{X}_{1}\right]<\infty</math>  then <math class="inline">\mathbf{N}_{t}\rightarrow\infty</math>  almost surely (a.s.) and <math class="inline">\frac{\mathbf{N}_{t}}{t}\rightarrow\frac{1}{E\left[\mathbf{X}_{1}\right]}</math>  as <math class="inline">t\rightarrow\infty</math>  a.s.
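
• A sketch of the standard renewal argument: since the <math class="inline">\mathbf{X}_{k}</math>'s are positive with finite mean, each <math class="inline">\mathbf{T}_{n}</math>  is finite a.s., so for every <math class="inline">n</math>  we have <math class="inline">\mathbf{N}_{t}\geq n</math>  for all large <math class="inline">t</math>, i.e. <math class="inline">\mathbf{N}_{t}\rightarrow\infty</math>  a.s. By the strong law of large numbers <math class="inline">\frac{\mathbf{T}_{n}}{n}\rightarrow E\left[\mathbf{X}_{1}\right]</math>  a.s.; sandwiching <math class="inline">\mathbf{T}_{\mathbf{N}_{t}}\leq t<\mathbf{T}_{\mathbf{N}_{t}+1}</math>  and dividing through by <math class="inline">\mathbf{N}_{t}</math>  squeezes <math class="inline">\frac{t}{\mathbf{N}_{t}}</math>  to <math class="inline">E\left[\mathbf{X}_{1}\right]</math>, which is the claim.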
  
 
'''ii)'''
  
Let <math class="inline">\left\{ \mathbf{X}_{t},t\geq0\right\}</math>  be a stochastic process defined by: <math class="inline">\mathbf{X}_{t}=\sqrt{2}\cos\left(2\pi\xi t\right)</math> where <math class="inline">\xi</math>  is a <math class="inline">N\left(0,1\right)</math>  random variable. Show that as <math class="inline">t\rightarrow\infty,\;\left\{ \mathbf{X}_{t}\right\}</math>  converges to a wide sense stationary process. Find the spectral density of the limit process.
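
• Using the hint, <math class="inline">E\left[\mathbf{X}_{t}\mathbf{X}_{s}\right]=E\left[\cos\left(2\pi\xi\left(t-s\right)\right)\right]+E\left[\cos\left(2\pi\xi\left(t+s\right)\right)\right]=e^{-2\pi^{2}\left(t-s\right)^{2}}+e^{-2\pi^{2}\left(t+s\right)^{2}}</math>, and the second term vanishes as <math class="inline">t,s\rightarrow\infty</math>  with the lag <math class="inline">\tau=t-s</math>  fixed. The Monte Carlo sketch below (not part of the exam; the values of <math class="inline">t</math>  and <math class="inline">\tau</math>  are arbitrary) checks the limiting autocorrelation <math class="inline">e^{-2\pi^{2}\tau^{2}}</math>  numerically.

<pre>
# Monte Carlo check that E[X_t X_{t+tau}] approaches exp(-2*pi^2*tau^2) for large t.
# t and tau are arbitrary example values (assumptions, not from the exam).
import numpy as np

rng = np.random.default_rng(0)
xi = rng.normal(size=2_000_000)  # xi ~ N(0, 1)

t, tau = 50.0, 0.1               # large time, fixed small lag
x_t = np.sqrt(2.0) * np.cos(2.0 * np.pi * xi * t)
x_s = np.sqrt(2.0) * np.cos(2.0 * np.pi * xi * (t + tau))
print(np.mean(x_t * x_s), np.exp(-2.0 * np.pi**2 * tau**2))  # should nearly agree
</pre>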
  
 
'''Hint:'''
  
Use the fact that the characteristic function of a <math class="inline">N\left(0,1\right)</math>  r.v. is given by <math class="inline">E\left[e^{ih\mathbf{X}}\right]=e^{-\frac{h^{2}}{2}}</math> .
  
 
----
 
[[ECE600|Back to ECE600]]
  
[[ECE 600 QE|Back to my ECE 600 QE page]]

[[ECE_PhD_Qualifying_Exams|Back to the general ECE PHD QE page]] (for problem discussion)
