Revision as of 08:32, 27 June 2012 by Mboutin (Talk | contribs)


7.5 QE 2002 August

1. (25 Points)

Consider a random experiment in which a point is selected at random from the unit square (sample space $ S=\left[0,1\right]\times\left[0,1\right] $). Assume that all points in $ S $ are equally likely to be selected. Let the random variable $ \mathbf{X}\left(\omega\right) $ be the distance from the outcome $ \omega $ to the origin (the lower left corner of the unit square). Find the cumulative distribution function (cdf) $ F_{\mathbf{X}}\left(x\right)=P\left(\left\{ \mathbf{X}\leq x\right\} \right) $ of the random variable $ \mathbf{X} $. Be sure to specify your answer for all $ x\in\mathbf{R} $.

[Figure: the unit square $ S $ with the region $ \left\{ \mathbf{X}\leq x\right\} $ bounded by the arc of radius $ x $ centered at the origin]

$ F_{\mathbf{X}}\left(x\right)=P\left(\left\{ \mathbf{X}\leq x\right\} \right)=P\left(\left\{ \omega:\mathbf{X}\left(\omega\right)\leq x\right\} \right). $

$ i)\; x<0,\; F_{\mathbf{X}}\left(x\right)=0 $

$ ii)\;0\leq x\leq1,\; F_{\mathbf{X}}\left(x\right)=\frac{\pi}{4}x^{2} $

$ iii)\;1<x<\sqrt{2}: $ here the region $ \left\{ \mathbf{X}\leq x\right\} $ is the union of two right triangles with legs $ 1 $ and $ \sqrt{x^{2}-1} $ and a circular sector of radius $ x $ subtending the angle $ \frac{\pi}{2}-2\theta $, where $ \theta=\cos^{-1}\frac{1}{x} $. Thus $ F_{\mathbf{X}}\left(x\right)=2\left(\frac{1}{2}\times1\times\sqrt{x^{2}-1}\right)+\pi x^{2}\times\frac{\frac{\pi}{2}-2\theta}{2\pi} $$ =\sqrt{x^{2}-1}+\frac{\pi}{4}x^{2}-\theta x^{2}=\sqrt{x^{2}-1}+\frac{\pi}{4}x^{2}-x^{2}\cos^{-1}\frac{1}{x} $$ =\sqrt{x^{2}-1}+\left(\frac{\pi}{4}-\cos^{-1}\frac{1}{x}\right)x^{2}. $

$ iv)\; x\geq\sqrt{2},\; F_{\mathbf{X}}\left(x\right)=1 $

$ \therefore\; F_{\mathbf{X}}\left(x\right)=\begin{cases} \begin{array}{lll} 0 & & ,\; x<0\\ \frac{\pi}{4}x^{2} & & ,\;0\leq x\leq1\\ \sqrt{x^{2}-1}+\left(\frac{\pi}{4}-\cos^{-1}\frac{1}{x}\right)x^{2} & & ,\;1<x<\sqrt{2}\\ 1 & & ,\; x\geq\sqrt{2}. \end{array}\end{cases} $
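The piecewise cdf above can be sanity-checked by Monte Carlo: sample uniform points in the unit square and compare the empirical fraction within distance $ x $ of the origin to the closed form. The helper names and sample size below are arbitrary choices, not part of the original solution.

```python
import math
import random

def cdf_formula(x):
    """Closed-form CDF of the distance from a uniform point in [0,1]^2 to the origin."""
    if x < 0:
        return 0.0
    if x <= 1:
        return math.pi / 4 * x ** 2
    if x < math.sqrt(2):
        return math.sqrt(x ** 2 - 1) + (math.pi / 4 - math.acos(1 / x)) * x ** 2
    return 1.0

def cdf_empirical(x, n=200_000, seed=0):
    """Fraction of n uniform points in the unit square within distance x of the origin."""
    rng = random.Random(seed)
    count = sum(1 for _ in range(n)
                if math.hypot(rng.random(), rng.random()) <= x)
    return count / n

for x in (0.5, 1.0, 1.2, 1.4):
    print(x, cdf_formula(x), cdf_empirical(x))
```

Note that the two branch formulas agree at $ x=1 $ (both give $ \pi/4 $) and the third branch tends to $ 1 $ as $ x\to\sqrt{2} $, so the cdf is continuous.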

2. (25 Points)

Let $ \mathbf{X} $ and $ \mathbf{Y} $ be two jointly distributed Gaussian random variables. The random variable $ \mathbf{X} $ has mean $ \mu_{\mathbf{X}} $ and variance $ \sigma_{\mathbf{X}}^{2} $ . The correlation coefficient between $ \mathbf{X} $ and $ \mathbf{Y} $ is $ r $ . Define a new random variable $ \mathbf{Z} $ by $ \mathbf{Z}=a\mathbf{X}+b\mathbf{Y} $, where $ a $ and $ b $ are real numbers.

(a)

Prove that $ \mathbf{Z} $ is a Gaussian random variable.

$ \Phi_{\mathbf{Z}}\left(\omega\right)=E\left[e^{i\omega\mathbf{Z}}\right]=E\left[e^{i\omega\left(a\mathbf{X}+b\mathbf{Y}\right)}\right]=\Phi_{\mathbf{XY}}\left(a\omega,b\omega\right). $

$ \Phi_{\mathbf{XY}}\left(\omega_{1},\omega_{2}\right)=\exp\left[i\left(\mu_{\mathbf{X}}\omega_{1}+\mu_{\mathbf{Y}}\omega_{2}\right)-\frac{1}{2}\left(\sigma_{\mathbf{X}}^{2}\omega_{1}^{2}+2r\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}\omega_{1}\omega_{2}+\sigma_{\mathbf{Y}}^{2}\omega_{2}^{2}\right)\right]. $

$ \Phi_{\mathbf{Z}}\left(\omega\right)=\Phi_{\mathbf{XY}}\left(a\omega,b\omega\right)=\exp\left[i\left(a\mu_{\mathbf{X}}+b\mu_{\mathbf{Y}}\right)\omega-\frac{1}{2}\left(a^{2}\sigma_{\mathbf{X}}^{2}+2rab\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}+b^{2}\sigma_{\mathbf{Y}}^{2}\right)\omega^{2}\right], $

which is the characteristic function of a Gaussian random variable with mean $ a\mu_{\mathbf{X}}+b\mu_{\mathbf{Y}} $ and variance $ a^{2}\sigma_{\mathbf{X}}^{2}+2rab\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}+b^{2}\sigma_{\mathbf{Y}}^{2} $ .

(b)

Find the mean of $ \mathbf{Z} $ . Express your answer in terms of the parameters $ \mu_{\mathbf{X}} $ , $ \sigma_{\mathbf{X}}^{2} $ , $ \mu_{\mathbf{Y}} $ , $ \sigma_{\mathbf{Y}}^{2} $ , $ r $ , $ a $ , and $ b $ .

$ E\left[\mathbf{Z}\right]=a\mu_{\mathbf{X}}+b\mu_{\mathbf{Y}}. $

(c)

Find the variance of $ \mathbf{Z} $ . Express your answer in terms of the parameters $ \mu_{\mathbf{X}} $ , $ \sigma_{\mathbf{X}}^{2} $ , $ \mu_{\mathbf{Y}} $ , $ \sigma_{\mathbf{Y}}^{2} $ , $ r $ , $ a $ , and $ b $ .

$ Var\left[\mathbf{Z}\right]=a^{2}\sigma_{\mathbf{X}}^{2}+2rab\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}+b^{2}\sigma_{\mathbf{Y}}^{2}. $
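The mean and variance formulas from parts (b) and (c) can be verified by simulation. The sketch below draws correlated Gaussian pairs via the standard two-dimensional Cholesky construction; all parameter values are arbitrary, hypothetical choices.

```python
import math
import random

def sample_z(mu_x, mu_y, sig_x, sig_y, r, a, b, n=200_000, seed=1):
    """Draw n samples of Z = aX + bY for jointly Gaussian (X, Y) with correlation r."""
    rng = random.Random(seed)
    zs = []
    for _ in range(n):
        u, v = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        x = mu_x + sig_x * u
        # Y is built from the same u plus independent noise v (Cholesky factorization),
        # which gives correlation coefficient r between X and Y.
        y = mu_y + sig_y * (r * u + math.sqrt(1 - r ** 2) * v)
        zs.append(a * x + b * y)
    return zs

# Hypothetical parameter values for the check.
mu_x, mu_y, sig_x, sig_y, r, a, b = 1.0, -2.0, 2.0, 3.0, 0.5, 0.7, -1.3
zs = sample_z(mu_x, mu_y, sig_x, sig_y, r, a, b)
mean_z = sum(zs) / len(zs)
var_z = sum((z - mean_z) ** 2 for z in zs) / len(zs)

mean_formula = a * mu_x + b * mu_y
var_formula = a ** 2 * sig_x ** 2 + 2 * r * a * b * sig_x * sig_y + b ** 2 * sig_y ** 2
print(mean_z, mean_formula)
print(var_z, var_formula)
```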

3. (25 Points)

Let $ \mathbf{X}\left(t\right) $ be a wide-sense stationary Gaussian random process with mean $ \mu_{\mathbf{X}} $ and autocorrelation function $ R_{\mathbf{XX}}\left(\tau\right) $. Let $ \mathbf{Y}\left(t\right)=c_{1}\mathbf{X}\left(t\right)-c_{2}\mathbf{X}\left(t-\tau\right), $ where $ c_{1} $ and $ c_{2} $ are real numbers. What is the probability that $ \mathbf{Y}\left(t\right) $ is less than or equal to a real number $ \gamma $? Express your answer in terms of the “phi function” $ \Phi\left(x\right)=\int_{-\infty}^{x}\frac{1}{\sqrt{2\pi}}e^{-z^{2}/2}dz. $

Solution

Since $ \mathbf{X}\left(t\right) $ is a Gaussian random process, any linear combination of its samples is Gaussian; hence $ \mathbf{Y}\left(t\right) $, a linear combination of $ \mathbf{X}\left(t\right) $ and $ \mathbf{X}\left(t-\tau\right) $, is a Gaussian random variable for each $ t $.

$ E\left[\mathbf{Y}\left(t\right)\right]=c_{1}E\left[\mathbf{X}\left(t\right)\right]-c_{2}E\left[\mathbf{X}\left(t-\tau\right)\right]=\left(c_{1}-c_{2}\right)\mu_{\mathbf{X}}. $

$ E\left[\mathbf{Y}^{2}\left(t\right)\right]=E\left[\left(c_{1}\mathbf{X}\left(t\right)-c_{2}\mathbf{X}\left(t-\tau\right)\right)^{2}\right] $$ =c_{1}^{2}E\left[\mathbf{X}^{2}\left(t\right)\right]-2c_{1}c_{2}E\left[\mathbf{X}\left(t\right)\mathbf{X}\left(t-\tau\right)\right]+c_{2}^{2}E\left[\mathbf{X}^{2}\left(t-\tau\right)\right] $$ =\left(c_{1}^{2}+c_{2}^{2}\right)R_{\mathbf{XX}}\left(0\right)-2c_{1}c_{2}R_{\mathbf{XX}}\left(\tau\right), $ using $ R_{\mathbf{XX}}\left(-\tau\right)=R_{\mathbf{XX}}\left(\tau\right) $ for a real WSS process.

$ Var\left[\mathbf{Y}\left(t\right)\right]=E\left[\mathbf{Y}^{2}\left(t\right)\right]-E\left[\mathbf{Y}\left(t\right)\right]^{2} $$ =\left(c_{1}^{2}+c_{2}^{2}\right)R_{\mathbf{XX}}\left(0\right)-2c_{1}c_{2}R_{\mathbf{XX}}\left(\tau\right)-\left(\left(c_{1}-c_{2}\right)\mu_{\mathbf{X}}\right)^{2} $$ =\left(c_{1}^{2}+c_{2}^{2}\right)R_{\mathbf{XX}}\left(0\right)-2c_{1}c_{2}R_{\mathbf{XX}}\left(\tau\right)-\left(c_{1}^{2}+c_{2}^{2}\right)\mu_{\mathbf{X}}^{2}+2c_{1}c_{2}\mu_{\mathbf{X}}^{2} $$ =\left(c_{1}^{2}+c_{2}^{2}\right)\left(R_{\mathbf{XX}}\left(0\right)-\mu_{\mathbf{X}}^{2}\right)+2c_{1}c_{2}\left(\mu_{\mathbf{X}}^{2}-R_{\mathbf{XX}}\left(\tau\right)\right). $

$ P\left(\left\{ \mathbf{Y}\left(t\right)\leq\gamma\right\} \right)=\Phi\left(\frac{\gamma-\left(c_{1}-c_{2}\right)\mu_{\mathbf{X}}}{\sqrt{\left(c_{1}^{2}+c_{2}^{2}\right)\left(R_{\mathbf{XX}}\left(0\right)-\mu_{\mathbf{X}}^{2}\right)+2c_{1}c_{2}\left(\mu_{\mathbf{X}}^{2}-R_{\mathbf{XX}}\left(\tau\right)\right)}}\right). $
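This answer can be checked against a simulation for a concrete process. The sketch below assumes a hypothetical example process with mean $ \mu $ and covariance $ \sigma^{2}e^{-\left|\tau\right|} $, so that $ R_{\mathbf{XX}}\left(\tau\right)=\sigma^{2}e^{-\left|\tau\right|}+\mu^{2} $; all numeric values are arbitrary choices for illustration.

```python
import math
import random

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

# Hypothetical example process: mean mu, covariance sigma^2 * exp(-|tau|),
# so R_XX(tau) = sigma^2 * exp(-|tau|) + mu^2.
mu, sigma, tau = 1.0, 2.0, 0.8
c1, c2, gamma = 1.5, 0.5, 2.0
rho = math.exp(-abs(tau))           # correlation of X(t) and X(t - tau)
R0 = sigma ** 2 + mu ** 2           # R_XX(0)
Rtau = sigma ** 2 * rho + mu ** 2   # R_XX(tau)

mean_y = (c1 - c2) * mu
var_y = (c1 ** 2 + c2 ** 2) * (R0 - mu ** 2) + 2 * c1 * c2 * (mu ** 2 - Rtau)
p_formula = phi((gamma - mean_y) / math.sqrt(var_y))

# Sample the jointly Gaussian pair (X(t), X(t - tau)) and count Y <= gamma.
rng = random.Random(2)
n, hits = 200_000, 0
for _ in range(n):
    u, v = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    x1 = mu + sigma * u
    x2 = mu + sigma * (rho * u + math.sqrt(1 - rho ** 2) * v)
    if c1 * x1 - c2 * x2 <= gamma:
        hits += 1
p_empirical = hits / n
print(p_formula, p_empirical)
```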

4. (25 Points)

Assume that the distribution of stars within a galaxy is accurately modeled by a 3-dimensional homogeneous Poisson process for which the following two facts are known to be true:

• The number of stars in a region of volume $ V $ is a Poisson random variable with mean $ \lambda V $ , where $ \lambda>0 $ .

• The numbers of stars in any two disjoint regions are statistically independent.

Assume you are located at an arbitrary position near the center of the galaxy.

(a)

Find the probability density function (pdf) of the distance to the nearest star.

Let $ \mathbf{R} $ be the distance to the nearest star.

$ F_{\mathbf{R}}\left(r\right)=P\left(\left\{ \mathbf{R}\leq r\right\} \right). $

$ i)\; r<0,\; F_{\mathbf{R}}\left(r\right)=0. $

$ ii)\; r\geq0,\; F_{\mathbf{R}}\left(r\right)=P\left(\left\{ \text{at least one star exists in the sphere of radius }r\right\} \right) $$ =1-P\left(\left\{ \text{no star exists in the sphere of radius }r\right\} \right) $$ =1-e^{-\frac{4}{3}\pi r^{3}\lambda}. $

$ \therefore f_{\mathbf{R}}\left(r\right)=\begin{cases} \begin{array}{lll} 4\pi r^{2}\lambda e^{-\frac{4}{3}\pi r^{3}\lambda} & & ,\; r\geq0\\ 0 & & ,\; r<0. \end{array}\end{cases} $
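The nearest-star cdf can be checked by directly simulating the Poisson star field in a cube around the observer. Everything below (density, cube size, trial counts) is a hypothetical choice; the cube half-width is taken large enough that the nearest star essentially always lies inside it.

```python
import math
import random

rng = random.Random(3)
lam = 1.0            # star density (stars per unit volume); hypothetical value
L = 2.0              # half-width of the simulation cube centered at the observer
V = (2 * L) ** 3

def poisson(mean):
    """Sample a Poisson random variable (Knuth's multiplication algorithm)."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def nearest_star_distance():
    """Distance from the cube's center to the nearest of a Poisson number of stars."""
    n = poisson(lam * V)
    best = float("inf")
    for _ in range(n):
        x, y, z = (rng.uniform(-L, L) for _ in range(3))
        best = min(best, math.sqrt(x * x + y * y + z * z))
    return best

r = 0.5
trials = 4000
empirical = sum(nearest_star_distance() <= r for _ in range(trials)) / trials
formula = 1 - math.exp(-4 / 3 * math.pi * lam * r ** 3)
print(empirical, formula)
```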

(b)

Find the most likely distance to the nearest star.

$ \frac{df_{\mathbf{R}}\left(r\right)}{dr}=8\pi r\lambda e^{-\frac{4}{3}\pi r^{3}\lambda}-\left(4\pi r^{2}\lambda\right)^{2}e^{-\frac{4}{3}\pi r^{3}\lambda} = 0 $
$ e^{-\frac{4}{3}\pi r^{3}\lambda}\left(8\pi r\lambda-\left(4\pi r^{2}\lambda\right)^{2}\right) = 0 $
$ 8\pi r\lambda-16\pi^{2}r^{4}\lambda^{2} = 0 $
$ 1-2\pi r^{3}\lambda = 0\quad\left(\text{dividing by }8\pi r\lambda,\; r>0\right) $

$ \therefore r=\left(\frac{1}{2\pi\lambda}\right)^{\frac{1}{3}}. $
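A quick numerical check of the mode: maximize $ f_{\mathbf{R}} $ on a fine grid and compare against the closed form. The density value below is an arbitrary, hypothetical choice.

```python
import math

lam = 0.25  # hypothetical star density

def pdf(r):
    """pdf of the distance to the nearest star, f_R(r) for r >= 0."""
    return 4 * math.pi * lam * r ** 2 * math.exp(-4 / 3 * math.pi * lam * r ** 3)

# Grid search for the maximizer on (0, 3] and compare with the closed form.
grid = [i / 100_000 for i in range(1, 300_001)]
r_numeric = max(grid, key=pdf)
r_closed = (1 / (2 * math.pi * lam)) ** (1 / 3)
print(r_numeric, r_closed)
```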


