Revision as of 08:29, 27 June 2012 by Mboutin (Talk | contribs)


7.12 QE 2006 August

1

Let $ \mathbf{U}_{n} $ be a sequence of independent, identically distributed zero-mean, unit-variance Gaussian random variables. The sequence $ \mathbf{X}_{n} $ , $ n\geq1 $ , is given by $ \mathbf{X}_{n}=\frac{1}{2}\mathbf{U}_{n}+\left(\frac{1}{2}\right)^{2}\mathbf{U}_{n-1}+\cdots+\left(\frac{1}{2}\right)^{n}\mathbf{U}_{1}. $

(a) (15 points)

Find the mean and variance of $ \mathbf{X}_{n} $ .

i) Find $ E\left[\mathbf{X}_{n}\right] $

$ \mathbf{X}_{n}=\sum_{k=0}^{n-1}\left(\frac{1}{2}\right)^{k+1}\mathbf{U}_{n-k}, $ so $ E\left[\mathbf{X}_{n}\right]=E\left[\sum_{k=0}^{n-1}\left(\frac{1}{2}\right)^{k+1}\mathbf{U}_{n-k}\right]=\sum_{k=0}^{n-1}\left(\frac{1}{2}\right)^{k+1}E\left[\mathbf{U}_{n-k}\right]=0. $

ii) Find $ E\left[\mathbf{X}_{n}^{2}\right] $

$ E\left[\mathbf{X}_{n}^{2}\right]=E\left[\left(\sum_{k=0}^{n-1}\left(\frac{1}{2}\right)^{k+1}\mathbf{U}_{n-k}\right)^{2}\right]=E\left[\sum_{k=0}^{n-1}\sum_{j=0}^{n-1}\left(\frac{1}{2}\right)^{k+1}\left(\frac{1}{2}\right)^{j+1}\mathbf{U}_{n-k}\mathbf{U}_{n-j}\right] $$ =E\left[\sum_{k=0}^{n-1}\left(\frac{1}{2}\right)^{2k+2}\mathbf{U}_{n-k}^{2}+\underset{k\neq j}{\sum_{k=0}^{n-1}\sum_{j=0}^{n-1}}\left(\frac{1}{2}\right)^{k+1}\left(\frac{1}{2}\right)^{j+1}\mathbf{U}_{n-k}\mathbf{U}_{n-j}\right] $$ =\sum_{k=0}^{n-1}\left(\frac{1}{2}\right)^{2k+2}E\left[\mathbf{U}_{n-k}^{2}\right]+\underset{k\neq j}{\sum_{k=0}^{n-1}\sum_{j=0}^{n-1}}\left(\frac{1}{2}\right)^{k+1}\left(\frac{1}{2}\right)^{j+1}E\left[\mathbf{U}_{n-k}\right]E\left[\mathbf{U}_{n-j}\right] $$ =\sum_{k=0}^{n-1}\left(\frac{1}{2}\right)^{2k+2}=\sum_{k=1}^{n}\left(\frac{1}{2}\right)^{2k}=\frac{\left(\frac{1}{2}\right)^{2}\left(1-\left(\frac{1}{2}\right)^{2n}\right)}{1-\left(\frac{1}{2}\right)^{2}}=\frac{1}{3}\left(1-\left(\frac{1}{2}\right)^{2n}\right). $

iii) Find $ Var\left[\mathbf{X}_{n}\right] $

$ Var\left[\mathbf{X}_{n}\right]=E\left[\mathbf{X}_{n}^{2}\right]-\left(E\left[\mathbf{X_{n}}\right]\right)^{2}=\frac{1}{3}\left(1-\left(\frac{1}{2}\right)^{2n}\right). $
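The closed form above can be sanity-checked numerically. The sketch below (not part of the original exam solution; the choice $ n=5 $ and the trial count are arbitrary) draws Monte Carlo samples of $ \mathbf{X}_{n} $ and compares the empirical mean and variance with $ 0 $ and $ \frac{1}{3}\left(1-\left(\frac{1}{2}\right)^{2n}\right) $.

```python
import random

# Monte Carlo sanity check of E[X_n] = 0 and Var[X_n] = (1/3)(1 - (1/2)^(2n)).
def sample_X(n, rng):
    # X_n is a weighted sum of n i.i.d. N(0,1) variables with weights
    # (1/2)^1, ..., (1/2)^n; only the weight magnitudes matter here.
    return sum((0.5 ** (k + 1)) * rng.gauss(0.0, 1.0) for k in range(n))

rng = random.Random(0)
n, trials = 5, 200_000
xs = [sample_X(n, rng) for _ in range(trials)]
mean = sum(xs) / trials
var = sum((x - mean) ** 2 for x in xs) / trials
predicted = (1.0 - 0.25 ** n) / 3.0
```

With 200,000 trials the empirical mean and variance land within about $ 10^{-2} $ of the predicted values.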

(b) (15 points)

Find the characteristic function of $ \mathbf{X}_{n} $ .

Since $ \mathbf{U}_{n} $ is a sequence of i.i.d. Gaussian random variables, $ \mathbf{X}_{n} $ is a sequence of Gaussian random variables with zero mean and variance $ \sigma_{\mathbf{X}_{n}}^{2}=\frac{1}{3}\left(1-\left(\frac{1}{2}\right)^{2n}\right) $ . Hence the characteristic function of $ \mathbf{X}_{n} $ is $ \Phi_{\mathbf{X}_{n}}\left(\omega\right)=\exp\left(i\mu_{\mathbf{X}_{n}}\omega-\frac{1}{2}\sigma_{\mathbf{X}_{n}}^{2}\omega^{2}\right)=\exp\left(-\frac{\omega^{2}}{6}\left(1-\left(\frac{1}{2}\right)^{2n}\right)\right). $
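The characteristic function can likewise be checked empirically at a single frequency. The sketch below (illustrative only; $ n=8 $ and $ \omega=1 $ are arbitrary choices) estimates $ E\left[e^{i\omega\mathbf{X}_{n}}\right] $ by Monte Carlo and compares it with $ \exp\left(-\frac{1}{2}\sigma_{\mathbf{X}_{n}}^{2}\omega^{2}\right) $.

```python
import cmath
import random

# Monte Carlo estimate of the characteristic function of X_n at omega = w.
def sample_X(n, rng):
    # weighted sum of i.i.d. N(0,1) variables with weights (1/2)^1..(1/2)^n
    return sum((0.5 ** (k + 1)) * rng.gauss(0.0, 1.0) for k in range(n))

rng = random.Random(1)
n, w, trials = 8, 1.0, 200_000
est = sum(cmath.exp(1j * w * sample_X(n, rng)) for _ in range(trials)) / trials
sigma2 = (1.0 - 0.25 ** n) / 3.0
predicted = cmath.exp(-0.5 * sigma2 * w * w)
```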

(c) (10 points)

Does the sequence $ \mathbf{X}_{n} $ converge in distribution? A simple yes or no answer is not sufficient. You must justify your answer.

$ F_{\mathbf{X}_{n}}\left(x\right)=\int_{-\infty}^{x}\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{X}_{n}}}\exp\left(-\frac{x'^{2}}{2\sigma_{\mathbf{X}_{n}}^{2}}\right)dx' $ where $ \sigma_{\mathbf{X}_{n}}^{2}=\frac{1}{3}\left(1-\left(\frac{1}{2}\right)^{2n}\right) $ .

Since $ \lim_{n\rightarrow\infty}\sigma_{\mathbf{X}_{n}}^{2}=\frac{1}{3} $ , $ \lim_{n\rightarrow\infty}F_{\mathbf{X}_{n}}\left(x\right)=\int_{-\infty}^{x}\frac{1}{\sqrt{2\pi/3}}\exp\left(-\frac{3x'^{2}}{2}\right)dx'=F_{\mathbf{X}}\left(x\right), $ the distribution function of a zero-mean Gaussian random variable with variance $ \frac{1}{3} $ .

$ \therefore $ The sequence $ \mathbf{X}_{n} $ converges in distribution.

2

Let $ \Phi $ be the standard normal distribution, i.e., the distribution function of a zero-mean, unit-variance Gaussian random variable. Let $ \mathbf{X} $ be a normal random variable with mean $ \mu $ and variance 1 . We want to find $ E\left[\Phi\left(\mathbf{X}\right)\right] $ .

(a) (10 points)

First show that $ E\left[\Phi\left(\mathbf{X}\right)\right]=P\left(\mathbf{Z}\leq\mathbf{X}\right) $ , where $ \mathbf{Z} $ is a standard normal random variable independent of $ \mathbf{X} $ . Hint: Use an intermediate random variable $ \mathbf{I} $ defined as

$ \mathbf{I}=\left\{ \begin{array}{lll} 1 & & \text{if }\mathbf{Z}\leq\mathbf{X}\\ 0 & & \text{if }\mathbf{Z}>\mathbf{X}. \end{array}\right. $

$ P\left(\mathbf{Z}\leq\mathbf{X}\right)=\int_{-\infty}^{\infty}P\left(\mathbf{Z}\leq x|\mathbf{X}=x\right)\cdot f_{\mathbf{X}}\left(x\right)dx=\int_{-\infty}^{\infty}\Phi\left(x\right)\cdot f_{\mathbf{X}}\left(x\right)dx=E\left[\Phi\left(\mathbf{X}\right)\right]. $
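The identity $ E\left[\Phi\left(\mathbf{X}\right)\right]=P\left(\mathbf{Z}\leq\mathbf{X}\right) $ can be checked by simulation. The sketch below (illustrative only; $ \mu=0.7 $ is an arbitrary choice) estimates both sides from independent draws of $ \mathbf{X}\sim N\left(\mu,1\right) $ and $ \mathbf{Z}\sim N\left(0,1\right) $.

```python
import math
import random

# Monte Carlo check that E[Phi(X)] equals P(Z <= X).
def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

rng = random.Random(2)
mu, trials = 0.7, 200_000
lhs = rhs = 0.0
for _ in range(trials):
    x = rng.gauss(mu, 1.0)          # X ~ N(mu, 1)
    z = rng.gauss(0.0, 1.0)         # Z ~ N(0, 1), independent of X
    lhs += Phi(x)                   # accumulates E[Phi(X)]
    rhs += 1.0 if z <= x else 0.0   # accumulates P(Z <= X)
lhs /= trials
rhs /= trials
```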

(b) (10 points)

Now use the result from Part (a) to show that $ E\left[\Phi\left(\mathbf{X}\right)\right]=\Phi\left(\frac{\mu}{\sqrt{2}}\right) $ .

Let $ \mathbf{Y}=\mathbf{Z}-\mathbf{X} $ . Since $ \mathbf{Z} $ and $ \mathbf{X} $ are Gaussian random variables, $ \mathbf{Y} $ is also a Gaussian random variable.

$ E\left[\mathbf{Y}\right]=E\left[\mathbf{Z}\right]-E\left[\mathbf{X}\right]=-\mu. $

$ Var\left[\mathbf{Y}\right]=E\left[\left(\mathbf{Y}-E\left[\mathbf{Y}\right]\right)^{2}\right]=E\left[\left(\mathbf{Z}-\left(\mathbf{X}-\mu\right)\right)^{2}\right]=E\left[\mathbf{Z}^{2}\right]-2E\left[\mathbf{Z}\right]E\left[\mathbf{X}-\mu\right]+E\left[\left(\mathbf{X}-\mu\right)^{2}\right] $$ =Var\left[\mathbf{Z}\right]+Var\left[\mathbf{X}\right]=2, $ where the cross term factors because $ \mathbf{Z} $ and $ \mathbf{X} $ are independent and vanishes since $ E\left[\mathbf{Z}\right]=E\left[\mathbf{X}-\mu\right]=0 $ .

$ E\left[\Phi\left(\mathbf{X}\right)\right]=P\left(\left\{ \mathbf{Z}\leq\mathbf{X}\right\} \right)=P\left(\left\{ \mathbf{Y}\leq0\right\} \right)=\Phi\left(\frac{0-\left(-\mu\right)}{\sqrt{2}}\right)=\Phi\left(\frac{\mu}{\sqrt{2}}\right). $
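The closed form $ E\left[\Phi\left(\mathbf{X}\right)\right]=\Phi\left(\frac{\mu}{\sqrt{2}}\right) $ is easy to verify numerically. The sketch below (illustrative only; $ \mu=1.2 $ is arbitrary) compares a Monte Carlo estimate of $ E\left[\Phi\left(\mathbf{X}\right)\right] $ against the closed form.

```python
import math
import random

# Monte Carlo check of the closed form E[Phi(X)] = Phi(mu / sqrt(2)).
def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

rng = random.Random(3)
mu, trials = 1.2, 200_000
est = sum(Phi(rng.gauss(mu, 1.0)) for _ in range(trials)) / trials
closed_form = Phi(mu / math.sqrt(2.0))
```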

3 (15 points)

Let $ \mathbf{Y}(t) $ be the output of a linear system with impulse response $ h\left(t\right) $ and input $ \mathbf{X}\left(t\right)+\mathbf{N}\left(t\right) $ , where $ \mathbf{X}\left(t\right) $ and $ \mathbf{N}\left(t\right) $ are jointly wide-sense stationary independent random processes. If $ \mathbf{Z}\left(t\right)=\mathbf{X}\left(t\right)-\mathbf{Y}\left(t\right) $ , find the power spectral density $ S_{\mathbf{Z}}\left(\omega\right) $ in terms of $ S_{\mathbf{X}}\left(\omega\right) , S_{\mathbf{N}}\left(\omega\right) , m_{\mathbf{X}}=E\left[\mathbf{X}\right] $ , and $ m_{\mathbf{Y}}=E\left[\mathbf{Y}\right] $ .

Solution

Let $ \mathbf{M}\left(t\right)=\mathbf{X}\left(t\right)+\mathbf{N}\left(t\right) $ . Since $ \mathbf{X}\left(t\right) $ and $ \mathbf{N}\left(t\right) $ are jointly wide-sense stationary, $ \mathbf{M}\left(t\right) $ is also a wide-sense stationary random process.

$ \mathbf{Y}\left(t\right)=\mathbf{M}\left(t\right)*h\left(t\right). $

$ R_{\mathbf{Y}}\left(\tau\right)=\left(R_{\mathbf{M}}*h*\tilde{h}\right)\left(\tau\right)\text{ where }\tilde{h}\left(t\right)=h\left(-t\right). $

$ R_{\mathbf{M}}\left(\tau\right)=E\left[\mathbf{M}\left(t\right)\mathbf{M}\left(t+\tau\right)\right] $$ =E\left[\mathbf{X}\left(t\right)\mathbf{X}\left(t+\tau\right)\right]+E\left[\mathbf{X}\left(t\right)\right]E\left[\mathbf{N}\left(t+\tau\right)\right]+E\left[\mathbf{X}\left(t+\tau\right)\right]E\left[\mathbf{N}\left(t\right)\right]+E\left[\mathbf{N}\left(t\right)\mathbf{N}\left(t+\tau\right)\right] $$ =R_{\mathbf{X}}\left(\tau\right)+2m_{\mathbf{X}}m_{\mathbf{N}}+R_{\mathbf{N}}\left(\tau\right) $

$ R_{\mathbf{XY}}\left(\tau\right)=E\left[\mathbf{X}\left(t\right)\mathbf{Y}\left(t+\tau\right)\right] $$ =E\left[\mathbf{X}\left(t\right)\int_{-\infty}^{\infty}\left(\mathbf{X}\left(t+\tau-\alpha\right)+\mathbf{N}\left(t+\tau-\alpha\right)\right)h\left(\alpha\right)d\alpha\right] $$ =\int_{-\infty}^{\infty}\left(R_{\mathbf{X}}\left(\tau-\alpha\right)+E\left[\mathbf{X}\left(t\right)\right]E\left[\mathbf{N}\left(t+\tau-\alpha\right)\right]\right)h\left(\alpha\right)d\alpha $$ =R_{\mathbf{X}}\left(\tau\right)*h\left(\tau\right)+m_{\mathbf{X}}m_{\mathbf{N}}*h\left(\tau\right). $

$ R_{\mathbf{Z}}\left(\tau\right)=E\left[\mathbf{Z}\left(t\right)\mathbf{Z}\left(t+\tau\right)\right]=E\left[\left(\mathbf{X}\left(t\right)-\mathbf{Y}\left(t\right)\right)\left(\mathbf{X}\left(t+\tau\right)-\mathbf{Y}\left(t+\tau\right)\right)\right] $$ =R_{\mathbf{X}}\left(\tau\right)-R_{\mathbf{YX}}\left(\tau\right)-R_{\mathbf{XY}}\left(\tau\right)+R_{\mathbf{YY}}\left(\tau\right). $

$ S_{\mathbf{Z}}\left(\omega\right)=S_{\mathbf{X}}\left(\omega\right)-S_{\mathbf{YX}}\left(\omega\right)-S_{\mathbf{XY}}\left(\omega\right)+S_{\mathbf{Y}}\left(\omega\right)=S_{\mathbf{X}}\left(\omega\right)-S_{\mathbf{XY}}^{*}\left(\omega\right)-S_{\mathbf{XY}}\left(\omega\right)+S_{\mathbf{Y}}\left(\omega\right) $$ =S_{\mathbf{X}}\left(\omega\right)-2\Re\left\{ S_{\mathbf{XY}}\left(\omega\right)\right\} +S_{\mathbf{M}}\left(\omega\right)\Bigl|H\left(\omega\right)\Bigr|^{2} $$ =S_{\mathbf{X}}\left(\omega\right)-2\Re\left\{ S_{\mathbf{X}}\left(\omega\right)H\left(\omega\right)+2\pi m_{\mathbf{X}}m_{\mathbf{N}}\delta\left(\omega\right)H\left(\omega\right)\right\} +\left\{ S_{\mathbf{X}}\left(\omega\right)+2\pi m_{\mathbf{X}}m_{\mathbf{N}}\delta\left(\omega\right)+S_{\mathbf{N}}\left(\omega\right)\right\} \Bigl|H\left(\omega\right)\Bigr|^{2} $$ =S_{\mathbf{X}}\left(\omega\right)-2\Re\left\{ S_{\mathbf{X}}\left(\omega\right)H\left(\omega\right)+2\pi m_{\mathbf{X}}\left(m_{\mathbf{Y}}-m_{\mathbf{X}}H\left(0\right)\right)\delta\left(\omega\right)\right\} + $$ \left\{ S_{\mathbf{X}}\left(\omega\right)+S_{\mathbf{N}}\left(\omega\right)\right\} \Bigl|H\left(\omega\right)\Bigr|^{2}+2\pi m_{\mathbf{X}}\left(m_{\mathbf{Y}}-m_{\mathbf{X}}H\left(0\right)\right)H\left(0\right)\delta\left(\omega\right). $

$ \because m_{\mathbf{Y}}=m_{\mathbf{M}}\int_{-\infty}^{\infty}h\left(t\right)dt=\left(m_{\mathbf{X}}+m_{\mathbf{N}}\right)H\left(0\right)\Rightarrow m_{\mathbf{N}}H\left(0\right)=m_{\mathbf{Y}}-m_{\mathbf{X}}H\left(0\right). $
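Although the problem is posed in continuous time, the mean identity $ m_{\mathbf{Y}}=\left(m_{\mathbf{X}}+m_{\mathbf{N}}\right)H\left(0\right) $ carries over to a discrete-time analogue, which makes it easy to test. The sketch below is a stand-in under stated assumptions: $ h $ is a hypothetical FIR impulse response, and $ \mathbf{X} $ and $ \mathbf{N} $ are modeled as i.i.d. Gaussian sequences around their means.

```python
import random

# Discrete-time sanity check of m_Y = (m_X + m_N) * H(0).
# h is a made-up FIR impulse response; H(0) is its DC gain, sum(h).
h = [0.5, 0.3, 0.2, -0.1]
H0 = sum(h)

rng = random.Random(4)
m_X, m_N, samples = 2.0, -0.5, 100_000
# M(t) = X(t) + N(t): independent WSS inputs, modeled as i.i.d. noise
# around the means m_X and m_N
m_vals = [m_X + rng.gauss(0.0, 1.0) + m_N + rng.gauss(0.0, 1.0)
          for _ in range(samples)]
# steady-state output y = m * h (FIR convolution over the valid range)
y = [sum(h[k] * m_vals[t - k] for k in range(len(h)))
     for t in range(len(h) - 1, samples)]
y_mean = sum(y) / len(y)
predicted = (m_X + m_N) * H0
```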

4

Suppose customer orders arrive according to an i.i.d. Bernoulli random process $ \mathbf{X}_{n} $ with parameter $ p $ . Thus, an order arrives at time index $ n $ (i.e., $ \mathbf{X}_{n}=1 $ ) with probability $ p $ ; if an order does not arrive at time index $ n $ , then $ \mathbf{X}_{n}=0 $ . When an order arrives, its size is an exponential random variable with parameter $ \lambda $ . Let $ \mathbf{S}_{n} $ be the total size of all orders up to time $ n $ .

(a) (20 points)

Find the mean and autocorrelation function of $ \mathbf{S}_{n} $ .

Let $ \mathbf{Y}_{n} $ be the size of an order at time index $ n $ , then $ \mathbf{Y}_{n} $ is a sequence of i.i.d. exponential random variables.

$ \mathbf{S}_{n}=\sum_{k=1}^{n}\mathbf{X}_{k}\mathbf{Y}_{k}. $

$ E\left[\mathbf{S}_{n}\right]=\sum_{k=1}^{n}E\left[\mathbf{X}_{k}\right]E\left[\mathbf{Y}_{k}\right]=\sum_{k=1}^{n}p\cdot\frac{1}{\lambda}=\frac{np}{\lambda}. $

$ R_{\mathbf{S}}\left(n,m\right)=E\left[\mathbf{S}_{n}\mathbf{S}_{m}\right]=\sum_{k=1}^{n}\sum_{l=1}^{m}E\left[\mathbf{X}_{k}\mathbf{X}_{l}\mathbf{Y}_{k}\mathbf{Y}_{l}\right]. $ For $ k\neq l $ , independence gives $ E\left[\mathbf{X}_{k}\right]E\left[\mathbf{X}_{l}\right]E\left[\mathbf{Y}_{k}\right]E\left[\mathbf{Y}_{l}\right]=\frac{p^{2}}{\lambda^{2}} $ . For the $ \min\left(n,m\right) $ diagonal terms with $ k=l $ , note that $ \mathbf{X}_{k}^{2}=\mathbf{X}_{k} $ and $ E\left[\mathbf{Y}_{k}^{2}\right]=\frac{2}{\lambda^{2}} $ , so $ E\left[\mathbf{X}_{k}^{2}\right]E\left[\mathbf{Y}_{k}^{2}\right]=\frac{2p}{\lambda^{2}} $ . Hence

$ R_{\mathbf{S}}\left(n,m\right)=\min\left(n,m\right)\frac{2p}{\lambda^{2}}+\left(nm-\min\left(n,m\right)\right)\frac{p^{2}}{\lambda^{2}}. $
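Both moments can be checked by simulation. The sketch below (illustrative only; $ p=0.5 $ , $ \lambda=2 $ , $ n=3 $ , $ m=5 $ are arbitrary choices) estimates $ E\left[\mathbf{S}_{n}\right] $ and $ E\left[\mathbf{S}_{n}\mathbf{S}_{m}\right] $ and compares them with $ \frac{np}{\lambda} $ and with the autocorrelation formula in which the $ k=l $ diagonal terms contribute $ E\left[\mathbf{X}_{k}^{2}\mathbf{Y}_{k}^{2}\right]=\frac{2p}{\lambda^{2}} $ rather than $ \frac{p^{2}}{\lambda^{2}} $ .

```python
import random

# Monte Carlo check of E[S_n] = np/lam and of
# R_S(n, m) = min(n,m) * 2p/lam^2 + (nm - min(n,m)) * p^2/lam^2.
rng = random.Random(5)
p, lam, n, m, trials = 0.5, 2.0, 3, 5, 200_000
mean_acc = corr_acc = 0.0
for _ in range(trials):
    # step k contributes an Exp(lam) order size only when an order arrives
    steps = [(rng.expovariate(lam) if rng.random() < p else 0.0)
             for _ in range(m)]
    s_n = sum(steps[:n])
    s_m = sum(steps)
    mean_acc += s_n
    corr_acc += s_n * s_m
mean_est = mean_acc / trials
corr_est = corr_acc / trials
mean_pred = n * p / lam
corr_pred = (2 * p / lam ** 2) * min(n, m) \
    + (p ** 2 / lam ** 2) * (n * m - min(n, m))
```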

(b) (5 points)

Is $ \mathbf{S}_{n} $ a stationary random process? Explain.

• Approach 1: $ \mathbf{S}_{n} $ is not a stationary random process since $ R_{\mathbf{S}}\left(n,m\right) $ does not depend only on the difference $ m-n $ .

• Approach 2: $ \mathbf{S}_{n} $ is not a stationary random process since $ E\left[\mathbf{S}_{n}\right] $ is not constant.

