6.3 MRB 2004 Spring Final

2. II

Identical twins come from the same egg and hence are of the same sex. Fraternal twins have a probability $ \frac{1}{2} $ of being of the same sex. Among twins, the probability of a fraternal set is p and of an identical set is $ q=1-p $ . Given that a set of twins selected at random is of the same sex, what is the probability that they are fraternal?

A. $ \frac{q}{p} $ B. $ \frac{p}{1+q} $ C. $ 1 $ D. $ p $ E. $ \frac{2q}{1+p} $

Solution

• Note that a set of twins is either fraternal or identical, so the two events are complementary.

• We can define events

– F : Fraternal twins

– I : Identical twins

– S : Twins are same sex

• We know that

$ P\left(F\right)=p $

$ P\left(I\right)=q=1-p $

$ P\left(S|F\right)=\frac{1}{2} $

$ P\left(S|I\right)=1 $

• Now, by using Bayes' theorem,

$ P\left(F|S\right)=\frac{P\left(F\cap S\right)}{P\left(S\right)}=\frac{P\left(S|F\right)P\left(F\right)}{P\left(S|F\right)P\left(F\right)+P\left(S|I\right)P\left(I\right)}=\frac{\frac{1}{2}\cdot p}{\frac{1}{2}\cdot p+1\cdot q}=\frac{p}{p+2q}=\frac{p}{\left(1-q\right)+2q}=\frac{p}{1+q}. $
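As a sanity check, the answer can be verified by Monte Carlo simulation. The sketch below assumes an arbitrary illustrative value $ p=0.6 $; the empirical conditional frequency should approach $ \frac{p}{1+q} $.

```python
import random

def fraternal_given_same_sex(p, trials=200_000, seed=0):
    """Estimate P(F | S) by simulating random sets of twins."""
    rng = random.Random(seed)
    same_sex = fraternal_and_same_sex = 0
    for _ in range(trials):
        fraternal = rng.random() < p              # fraternal with probability p
        # identical twins are always same-sex; fraternal are same-sex w.p. 1/2
        same = True if not fraternal else rng.random() < 0.5
        if same:
            same_sex += 1
            fraternal_and_same_sex += fraternal
    return fraternal_and_same_sex / same_sex

p = 0.6
estimate = fraternal_given_same_sex(p)
exact = p / (1 + (1 - p))                         # p / (1 + q)
print(estimate, exact)
```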

2. III

Andy writes a letter to Betty but does not receive a letter in reply. Assuming that one letter in n is lost in the mail, what is the probability that Betty did not receive Andy's letter? (Assuming that Betty would have answered Andy's letter had she received it.)

A. $ \frac{n}{n-1} $ B. $ \frac{n-1}{n^{2}} $ C. $ \frac{n-1}{2n-1} $ D. $ \frac{n}{2n-1} $ E. $ \frac{n-1}{n^{2}} $

Solution

(Diagram: Andy's letter travels to Betty and Betty's reply travels back, each leg arriving with probability $ \frac{n-1}{n} $.)


• We can define events

– A : Andy receives a letter

– $ \bar{A} $ : Andy does not receive a letter

– B : Betty receives a letter

– $ \bar{B} $ : Betty does not receive a letter.

• We know that

$ P\left(\textrm{lost}\right)=\frac{1}{n} $

$ P\left(B\right)=1-\frac{1}{n} $

$ P\left(\bar{B}\right)=\frac{1}{n} $

$ P\left(\bar{A}|\bar{B}\right)=1 $

$ P\left(\bar{A}|B\right)=\frac{1}{n} $

• Now, by using Bayes' theorem,

$ P\left(\bar{B}|\bar{A}\right)=\frac{P\left(\bar{A}\cap\bar{B}\right)}{P\left(\bar{A}\right)}=\frac{P\left(\bar{A}|\bar{B}\right)P\left(\bar{B}\right)}{P\left(\bar{A}|\bar{B}\right)P\left(\bar{B}\right)+P\left(\bar{A}|B\right)P\left(B\right)}=\frac{1\cdot\frac{1}{n}}{1\cdot\frac{1}{n}+\frac{1}{n}\cdot\left(1-\frac{1}{n}\right)}=\frac{1}{1+1-\frac{1}{n}}=\frac{n}{2n-1}. $
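This conditional probability can also be checked by simulation. The sketch below uses an arbitrary illustrative value $ n=10 $, so the estimate should approach $ \frac{10}{19}\approx0.526 $.

```python
import random

def betty_missed_given_no_reply(n, trials=300_000, seed=0):
    """Estimate P(B-bar | A-bar): Betty never received the letter,
    given that Andy received no reply.  Each leg of the round trip
    loses a letter with probability 1/n."""
    rng = random.Random(seed)
    no_reply = betty_missed_and_no_reply = 0
    for _ in range(trials):
        betty_got_it = rng.random() < (n - 1) / n
        # Betty replies only if she received the letter
        andy_got_reply = betty_got_it and rng.random() < (n - 1) / n
        if not andy_got_reply:
            no_reply += 1
            betty_missed_and_no_reply += (not betty_got_it)
    return betty_missed_and_no_reply / no_reply

n = 10
print(betty_missed_given_no_reply(n), n / (2 * n - 1))
```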

2. V

The number of hits $ \mathbf{X} $ in a baseball game is a Poisson random variable. If the probability of a no-hit game is 1/3 , what is the probability of having two or more hits in a game?

Solution

$ P\left(\mathbf{X}=k\right)=\frac{e^{-\lambda}\lambda^{k}}{k!}. $

$ P\left(\mathbf{X}=0\right)=e^{-\lambda}=\frac{1}{3}\Longrightarrow-\lambda=-\ln3\Longrightarrow\therefore\lambda=\ln3. $

$ P\left(\mathbf{X}\geq2\right)=1-P\left(\mathbf{X}=1\right)-P\left(\mathbf{X}=0\right)=1-\frac{1}{3}\ln3-\frac{1}{3}=\frac{2}{3}-\frac{\ln3}{3}=\frac{2-\ln3}{3}. $
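A quick numeric check of the arithmetic, using only the standard library:

```python
import math

lam = math.log(3)                      # from P(X = 0) = e^{-lam} = 1/3
p0 = math.exp(-lam)                    # = 1/3
p1 = lam * math.exp(-lam)              # = ln(3)/3
answer = 1 - p0 - p1
print(answer, (2 - math.log(3)) / 3)   # both ≈ 0.3005
```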

2. VI

Given the characteristic function $ \Phi_{\mathbf{X}}\left(\omega\right)=\frac{1}{\left(1-ia\omega\right)^{k}} $, find $ E\left[\mathbf{X}\right] $.

Solution

Writing $ s=i\omega $, so that $ \phi\left(s\right)=\Phi_{\mathbf{X}}\left(\omega\right)=\left(1-as\right)^{-k} $,

$ E\left[\mathbf{X}\right]=\frac{d}{ds}\phi\left(s\right)\biggl|_{s=0}=\frac{d}{ds}\left(1-as\right)^{-k}\biggl|_{s=0}=-k\left(1-as\right)^{-\left(k+1\right)}\left(-a\right)\biggl|_{s=0}=ak. $
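Since $ \left(1-ia\omega\right)^{-k} $ is the characteristic function of a Gamma random variable with shape $ k $ and scale $ a $ (Erlang for integer $ k $), the result $ E\left[\mathbf{X}\right]=ak $ can be checked by sampling; the values $ k=3 $, $ a=2 $ below are arbitrary illustrative choices.

```python
import random

# (1 - i*a*w)^(-k) is the characteristic function of Gamma(shape=k, scale=a),
# so the sample mean should be close to a*k.  k = 3, a = 2 for illustration.
k, a = 3, 2.0
rng = random.Random(0)
samples = [rng.gammavariate(k, a) for _ in range(200_000)]
mean = sum(samples) / len(samples)
print(mean, a * k)   # sample mean should be close to 6
```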

4

A biased quarter having a probability p of coming up “heads” is tossed n times. Each time the quarter comes up heads, a biased nickel having probability r of coming up “heads” is tossed. Let $ \mathbf{M} $ be the random variable giving the number of times the biased nickel comes up “heads” in this experiment.

(a) Find the probability mass function (pmf) of $ \mathbf{M} $ .

$ \mathbf{Q} $ : the random variable giving the number of times the biased quarter comes up heads.

$ P\left(\left\{ \mathbf{Q}=k\right\} \right)=\left(\begin{array}{c} n\\ k \end{array}\right)p^{k}\left(1-p\right)^{n-k}. $

$ P\left(\left\{ \mathbf{M}=m\right\} |\left\{ \mathbf{Q}=k\right\} \right)=\left(\begin{array}{c} k\\ m \end{array}\right)r^{m}\left(1-r\right)^{k-m}. $

$ E\left[e^{i\omega\mathbf{M}}|\left\{ \mathbf{Q}=k\right\} \right]=\sum_{m=0}^{k}\left(\begin{array}{c} k\\ m \end{array}\right)r^{m}\left(1-r\right)^{k-m}\cdot e^{i\omega m}=\sum_{m=0}^{k}\left(\begin{array}{c} k\\ m \end{array}\right)\left(r\cdot e^{i\omega}\right)^{m}\left(1-r\right)^{k-m}=\left(r\cdot e^{i\omega}+1-r\right)^{k}, $ by the binomial theorem.

$ \Phi_{\mathbf{M}}\left(\omega\right)=\sum_{k=0}^{n}E\left[e^{i\omega\mathbf{M}}|\left\{ \mathbf{Q}=k\right\} \right]\cdot P\left(\left\{ \mathbf{Q}=k\right\} \right)=\sum_{k=0}^{n}\left(r\cdot e^{i\omega}+1-r\right)^{k}\cdot\left(\begin{array}{c} n\\ k \end{array}\right)p^{k}\left(1-p\right)^{n-k} $$ =\sum_{k=0}^{n}\left(\begin{array}{c} n\\ k \end{array}\right)\left(p\left(r\cdot e^{i\omega}+1-r\right)\right)^{k}\left(1-p\right)^{n-k}=\left(p\cdot r\cdot e^{i\omega}+p-p\cdot r+1-p\right)^{n}=\left(p\cdot r\cdot e^{i\omega}+1-p\cdot r\right)^{n}. $

This is the characteristic function of a Binomial random variable with parameters n and pr.

$ \therefore P\left(\left\{ \mathbf{M}=m\right\} \right)=\left(\begin{array}{c} n\\ m \end{array}\right)\left(pr\right)^{m}\left(1-pr\right)^{n-m}. $
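The closed form can be checked by simulating the experiment directly. The sketch below uses arbitrary illustrative values $ n=20 $, $ p=0.6 $, $ r=0.5 $, so $ \mathbf{M} $ should behave like a Binomial with parameters $ 20 $ and $ 0.3 $; the same run also confirms the mean and variance found in parts (b) and (c).

```python
import math
import random

def simulate_M(n, p, r, trials=200_000, seed=0):
    """Simulate the two-coin experiment; return the list of observed M values."""
    rng = random.Random(seed)
    results = []
    for _ in range(trials):
        heads_q = sum(rng.random() < p for _ in range(n))   # quarter heads
        m = sum(rng.random() < r for _ in range(heads_q))   # nickel heads
        results.append(m)
    return results

n, p, r = 20, 0.6, 0.5       # illustrative values; M should be Binomial(n, pr)
ms = simulate_M(n, p, r)
for m in (0, 6, 12):
    empirical = ms.count(m) / len(ms)
    exact = math.comb(n, m) * (p * r) ** m * (1 - p * r) ** (n - m)
    print(m, empirical, exact)
```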

(b) Find the mean of $ \mathbf{M} $ .

$ E\left[\mathbf{M}\right]=\frac{d}{d\left(i\omega\right)}\Phi_{\mathbf{M}}\left(\omega\right)\biggl|_{\omega=0}=n\left(pr\cdot e^{i\omega}+1-pr\right)^{n-1}\cdot pr\cdot e^{i\omega}\biggl|_{\omega=0}=npr. $

In fact, we can directly conclude $ E\left[\mathbf{M}\right]=npr $ because we already know that $ \mathbf{M} $ is a Binomial random variable with parameters n and pr.

(c) Find the variance of $ \mathbf{M} $ .

$ E\left[\mathbf{M}^{2}\right]=n\left(n-1\right)\left(pr\right)^{2}+npr. $

$ Var\left[\mathbf{M}\right]=E\left[\mathbf{M}^{2}\right]-\left(E\left[\mathbf{M}\right]\right)^{2}=\left(n\left(n-1\right)\left(pr\right)^{2}+npr\right)-\left(npr\right)^{2}=\left(npr\right)^{2}-n\left(pr\right)^{2}+npr-\left(npr\right)^{2}=npr\left(1-pr\right). $

In fact, we can directly conclude $ Var\left[\mathbf{M}\right]=npr\left(1-pr\right) $ because we already know that $ \mathbf{M} $ is a Binomial random variable with parameters n and pr.

Example (MRB 2004 Spring Final)

Assume that $ \mathbf{X}\left(t\right) $ is a zero-mean, continuous-time, Gaussian white noise process with autocorrelation function $ R_{\mathbf{XX}}\left(t_{1},t_{2}\right)=N_{0}\delta\left(t_{1}-t_{2}\right) $. Let $ \mathbf{Y}\left(t\right) $ be a new random process defined as the output of a linear time-invariant system with impulse response $ h\left(t\right)=1_{\left[0,T\right]}\left(t\right), $ where $ T>0 $ .

(a) What is the mean of $ \mathbf{Y}\left(t\right) $ ?

$ E\left[\mathbf{Y}\left(t\right)\right]=E\left[\int_{-\infty}^{\infty}h\left(\tau\right)\mathbf{X}\left(t-\tau\right)d\tau\right]=\int_{-\infty}^{\infty}h\left(\tau\right)E\left[\mathbf{X}\left(t-\tau\right)\right]d\tau=\int_{-\infty}^{\infty}h\left(\tau\right)\cdot0d\tau=0. $

(b) What is the power spectral density of $ \mathbf{Y}\left(t\right) $ ?

$ S_{\mathbf{XX}}\left(\omega\right)=\int_{-\infty}^{\infty}N_{0}\delta\left(\tau\right)e^{-i\omega\tau}d\tau=N_{0}. $

$ H\left(\omega\right)=\int_{-\infty}^{\infty}h\left(t\right)e^{-i\omega t}dt=\int_{0}^{T}e^{-i\omega t}dt=\frac{e^{-i\omega t}}{-i\omega}\biggl|_{0}^{T}=\frac{e^{-i\omega T}-1}{-i\omega}=\frac{1-e^{-i\omega T}}{i\omega}. $

$ \left|H\left(\omega\right)\right|^{2}=H\left(\omega\right)H^{*}\left(\omega\right)=\frac{1-e^{-i\omega T}}{i\omega}\cdot\frac{1-e^{i\omega T}}{-i\omega}=\frac{2-e^{-i\omega T}-e^{i\omega T}}{\omega^{2}}=\frac{2\left(1-\cos\omega T\right)}{\omega^{2}} $$ =\frac{2\left(1-\left(1-2\sin^{2}\frac{\omega T}{2}\right)\right)}{\omega^{2}}=\frac{4}{\omega^{2}}\cdot\sin^{2}\frac{\omega T}{2}. $

$ \because e^{-i\omega T}+e^{i\omega T}=\left(\cos\omega T-i\sin\omega T\right)+\left(\cos\omega T+i\sin\omega T\right)=2\cos\omega T. $

$ \because\cos\left(2x\right)=\cos^{2}\left(x\right)-\sin^{2}\left(x\right)=2\cos^{2}\left(x\right)-1=1-2\sin^{2}\left(x\right). $

$ S_{\mathbf{YY}}\left(\omega\right)=S_{\mathbf{XX}}\left(\omega\right)\left|H\left(\omega\right)\right|^{2}=N_{0}\cdot\frac{4}{\omega^{2}}\cdot\sin^{2}\frac{\omega T}{2}. $
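The closed form for $ \left|H\left(\omega\right)\right|^{2} $ can be checked numerically by approximating the Fourier integral of $ h\left(t\right)=1_{\left[0,T\right]}\left(t\right) $ with a Riemann sum ($ T=2 $ is an arbitrary illustrative value):

```python
import cmath
import math

def H(w, T, steps=20_000):
    """Midpoint Riemann-sum approximation of H(w) = integral_0^T e^{-iwt} dt."""
    dt = T / steps
    return sum(cmath.exp(-1j * w * (k + 0.5) * dt) for k in range(steps)) * dt

T = 2.0
for w in (0.5, 1.0, 3.0):
    closed_form = 4.0 / w ** 2 * math.sin(w * T / 2) ** 2
    print(w, abs(H(w, T)) ** 2, closed_form)
```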

(c) What is the autocorrelation function of $ \mathbf{Y}\left(t\right) $ ?

$ S_{\mathbf{YY}}\left(\omega\right)=N_{0}\cdot\frac{4}{\omega^{2}}\cdot\sin^{2}\frac{\omega T}{2}=N_{0}T\cdot\frac{4\sin^{2}\left(\omega T/2\right)}{T\omega^{2}}\leftrightarrow N_{0}T\cdot\left(1-\frac{\left|\tau\right|}{T}\right)=R_{\mathbf{YY}}\left(\tau\right)\textrm{ for }\left|\tau\right|<T. $

$ \because\frac{4\sin^{2}\left(\omega T/2\right)}{T\omega^{2}}\leftrightarrow\begin{cases} 1-\frac{\left|\tau\right|}{T}, & \left|\tau\right|<T\\ 0, & \left|\tau\right|\geq T \end{cases}\quad\text{(from the transform table given).} $
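The triangular autocorrelation also follows directly in the time domain: for white-noise input with $ R_{\mathbf{XX}}\left(\tau\right)=N_{0}\delta\left(\tau\right) $, $ R_{\mathbf{YY}}\left(\tau\right)=N_{0}\int h\left(t\right)h\left(t+\tau\right)dt $, which for $ h=1_{\left[0,T\right]} $ is $ N_{0} $ times the overlap of two length-$ T $ windows, i.e. $ N_{0}T\left(1-\frac{\left|\tau\right|}{T}\right) $. A numeric check (with arbitrary illustrative values $ N_{0}=1.5 $, $ T=2 $):

```python
# R_YY(tau) = N0 * integral h(t) h(t + tau) dt for h = 1_[0,T]:
# the integral is the overlap length of two length-T windows.
N0, T = 1.5, 2.0

def R_YY(tau, steps=100_000):
    dt = T / steps
    # integrate h(t) * h(t + tau) over t in [0, T]
    overlap = sum(0.0 <= (k + 0.5) * dt + tau <= T for k in range(steps)) * dt
    return N0 * overlap

for tau in (0.0, 0.5, 1.5):
    print(tau, R_YY(tau), N0 * T * (1 - abs(tau) / T))
```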

(d) Write an expression for the second-order density $ f_{\mathbf{Y}\left(t_{1}\right)\mathbf{Y}\left(t_{2}\right)}\left(y_{1},y_{2}\right) $ of $ \mathbf{Y}\left(t\right) $ .

Because $ \mathbf{X}\left(t\right) $ is a WSS Gaussian random process and the system is LTI, $ \mathbf{Y}\left(t\right) $ is also a WSS Gaussian random process, with $ E\left[\mathbf{Y}\left(t\right)\right]=0 $ and $ \sigma_{\mathbf{Y}\left(t\right)}^{2}=R_{\mathbf{YY}}\left(0\right)=N_{0}T $ .

$ r_{\mathbf{Y}\left(t_{1}\right)\mathbf{Y}\left(t_{2}\right)}=r\left(t_{1}-t_{2}\right)=\frac{C_{\mathbf{YY}}\left(t_{1}-t_{2}\right)}{\sqrt{\sigma_{\mathbf{Y}\left(t_{1}\right)}^{2}\sigma_{\mathbf{Y}\left(t_{2}\right)}^{2}}}=\frac{R_{\mathbf{YY}}\left(t_{1}-t_{2}\right)}{R_{\mathbf{YY}}\left(0\right)}=1-\frac{\left|t_{1}-t_{2}\right|}{T}\textrm{ for }\left|t_{1}-t_{2}\right|<T. $

$ f_{\mathbf{Y}\left(t_{1}\right)\mathbf{Y}\left(t_{2}\right)}\left(y_{1},y_{2}\right)=\frac{1}{2\pi\sigma_{\mathbf{Y}\left(t_{1}\right)}\sigma_{\mathbf{Y}\left(t_{2}\right)}\sqrt{1-r^{2}\left(t_{1}-t_{2}\right)}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\left(t_{1}-t_{2}\right)\right)}\left[\frac{y_{1}^{2}}{\sigma_{\mathbf{Y}\left(t_{1}\right)}^{2}}-\frac{2r\left(t_{1}-t_{2}\right)y_{1}y_{2}}{\sigma_{\mathbf{Y}\left(t_{1}\right)}\sigma_{\mathbf{Y}\left(t_{2}\right)}}+\frac{y_{2}^{2}}{\sigma_{\mathbf{Y}\left(t_{2}\right)}^{2}}\right]\right\} $$ =\frac{1}{2\pi R_{\mathbf{YY}}\left(0\right)\sqrt{1-r^{2}\left(t_{1}-t_{2}\right)}}\exp\left\{ \frac{-1}{2R_{\mathbf{YY}}\left(0\right)\left(1-r^{2}\left(t_{1}-t_{2}\right)\right)}\left[y_{1}^{2}-2r\left(t_{1}-t_{2}\right)y_{1}y_{2}+y_{2}^{2}\right]\right\} . $

$ \therefore \left(\mathbf{Y}\left(t_{1}\right),\mathbf{Y}\left(t_{2}\right)\right) $ has distribution $ N\left[0,0,N_{0}T,N_{0}T,1-\frac{\left|\tau\right|}{T}\right] $ .
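As a sanity check on the density expression, it should integrate to 1. The sketch below evaluates it on a grid with arbitrary illustrative values $ N_{0}=T=1 $ and $ \tau=T/2 $, so $ \sigma^{2}=N_{0}T=1 $ and $ r=1-\frac{\left|\tau\right|}{T}=0.5 $:

```python
import math

# Second-order density with N0 = T = 1 evaluated at tau = T/2,
# giving sigma^2 = 1 and correlation coefficient r = 0.5.
sigma2, r = 1.0, 0.5

def f(y1, y2):
    norm = 1.0 / (2 * math.pi * sigma2 * math.sqrt(1 - r * r))
    quad = (y1 * y1 - 2 * r * y1 * y2 + y2 * y2) / (2 * sigma2 * (1 - r * r))
    return norm * math.exp(-quad)

# Crude grid integration over a region wide enough to capture the mass.
step, lim = 0.05, 6.0
grid = [i * step - lim for i in range(int(2 * lim / step))]
total = sum(f(y1, y2) for y1 in grid for y2 in grid) * step * step
print(total)   # ≈ 1
```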

