7.2 QE 2001 January

1. (20 pts)

State and prove the Tchebycheff Inequality.

Answer

See the statement and proof of the Chebyshev Inequality.
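For reference, one standard statement of the inequality: for a random variable $ \mathbf{X} $ with mean $ \mu $ and finite variance $ \sigma^{2} $, and for any $ \epsilon>0 $,

$ P\left(\left|\mathbf{X}-\mu\right|\geq\epsilon\right)\leq\frac{\sigma^{2}}{\epsilon^{2}}. $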

2.

(a) (7 pts)

Let $ A $ and $ B $ be statistically independent events in the same probability space. Are $ A $ and $ B^{C} $ independent? (You must prove your result).

$ P\left(A\right)=P\left(A\cap\left(B\cup B^{C}\right)\right)=P\left(\left(A\cap B\right)\cup\left(A\cap B^{C}\right)\right)=P\left(A\cap B\right)+P\left(A\cap B^{C}\right)=P\left(A\right)P\left(B\right)+P\left(A\cap B^{C}\right). $

$ P\left(A\cap B^{C}\right)=P\left(A\right)-P\left(A\right)P\left(B\right)=P\left(A\right)\left(1-P\left(B\right)\right)=P\left(A\right)P\left(B^{C}\right). $

$ \therefore A\text{ and }B^{C}\text{ are independent. } $
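As a quick numerical check (an illustrative example, not part of the original exam): roll a fair die and let $ A=\left\{ 2,4,6\right\} $ and $ B=\left\{ 1,2,3,4\right\} $. Then $ P\left(A\cap B\right)=\frac{1}{3}=P\left(A\right)P\left(B\right) $, and indeed $ P\left(A\cap B^{C}\right)=P\left(\left\{ 6\right\} \right)=\frac{1}{6}=P\left(A\right)P\left(B^{C}\right) $.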

(b) (7 pts)

Can two events be statistically independent and mutually exclusive? (You must derive the conditions on A and B for this to be true or not.)

Yes: if $ P\left(A\right)=0 $ or $ P\left(B\right)=0 $, then A and B are both statistically independent and mutually exclusive (in the sense that $ P\left(A\cap B\right)=0 $). Proof:

Without loss of generality, suppose that $ P\left(A\right)=0 $ . $ 0=P\left(A\right)\geq P\left(A\cap B\right)\geq0\Longrightarrow P\left(A\cap B\right)=0\qquad\therefore\text{mutually exclusive}. $

$ P\left(A\cap B\right)=0=P\left(A\right)P\left(B\right)\qquad\therefore\text{statistically independent.} $
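Conversely, if $ P\left(A\right)>0 $ and $ P\left(B\right)>0 $, the two properties cannot hold simultaneously: mutual exclusivity gives $ P\left(A\cap B\right)=0 $, while independence would require $ P\left(A\cap B\right)=P\left(A\right)P\left(B\right)>0 $.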

(c) (6 pts)

State the Axioms of Probability.

Answer

See the Axioms of Probability.
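For reference, the standard Kolmogorov axioms, for a sample space $ \mathcal{S} $ with probability measure $ P $:

$ \textbf{1. }P\left(A\right)\geq0\text{ for every event }A;\qquad\textbf{2. }P\left(\mathcal{S}\right)=1;\qquad\textbf{3. }P\left(\bigcup_{i=1}^{\infty}A_{i}\right)=\sum_{i=1}^{\infty}P\left(A_{i}\right)\text{ for mutually exclusive events }A_{1},A_{2},\ldots $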

3. (20 pts)

Let $ \mathbf{X}_{1},\mathbf{X}_{2},\cdots $ be a sequence of random variables that converge in mean square to the random variable $ \mathbf{X} $ . Does the sequence also converge to $ \mathbf{X} $ in probability? (A simple yes or no answer is not acceptable, you must derive the result.)

We know that $ E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]\rightarrow0 $ as $ n\rightarrow\infty $ .

By the Chebyshev Inequality, for any $ \epsilon>0 $,

$ \lim_{n\rightarrow\infty}P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)\leq\lim_{n\rightarrow\infty}\left(\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}\right)=\frac{\lim_{n\rightarrow\infty}E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}=0. $
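The bound used above is Markov's inequality applied to the nonnegative random variable $ \left|\mathbf{X}-\mathbf{X}_{n}\right|^{2} $; the intermediate step is spelled out here for clarity:

$ P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)=P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\geq\epsilon^{2}\right)\leq\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}. $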

$ \therefore $ A sequence of random variables that converges in mean square to the random variable $ \mathbf{X} $ also converges in probability to $ \mathbf{X} $ .

4. (20 pts)

Let $ \mathbf{X}_{t} $ be a band-limited white noise strictly stationary random process with bandwidth 10 kHz. It is also known that $ \mathbf{X}_{t} $ is uniformly distributed between $ \pm5 $ volts. Find:

(a) (10 pts)

Let $ \mathbf{Y}_{t}=\left(\mathbf{X}_{t}\right)^{2} $ . Find the mean square value of $ \mathbf{Y}_{t} $ .
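One way to carry out the computation, using the density $ f_{\mathbf{X}}\left(x\right)=\frac{1}{10} $ on $ \left[-5,5\right] $ implied by the uniform distribution:

$ E\left[\mathbf{Y}_{t}^{2}\right]=E\left[\mathbf{X}_{t}^{4}\right]=\int_{-5}^{5}\frac{x^{4}}{10}\, dx=\frac{1}{10}\cdot\frac{2\cdot5^{5}}{5}=125. $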

(b) (10 pts)

Let $ \mathbf{X}_{t} $ be the input to a linear shift-invariant system with transfer function:
$ H\left(f\right)=\begin{cases} 1 & \text{for }\left|f\right|\leq5\text{ kHz}\\ 0.5 & \text{for }5\text{ kHz}<\left|f\right|\leq50\text{ kHz}\\ 0 & \text{elsewhere.} \end{cases} $

Find the mean and variance of the output.
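A sketch of one way to compute this, under the common modeling assumption that the band-limited white noise has a flat power spectral density over its 10 kHz bandwidth carrying the total power $ \sigma_{\mathbf{X}}^{2}=E\left[\mathbf{X}_{t}^{2}\right]=\frac{10^{2}}{12}=\frac{25}{3} $ (the symbol $ \mathbf{Z}_{t} $ for the output is introduced here for convenience):

$ S_{\mathbf{XX}}\left(f\right)=\frac{\sigma_{\mathbf{X}}^{2}}{2\cdot10^{4}}\text{ for }\left|f\right|\leq10\text{ kHz},\qquad E\left[\mathbf{Z}_{t}\right]=H\left(0\right)E\left[\mathbf{X}_{t}\right]=0, $

$ \text{Var}\left(\mathbf{Z}_{t}\right)=\int_{-\infty}^{\infty}\left|H\left(f\right)\right|^{2}S_{\mathbf{XX}}\left(f\right)\, df=\frac{\sigma_{\mathbf{X}}^{2}}{2\cdot10^{4}}\left(2\cdot5\cdot10^{3}\cdot1^{2}+2\cdot5\cdot10^{3}\cdot\left(0.5\right)^{2}\right)=\frac{25}{3}\cdot\frac{5}{8}=\frac{125}{24}\approx5.2. $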

5. (20 pts)

Let a linear discrete parameter shift-invariant system have the following difference equation: $ y\left(n\right)=0.7y\left(n-1\right)+x\left(n\right) $ where $ x\left(n\right) $ is the input and $ y\left(n\right) $ is the output. Now suppose this system has as its input the discrete parameter random process $ \mathbf{X}_{n} $ . You may assume that the input process is zero-mean i.i.d.

(a) (5 pts)

Is the input wide-sense stationary (show your work)?

$ E\left[\mathbf{X}_{n}\right]=0. $

$ R_{\mathbf{XX}}\left(n+m,\; n\right)=E\left[\mathbf{X}_{n+m}\mathbf{X}_{n}\right]=\sigma_{\mathbf{X}}^{2}\delta\left(m\right), $ which depends only on the time difference $ m $.

$ \therefore\;\mathbf{X}_{n}\text{ is wide-sense stationary.} $

(b) (5 pts)

Is the output process wide-sense stationary (show your work)?

$ E\left[\mathbf{Y}_{n}\right]=0.7E\left[\mathbf{Y}_{n-1}\right]+E\left[\mathbf{X}_{n}\right]=0.7E\left[\mathbf{Y}_{n-1}\right]=0.7^{2}E\left[\mathbf{Y}_{n-2}\right]=\cdots=0.7^{n}E\left[\mathbf{Y}_{0}\right]=0. $

$ E\left[\mathbf{Y}_{0}\right]=E\left[\sum_{n=-\infty}^{\infty}h\left(0-n\right)\mathbf{X}\left(n\right)\right]=\sum_{n=-\infty}^{\infty}h\left(-n\right)E\left[\mathbf{X}\left(n\right)\right]=0. $
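The convolution representation above assumes a causal, stable system; under that assumption the impulse response corresponding to the given difference equation is

$ h\left(n\right)=\left(0.7\right)^{n}u\left(n\right), $

which is also used in the autocorrelation computations below.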

$ R_{\mathbf{YY}}\left(n+m,\; n\right)=E\left[\mathbf{Y}_{n+m}\mathbf{Y}_{n}\right]=\sum_{k=-\infty}^{\infty}\sum_{l=-\infty}^{\infty}h\left(n+m-k\right)h\left(n-l\right)E\left[\mathbf{X}_{k}\mathbf{X}_{l}\right]=\sigma_{\mathbf{X}}^{2}\sum_{k=-\infty}^{\infty}h\left(n+m-k\right)h\left(n-k\right)=\sigma_{\mathbf{X}}^{2}\sum_{j=-\infty}^{\infty}h\left(j+m\right)h\left(j\right). $

$ R_{\mathbf{YY}}\left(n+m,\; n\right) $ depends only on the time difference $ m $. Thus, $ \mathbf{Y}_{n} $ is wide-sense stationary.

(c) (5 pts)

Find the autocorrelation function of the input process.

$ R_{\mathbf{XX}}\left(n,n+m\right)=R_{\mathbf{X}}\left(m\right)=\sigma_{\mathbf{X}}^{2}\delta\left(m\right). $

(d) (5 pts)

Find the autocorrelation function, in closed form, for the output process.

$ R_{\mathbf{Y}}\left(m\right)=\sigma_{\mathbf{X}}^{2}\sum_{j=-\infty}^{\infty}h\left(j+m\right)h\left(j\right)=\sigma_{\mathbf{X}}^{2}\sum_{j=0}^{\infty}\left(0.7\right)^{j}\left(0.7\right)^{j+\left|m\right|}=\sigma_{\mathbf{X}}^{2}\left(0.7\right)^{\left|m\right|}\sum_{j=0}^{\infty}\left(0.49\right)^{j}=\frac{\sigma_{\mathbf{X}}^{2}\left(0.7\right)^{\left|m\right|}}{0.51}, $ where $ h\left(n\right)=\left(0.7\right)^{n}u\left(n\right) $ is the impulse response of the system.

$ \because\; E\left[\mathbf{X}\left(n\right)\mathbf{Y}\left(m\right)\right]=E\left[\sum_{k=-\infty}^{\infty}h\left(m-k\right)\mathbf{X}\left(n\right)\mathbf{X}\left(k\right)\right]=\sum_{k=-\infty}^{\infty}h\left(m-k\right)\left(\sigma_{\mathbf{X}}^{2}\delta\left(n-k\right)\right)=\sigma_{\mathbf{X}}^{2}h\left(m-n\right). $
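This cross-correlation can be used to check the closed form above directly from the difference equation (a short verification sketch): for $ m\geq1 $,

$ R_{\mathbf{Y}}\left(m\right)=E\left[\mathbf{Y}\left(n+m\right)\mathbf{Y}\left(n\right)\right]=0.7R_{\mathbf{Y}}\left(m-1\right)+E\left[\mathbf{X}\left(n+m\right)\mathbf{Y}\left(n\right)\right]=0.7R_{\mathbf{Y}}\left(m-1\right), $

since $ E\left[\mathbf{X}\left(n+m\right)\mathbf{Y}\left(n\right)\right]=\sigma_{\mathbf{X}}^{2}h\left(-m\right)=0 $ for $ m\geq1 $. The recursion gives $ R_{\mathbf{Y}}\left(m\right)=R_{\mathbf{Y}}\left(0\right)\left(0.7\right)^{\left|m\right|} $, consistent with the closed form.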


