Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2008


The Weak Law of Large Numbers states that if $\mathbf{X}_{1},\mathbf{X}_{2},\mathbf{X}_{3},\cdots$ is a sequence of i.i.d. random variables with finite mean $E\left[\mathbf{X}_{i}\right]=\mu$ for every $i$ , then the sample mean $\mathbf{Y}_{n}=\frac{1}{n}\sum_{i=1}^{n}\mathbf{X}_{i}$ converges to $\mu$ in probability. Suppose that instead of being i.i.d., the random variables $\mathbf{X}_{1},\mathbf{X}_{2},\mathbf{X}_{3},\cdots$ each have finite mean $\mu$ , and the covariance function of the sequence $\mathbf{X}_{n}$ is $Cov\left(\mathbf{X}_{i},\mathbf{X}_{j}\right)=\sigma^{2}\rho^{\left|i-j\right|}$ , where $\left|\rho\right|<1$ and $\sigma^{2}>0$ .

a. (13 points)

Find the mean and variance of $\mathbf{Y}_{n}$ .

$E\left[\mathbf{Y}_{n}\right]=E\left[\frac{1}{n}\sum_{i=1}^{n}\mathbf{X}_{i}\right]=\frac{1}{n}\sum_{i=1}^{n}E\left[\mathbf{X}_{i}\right]=\frac{n\mu}{n}=\mu.$

$Cov\left(\mathbf{X}_{i},\mathbf{X}_{j}\right)=E\left[\left(\mathbf{X}_{i}-\mu\right)\left(\mathbf{X}_{j}-\mu\right)\right]=E\left[\mathbf{X}_{i}\mathbf{X}_{j}\right]-\mu^{2}.$

$E\left[\mathbf{X}_{i}\mathbf{X}_{j}\right]=\begin{cases} \sigma^{2}+\mu^{2}, & i=j\\ \sigma^{2}\rho^{\left|i-j\right|}+\mu^{2}, & i\neq j. \end{cases}$

$E\left[\mathbf{Y}_{n}^{2}\right]=E\left[\frac{1}{n^{2}}\sum_{i=1}^{n}\sum_{j=1}^{n}\mathbf{X}_{i}\mathbf{X}_{j}\right]=\frac{1}{n^{2}}\left[\sum_{i=1}^{n}E\left[\mathbf{X}_{i}^{2}\right]+\sum_{i\neq j}E\left[\mathbf{X}_{i}\mathbf{X}_{j}\right]\right]=\frac{1}{n^{2}}\left[n\left(\sigma^{2}+\mu^{2}\right)+2\sum_{i=1}^{n-1}\left(n-i\right)\left(\sigma^{2}\rho^{i}+\mu^{2}\right)\right]=\frac{1}{n^{2}}\left[n\sigma^{2}+2\sigma^{2}\sum_{i=1}^{n-1}\left(n-i\right)\rho^{i}\right]+\mu^{2}.$

$Var\left[\mathbf{Y}_{n}\right]=E\left[\mathbf{Y}_{n}^{2}\right]-E\left[\mathbf{Y}_{n}\right]^{2}=\frac{1}{n^{2}}\left[n\sigma^{2}+2\sigma^{2}\sum_{i=1}^{n-1}\left(n-i\right)\rho^{i}\right]+\mu^{2}-\mu^{2}=\frac{1}{n^{2}}\left[n\sigma^{2}+2\sigma^{2}\sum_{i=1}^{n-1}\left(n-i\right)\rho^{i}\right].$
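As a sanity check, the closed-form variance above can be compared numerically against the quadratic form $Var\left[\mathbf{Y}_{n}\right]=\frac{1}{n^{2}}\mathbf{1}^{T}C\mathbf{1}$, where $C_{ij}=\sigma^{2}\rho^{\left|i-j\right|}$. A minimal sketch (the values of $\sigma^{2}$, $\rho$, and $n$ below are arbitrary test choices, not part of the problem):

```python
import numpy as np

# Arbitrary test parameters (not from the problem statement)
sigma2, rho, n = 2.0, 0.6, 50

# Covariance matrix C with entries sigma^2 * rho^{|i-j|}
idx = np.arange(n)
C = sigma2 * rho ** np.abs(idx[:, None] - idx[None, :])

# Var[Y_n] = Var[(1/n) 1^T X] = (1/n^2) 1^T C 1, i.e. the sum of all entries of C over n^2
var_quadratic = C.sum() / n**2

# Closed-form expression derived above:
# (1/n^2) [ n*sigma^2 + 2*sigma^2 * sum_{i=1}^{n-1} (n-i) rho^i ]
i = np.arange(1, n)
var_closed = (n * sigma2 + 2 * sigma2 * ((n - i) * rho**i).sum()) / n**2

assert np.isclose(var_quadratic, var_closed)
```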

Note

If the element $m_{ij}$ of an $n\times n$ matrix is defined as $\left|i-j\right|$ , then the matrix is the symmetric Toeplitz matrix below; each off-diagonal value $k\geq1$ appears $2\left(n-k\right)$ times, which is the source of the factor $\left(n-i\right)$ in the sums above.

$\left[\begin{array}{ccccc} 0 & 1 & 2 & \cdots & n-1\\ 1 & 0 & 1 & \ddots & \vdots\\ 2 & 1 & 0 & \ddots & 2\\ \vdots & \ddots & \ddots & \ddots & 1\\ n-1 & \cdots & 2 & 1 & 0 \end{array}\right].$
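The counting step used to collapse the double sum, namely that the matrix $m_{ij}=\left|i-j\right|$ contains exactly $2\left(n-k\right)$ entries equal to each offset $k\geq1$, is easy to verify directly (the size $n$ below is an arbitrary test choice):

```python
import numpy as np

n = 6  # arbitrary test size
idx = np.arange(n)
M = np.abs(idx[:, None] - idx[None, :])  # M[i, j] = |i - j|

# The diagonal contributes n zeros; each offset k >= 1 appears 2*(n-k) times
assert (M == 0).sum() == n
for k in range(1, n):
    assert (M == k).sum() == 2 * (n - k)
```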

b. (12 points)

Does the sample mean still converge to $\mu$ in probability? You must justify your answer.

Recall the definition of convergence in probability:

$\mathbf{Y}_{n}\xrightarrow{p}\mu$ means that for any $\epsilon>0$ , $P\left(\left\{ \left|\mathbf{Y}_{n}-\mu\right|>\epsilon\right\} \right)\rightarrow0$ as $n\rightarrow\infty$ . By Chebyshev's inequality, $P\left(\left\{ \left|\mathbf{Y}_{n}-\mu\right|>\epsilon\right\} \right)\leq\frac{Var\left[\mathbf{Y}_{n}\right]}{\epsilon^{2}}.$ Since $\left|\rho\right|<1$ , we have $\sum_{i=1}^{n-1}\left(n-i\right)\left|\rho\right|^{i}\leq n\sum_{i=1}^{\infty}\left|\rho\right|^{i}=\frac{n\left|\rho\right|}{1-\left|\rho\right|}$ , so $Var\left[\mathbf{Y}_{n}\right]\leq\frac{\sigma^{2}}{n}\left(1+\frac{2\left|\rho\right|}{1-\left|\rho\right|}\right)\rightarrow0$ as $n\rightarrow\infty$ . Therefore $\lim_{n\rightarrow\infty}P\left(\left\{ \left|\mathbf{Y}_{n}-\mu\right|>\epsilon\right\} \right)=0$ , and $\mathbf{Y}_{n}\xrightarrow{p}\mu$ : the sample mean still converges to $\mu$ in probability.
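The conclusion can also be illustrated by simulation. A Gaussian AR(1) process has exactly the covariance $\sigma^{2}\rho^{\left|i-j\right|}$ , so sampling it and computing sample means shows $Var\left[\mathbf{Y}_{n}\right]$ shrinking roughly like $1/n$ . A minimal sketch (the values of $\mu$ , $\sigma$ , $\rho$ , and the trial count are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, rho = 3.0, 1.5, 0.6  # illustrative parameters

def sample_mean(n, trials=2000):
    """Return `trials` independent realizations of Y_n = (1/n) sum X_i,
    where X is a stationary Gaussian AR(1) with Cov(X_i, X_j) = sigma^2 rho^|i-j|."""
    # X_1 ~ N(mu, sigma^2); X_k = mu + rho*(X_{k-1}-mu) + sqrt(1-rho^2)*sigma*W_k
    x = rng.normal(mu, sigma, size=trials)
    total = x.copy()
    for _ in range(n - 1):
        x = mu + rho * (x - mu) + np.sqrt(1 - rho**2) * sigma * rng.normal(size=trials)
        total += x
    return total / n

for n in (10, 100, 1000):
    y = sample_mean(n)
    # Empirical Var[Y_n] decreases toward 0, consistent with convergence in probability
    print(n, y.var())
```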
