=Example. Mean of i.i.d. random variables=

Let <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{M}</math> be <math class="inline">M</math> jointly distributed i.i.d. random variables with mean <math class="inline">\mu</math> and variance <math class="inline">\sigma^{2}</math>. Let <math class="inline">\mathbf{Y}_{M}=\frac{1}{M}\sum_{n=1}^{M}\mathbf{X}_{n}</math>.

(a) Find the variance of <math class="inline">\mathbf{Y}_{M}</math>.
  
<math class="inline">Var\left[\mathbf{Y}_{M}\right]=E\left[\mathbf{Y}_{M}^{2}\right]-\left(E\left[\mathbf{Y}_{M}\right]\right)^{2}.</math>
  
<math class="inline">E\left[\mathbf{Y}_{M}\right]=E\left[\frac{1}{M}\sum_{n=1}^{M}\mathbf{X}_{n}\right]=\frac{1}{M}\sum_{n=1}^{M}E\left[\mathbf{X}_{n}\right]=\frac{1}{M}\cdot M\cdot\mu=\mu.</math>
  
<math class="inline">E\left[\mathbf{Y}_{M}^{2}\right]=E\left[\frac{1}{M^{2}}\sum_{m=1}^{M}\sum_{n=1}^{M}\mathbf{X}_{m}\mathbf{X}_{n}\right]=\frac{1}{M^{2}}\sum_{m=1}^{M}\sum_{n=1}^{M}E\left[\mathbf{X}_{m}\mathbf{X}_{n}\right].</math>
  
Now <math class="inline">E\left[\mathbf{X}_{m}\mathbf{X}_{n}\right]=\begin{cases}
E\left[\mathbf{X}_{m}^{2}\right], & m=n\\
E\left[\mathbf{X}_{m}\right]E\left[\mathbf{X}_{n}\right], & m\neq n
\end{cases}</math> because when <math class="inline">m\neq n</math>, <math class="inline">\mathbf{X}_{m}</math> and <math class="inline">\mathbf{X}_{n}</math> are independent <math class="inline">\Rightarrow\mathbf{X}_{m}</math> and <math class="inline">\mathbf{X}_{n}</math> are uncorrelated.
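As a quick numerical sanity check (not part of the original derivation), the two cases can be verified by Monte Carlo with any i.i.d. samples; the sketch below assumes NumPy is available and uses standard normals, so <math class="inline">\mu=0</math> and <math class="inline">\sigma^{2}=1</math>:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500_000                      # number of Monte Carlo trials
X = rng.standard_normal((N, 2))  # two i.i.d. N(0,1) columns: X_m and X_n

# m = n case: E[X_m^2] should be mu^2 + sigma^2 = 0 + 1 = 1
same = np.mean(X[:, 0] ** 2)

# m != n case: E[X_m X_n] should be E[X_m] E[X_n] = mu^2 = 0
cross = np.mean(X[:, 0] * X[:, 1])

print(round(same, 2), round(cross, 2))
```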
  
Since <math class="inline">E\left[\mathbf{X}_{m}^{2}\right]=\sigma^{2}+\mu^{2}</math>, the <math class="inline">M</math> diagonal terms and <math class="inline">M\left(M-1\right)</math> off-diagonal terms give

<math class="inline">E\left[\mathbf{Y}_{M}^{2}\right]=\frac{1}{M^{2}}\left[M\left(\mu^{2}+\sigma^{2}\right)+M\left(M-1\right)\mu^{2}\right]=\frac{\left(\mu^{2}+\sigma^{2}\right)+\left(M-1\right)\mu^{2}}{M}=\frac{M\mu^{2}+\sigma^{2}}{M}.</math>
  
<math class="inline">Var\left[\mathbf{Y}_{M}\right]=\frac{M\mu^{2}+\sigma^{2}-M\mu^{2}}{M}=\frac{\sigma^{2}}{M}.</math>
  
(b) Now assume that the <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{M}</math> are identically distributed with mean <math class="inline">\mu</math> and variance <math class="inline">\sigma^{2}</math>, but they are only uncorrelated rather than independent. Find the variance of <math class="inline">\mathbf{Y}_{M}</math>.
  
Again, <math class="inline">Var\left[\mathbf{Y}_{M}\right]=\frac{\sigma^{2}}{M}</math>, because only uncorrelatedness was used in part (a).
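To illustrate that uncorrelatedness really suffices, here is a sketch (not from the original page, assuming NumPy) using a standard construction: <math class="inline">\mathbf{X}_{n}=\sqrt{2}\cos\left(n\mathbf{U}\right)</math> with <math class="inline">\mathbf{U}</math> uniform on <math class="inline">[0,2\pi)</math>. These are identically distributed with mean 0 and variance 1, and pairwise uncorrelated by orthogonality of cosines, yet all are functions of the single variable <math class="inline">\mathbf{U}</math>, so they are far from independent. The variance of their mean should still be <math class="inline">1/M</math>:

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 5, 200_000
U = rng.uniform(0.0, 2 * np.pi, size=N)

# X_n = sqrt(2) cos(nU): identically distributed, mean 0, variance 1,
# pairwise uncorrelated, but all driven by the same U (dependent).
n = np.arange(1, M + 1)
X = np.sqrt(2) * np.cos(np.outer(U, n))   # shape (N, M)

Y = X.mean(axis=1)
print(round(Y.var(), 2))                  # should be close to 1 / M = 0.2
```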
----
[[ECE600|Back to ECE600]]

[[ECE 600 Exams|Back to ECE 600 Exams]]

Latest revision as of 11:25, 16 July 2012

