
Random Variables and Signals

Topic 9: Expectation




Thus far, we have learned how to represent the probabilistic behavior of a random variable X using the density function $ f_X $ or the mass function $ p_X $.
Sometimes, we want to describe X probabilistically using only a small number of parameters. The expectation is often used to do this.

Definition $ \qquad $ The expected value of a continuous random variable X is defined as

$ E[X] = \int_{-\infty}^{\infty}xf_X(x)dx $


Definition $ \qquad $ The expected value of a discrete random variable X is defined as

$ E[X] = \sum_{x\in\mathcal R_X}xp_X(x) $

where $ \mathcal{R}_X $ is the range space of X.

Note:

  • E[X] is also known as the mean of X. Other notations for E[X] include:
$ EX,\;\overline{X},\;m_X,\;\mu_X $
  • The equation defining E[X] for discrete X could also have been derived from the continuous case, using a density function $ f_X $ containing $ \delta $-functions.
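
To make the two definitions concrete, here is a minimal numerical sketch in Python (an addition, not part of the original notes): SciPy's quad approximates the continuous integral for a standard exponential density, and a weighted sum handles a discrete example. The specific distributions are chosen purely for illustration.

```python
import numpy as np
from scipy import integrate

# Continuous case: f_X(x) = e^{-x} for x >= 0 (standard exponential, illustration only).
f_X = lambda x: np.exp(-x)
E_cont, _ = integrate.quad(lambda x: x * f_X(x), 0, np.inf)
print(E_cont)  # ~1.0

# Discrete case: a fair six-sided die, R_X = {1,...,6}, p_X(x) = 1/6.
R_X = np.arange(1, 7)
p_X = np.full(6, 1 / 6)
E_disc = np.sum(R_X * p_X)
print(E_disc)  # 3.5
```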

Example $ \qquad $ X is an exponential random variable. Find E[X].

$ f_X(x) = \lambda e^{-\lambda x}u(x) \ $

$ \begin{align} E[X] &= \int_{-\infty}^{\infty}xf_X(x)dx \\ &= \int_{0}^{\infty}x\lambda e^{-\lambda x}dx \\ &= \Big[-xe^{-\lambda x}\Big]_0^{\infty} + \int_{0}^{\infty}e^{-\lambda x}dx \\ &= \frac{1}{\lambda} \end{align} $

where the third line follows from integration by parts.

Let $ \mu = 1/\lambda $. We often write

$ f_X(x) = \frac{1}{\mu} e^{-\frac{1}{\mu}x}u(x) \ $
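
As a quick Monte Carlo sanity check on this example (a sketch added here, not part of the notes): NumPy's exponential sampler is parameterized by the scale $ \mu = 1/\lambda $, matching the second form of $ f_X $ above. The rate value is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0                    # arbitrary rate parameter (illustration only)
mu = 1 / lam
samples = rng.exponential(scale=mu, size=1_000_000)
print(samples.mean(), mu)    # sample mean ~0.5, close to mu = 1/lambda
```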


Example $ \qquad $ X is a uniform discrete random variable with $ \mathcal{R}_X = \{1,\ldots,n\} $. Then,

$ \begin{align} E[X]&=\sum_{k=1}^n\frac{k}{n} \\ &=\frac{1}{n}\sum_{k=1}^n k \\ \\ &= \frac{1}{n}(\frac{1}{2})(n)(n+1) \\ \\ &=\frac{n+1}{2} \end{align} $
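
A short sketch verifying the closed form above (the value n = 10 is an arbitrary choice for illustration):

```python
import numpy as np

n = 10
R_X = np.arange(1, n + 1)       # range space {1,...,n}
E_X = np.sum(R_X / n)           # sum_{k=1}^{n} k * (1/n)
print(E_X, (n + 1) / 2)         # both 5.5
```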


Having defined E[X], we will now consider the more general quantity E[g(X)] for a function $ g:\mathbb{R}\rightarrow\mathbb{R} $.

Let Y = g(X). What is E[Y]? From previous definitions:

$ E[Y]=\int_{-\infty}^{\infty}yf_Y(y)dy $

or

$ E[Y] = \sum_{y\in\mathcal R_Y}yp_Y(y) $

We can find this by first finding $ f_Y $ or $ p_Y $ in terms of g and $ f_X $ or $ p_X $. Alternatively, it can be shown that

$ E[Y]=E[g(X)]=\int_{-\infty}^{\infty}g(x)f_X(x)dx $

or

$ E[Y] = E[g(X)]=\sum_{x\in\mathcal R_X}g(x)p_X(x) $

See Papoulis for the proof of the above.
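
Here is a minimal sketch comparing the E[g(X)] formula above against a direct sample average of g(X), with g(x) = x² and X standard Gaussian (both choices made for illustration only):

```python
import numpy as np
from scipy import integrate

g = lambda x: x**2
f_X = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal pdf

# E[g(X)] computed directly from f_X, without finding f_Y first.
formula, _ = integrate.quad(lambda x: g(x) * f_X(x), -np.inf, np.inf)

# The same expectation estimated by averaging g over samples of X.
rng = np.random.default_rng(0)
mc = g(rng.standard_normal(1_000_000)).mean()
print(formula, mc)  # both ~1.0 = E[X^2] for N(0,1)
```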

Two important cases of functions g:

  • g(x) = x. Then E[g(X)] = E[X].
  • $ g(x) = (x-\mu_X)^2 $. Then $ E[g(X)] = E[(X-\mu_X)^2] $, where
$ E[g(X)] = \int_{-\infty}^{\infty}(x-\mu_X)^2f_X(x)dx $

or

$ E[g(X)] = \sum_{x\in\mathcal R_X}(x-\mu_X)^2p_X(x) $

Note: $ \qquad $ $ E[(X-\mu_X)^2] $ is called the variance of X and is often denoted $ \sigma_X^2 $. $ \sigma_X $ is called the standard deviation of X.

Important property of $ E[\cdot] $:
Let $ g_1:\mathbb{R}\rightarrow\mathbb{R} $, $ g_2:\mathbb{R}\rightarrow\mathbb{R} $, and $ \alpha,\beta\in\mathbb{R} $. Then

$ E[\alpha g_1(X) +\beta g_2(X)] = \alpha E[g_1(X)]+\beta E[g_2(X)] \ $

So $ E[\cdot] $ is a linear operator. The proof follows from the linearity of integration.
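
A quick Monte Carlo illustration of this linearity property (the functions and constants below are arbitrary choices, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
alpha, beta = 3.0, -2.0                          # arbitrary constants (illustration only)

lhs = (alpha * x + beta * x**2).mean()           # E[alpha g1(X) + beta g2(X)], g1(x)=x, g2(x)=x^2
rhs = alpha * x.mean() + beta * (x**2).mean()    # alpha E[g1(X)] + beta E[g2(X)]
print(lhs, rhs)                                  # agree up to floating-point rounding
```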

Important property of Var():

$ Var(X) = E[X^2]-\mu_X^2 $

Proof:

$ \begin{align} E[(X-\mu_X)^2]&=E[X^2-2X\mu_X+\mu_X^2] \\ &=E[X^2]-2\mu_XE[X]+E[\mu_X^2] \\ &=E[X^2]-2\mu_X^2+\mu_X^2 \\ &=E[X^2]-\mu_X^2 \end{align} $
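
A short numerical check of this identity on simulated data (the exponential distribution here is an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)   # any distribution works

mu = x.mean()
var_def = ((x - mu)**2).mean()    # E[(X - mu_X)^2]
var_id = (x**2).mean() - mu**2    # E[X^2] - mu_X^2
print(var_def, var_id)            # agree up to rounding
```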


Example $ \qquad $ X is Gaussian N($ \mu,\sigma^2 $). Find E[X] and Var(X).

$ E[X] = \int_{-\infty}^{\infty}\frac{x}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^2}{2\sigma^2}}dx $

Let $ r = x - \mu $. Then

$ E[X] = \int_{-\infty}^{\infty}\frac{r}{\sqrt{2\pi}\sigma}e^{-\frac{r^2}{2\sigma^2}}dr\;+\; \mu\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{r^2}{2\sigma^2}}dr $

First term: Integrating an odd function over (-∞,∞) ⇒ first term is 0.
Second term: Integrating a Gaussian pdf over (-∞,∞) gives one ⇒ second term is $ \mu $.
So E[X] = $ \mu $.

$ E[X^2] = \int_{-\infty}^{\infty}\frac{x^2}{\sqrt{2\pi}\sigma}e^{-\frac{(x-\mu)^2}{2\sigma^2}}dx $

Using integration by parts, we see that this integral evaluates to $ \sigma^2+\mu^2 $. So,

$ Var(X) = \sigma^2+\mu^2-\mu^2 = \sigma^2 $
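
The same two integrals can be evaluated numerically; this sketch (with arbitrarily chosen $ \mu $ and $ \sigma $) is an illustration added here, not part of the original derivation:

```python
import numpy as np
from scipy import integrate

mu, sigma = 1.5, 0.7   # arbitrary parameters (illustration only)
f_X = lambda x: np.exp(-(x - mu)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

E_X, _ = integrate.quad(lambda x: x * f_X(x), -np.inf, np.inf)
E_X2, _ = integrate.quad(lambda x: x**2 * f_X(x), -np.inf, np.inf)
print(E_X, mu)                     # ~1.5
print(E_X2 - E_X**2, sigma**2)     # ~0.49
```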


Example $ \qquad $ X is Poisson with parameter $ \lambda $. Find E[X] and Var(X).

$ \begin{align} E[X] &= \sum_{k=0}^{\infty}k\frac{e^{-\lambda}\lambda^k}{k!} \\ &= \sum_{k=1}^{\infty}\frac{e^{-\lambda}\lambda^k}{(k-1)!} \\ &= \lambda\sum_{k=0}^{\infty}e^{-\lambda}\frac{\lambda^k}{k!} \\ &= \lambda \end{align} $


$ \begin{align} E[X^2] &= \sum_{k=0}^{\infty}k^2\frac{e^{-\lambda}\lambda^k}{k!} \\ &= \sum_{k=0}^{\infty}(k+1)\frac{e^{-\lambda}\lambda^{k+1}}{k!} \\ &= \lambda\sum_{k=0}^{\infty}\frac{ke^{-\lambda}\lambda^k}{k!}\;+\;\lambda\sum_{k=0}^{\infty}\frac{e^{-\lambda}\lambda^k}{k!} \\ \\ &= \lambda E[X] + \lambda(1) \\ &= \lambda^2+\lambda \end{align} $

So,
$ E[X^2] = \lambda^2 +\lambda \ $
$ \Rightarrow Var(X) = \lambda^2 +\lambda - \lambda^2 = \lambda \ $
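
A truncated-sum check of both results, using SciPy's Poisson pmf (the parameter value and truncation point are illustrative choices, not from the notes):

```python
import numpy as np
from scipy.stats import poisson

lam = 3.0                    # arbitrary parameter (illustration only)
k = np.arange(100)           # truncation; tail mass beyond 100 is negligible for lam = 3
p = poisson.pmf(k, lam)      # p_X(k) = e^{-lambda} lambda^k / k!

E_X = np.sum(k * p)
E_X2 = np.sum(k**2 * p)
print(E_X, lam)              # ~3.0
print(E_X2 - E_X**2, lam)    # Var(X) ~3.0
```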



Moments

Moments generalize mean and variance to nth order expectations.

Definition $ \qquad $ The nth order moment of a random variable X is

$ \mu_n\equiv E[X^n]=\int_{-\infty}^{\infty}x^nf_X(x)dx\quad n=1,2,... $

and the nth central moment of X is

$ v_n\equiv E[(X-\mu_X)^n] = \int_{-\infty}^{\infty}(x-\mu_X)^nf_X(x)dx\qquad n = 2,3,... $

So

  • $ \mu_1 = E[X] $, the mean
  • $ \mu_2 = E[X^2] $, the mean-square value
  • $ v_2 = Var(X) $, the variance
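
A small sketch computing $ \mu_1 $, $ \mu_2 $, and $ v_2 $ for an exponential random variable by numerical integration (the distribution is an arbitrary choice for illustration):

```python
import numpy as np
from scipy import integrate

lam = 1.0
f_X = lambda x: lam * np.exp(-lam * x)

def moment(n):
    # nth order moment E[X^n] by numerical integration
    return integrate.quad(lambda x: x**n * f_X(x), 0, np.inf)[0]

mu_1 = moment(1)                                                  # mean
mu_2 = moment(2)                                                  # mean-square
v_2, _ = integrate.quad(lambda x: (x - mu_1)**2 * f_X(x), 0, np.inf)  # variance
print(mu_1, mu_2, v_2)   # 1.0, 2.0, 1.0 for lambda = 1
```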



Conditional Expectation

For an event M ∈ F with P(M) > 0, the conditional expectation of g(X) given M is

$ E[g(X)|M] = \int_{-\infty}^{\infty}g(x)f_X(x|M)dx $

or

$ E[g(X)|M] = \sum_{x\in\mathcal R_X}g(x)p_X(x|M) $


Example $ \qquad $ X is an exponential random variable with mean $ \mu $. Let M = $ \{X>\mu\} $. Find E[X|M]. Note that P(M) = P(X > $ \mu $), and since $ \mu $ > 0,

$ P(M) = P(X>\mu) =\int_{\mu}^{\infty}\frac{1}{\mu}e^{-\frac{x}{\mu}}dx \;>\;0 $

It can be shown that

$ f_X(x|X>\mu) = \frac{1}{\mu}e^{-\frac{x-\mu}{\mu}}u(x-\mu) $

Then,

$ \begin{align} E[X|X>\mu] &=\int_{\mu}^{\infty}\frac{x}{\mu}e^{-\frac{x-\mu}{\mu}}dx \\ &=\int_{0}^{\infty}\frac{y+\mu}{\mu}e^{-\frac{y}{\mu}}dy \\ &=\mu+\mu \\ &=2\mu \end{align} $

where the second line substitutes $ y = x-\mu $.
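
A Monte Carlo check of this result (a sketch added here; the mean $ \mu $ = 2 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0                        # arbitrary mean (illustration only)
x = rng.exponential(scale=mu, size=2_000_000)

cond_mean = x[x > mu].mean()    # average over the conditioning event M = {X > mu}
print(cond_mean, 2 * mu)        # ~4.0
```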


Fig 1: Conditional Expectation; X is exponential.



References

  • A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill.

Questions and comments

If you have any questions, comments, etc., please post them on this page.



Back to all ECE 600 notes

Dr. Paul Garrett