

The Comer Lectures on Random Variables and Signals

Slectures by Maliha Hossain


Topic 10: Characteristic Functions



Characteristic Functions

The pdf $ f_X $ of a random variable X is a function of a real-valued variable x. It is sometimes useful to work with a "frequency domain" representation of $ f_X $. The characteristic function gives us this representation.

Definition $ \qquad $ $ Z:\mathcal S\rightarrow\mathbb C $ defined on (S,F,P) is a complex random variable if

$ Z = X+iY $

where X and Y are real valued random variables on (S,F,P).

Using the linearity of E[],

$ E[Z] = E[X] + iE[Y] \ $

Now consider the complex random variable $ Z=e^{i\omega X} $, where $ \omega\in\mathbb R $ is a "frequency" variable. We can write Z as

$ Z=e^{i\omega X}=\cos(\omega X)+i\sin(\omega X) \ $

and

$ E[Z]=E[e^{i\omega X}]=E[\cos(\omega X)]+iE[\sin(\omega X)] \ $

This expectation depends on $ \omega\in\mathbb R $ and will be the characteristic function of X.
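As a quick numerical sanity check (an added illustration, not from the lecture), this expectation can be estimated by Monte Carlo: average $ \cos(\omega X) $ and $ \sin(\omega X) $ over samples of X. The sketch below assumes X is standard normal, whose characteristic function is known to be $ e^{-\omega^2/2} $.

```python
# Monte Carlo estimate of E[e^{iwX}] = E[cos(wX)] + i E[sin(wX)],
# assuming X ~ N(0,1) so the exact answer is e^{-w^2/2}.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)          # samples of X

for w in (0.5, 1.0, 2.0):
    estimate = np.mean(np.cos(w * x)) + 1j * np.mean(np.sin(w * x))
    exact = np.exp(-w**2 / 2)             # known Phi_X(w) for N(0,1)
    print(f"w = {w}: estimate = {estimate:.4f}, exact = {exact:.4f}")
```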

Definition $ \qquad $ Let X be a random variable on (S,F,P). The characteristic function of X is given by

$ \Phi_X(\omega)\equiv E[e^{i\omega X}]\qquad\forall\omega\in\mathbb R $

If X is continuous, we have

$ \Phi_X(\omega)= \int_{-\infty}^{\infty}e^{i\omega x}f_X(x)dx $

And if X is discrete, then we use

$ \Phi_X(\omega)= \sum_{x\in\mathcal R_X}e^{i\omega x}p_X(x) $
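For a concrete discrete case (an added example, not in the original notes), take X to be a fair six-sided die; the sum above then has only six terms.

```python
# Characteristic function of a fair die: Phi_X(w) is the sum over
# the range {1,...,6} of e^{iwx} * (1/6).
import numpy as np

values = np.arange(1, 7)      # range of X
pmf = np.full(6, 1 / 6)       # p_X(x) = 1/6 for each face

def phi(w):
    return np.sum(np.exp(1j * w * values) * pmf)

print(phi(0.0))   # always 1, since the pmf sums to 1
print(phi(1.0))
```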

Note: The characteristic function looks like the Fourier transform of $ f_X $, with the opposite sign in the exponent. We can show that $ f_X $ can be recovered from $ \Phi_X $:

$ f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty}\Phi_X(\omega)e^{-i\omega x}d\omega $
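The inversion formula can be checked numerically. The sketch below (an added check, assuming X is standard normal so $ \Phi_X(\omega)=e^{-\omega^2/2} $) approximates the integral with a Riemann sum on a truncated grid and recovers the Gaussian density.

```python
# Numerically invert Phi_X(w) = e^{-w^2/2} to recover the N(0,1) pdf.
import numpy as np

w = np.linspace(-20, 20, 20_001)      # truncated frequency grid
dw = w[1] - w[0]
phi = np.exp(-w**2 / 2)               # Phi_X for N(0,1)

for x in (0.0, 1.0):
    # f_X(x) = (1/2pi) * integral of Phi_X(w) e^{-iwx} dw
    integrand = phi * np.exp(-1j * w * x)
    f_est = (integrand.sum() * dw).real / (2 * np.pi)
    f_true = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
    print(f"x = {x}: recovered {f_est:.5f}, true {f_true:.5f}")
```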



Moments

Definition $ \qquad $ The Moment Generating Function (mgf) of random variable X is given by

$ \phi_X(s)\equiv E[e^{sX}]\qquad\forall s\in\mathbb C $

Moment Theorem $ \qquad $ The Moment Theorem (or Moment Generating Theorem) shows us how to use the mgf to find the moments of X:
given a random variable X with mgf $ \phi_X $, the nth moment of X is given by

$ \begin{align} \mu_n &= E[X^n] \\ \\ &=\phi_X^{(n)}(0) \\ \\ &= \frac{d^n\phi_X(s)}{ds^n}\bigg|_{s=0} \end{align} $

Proof:
Differentiating $ \phi_X $ n times with respect to s, and assuming the derivative and expectation can be interchanged, gives

$ \begin{align} \phi_X^{(n)}(s)&=\frac{d^n}{ds^n}E[e^{sX}] \\ \\ &=E[\frac{d^n}{ds^n}e^{sX}] \\ \\ &=E[X^ne^{sX}] \end{align} $


So,

$ \phi_X^{(n)}(0)=E[X^n] $


This result can be written in terms of the characteristic function:

$ \mu_n = \frac{1}{i^n}\;\frac{d^n}{d\omega^n}\Phi_X(\omega)|_{\omega = 0} $
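The moment theorem is easy to verify with symbolic differentiation. The sketch below (an added example, not from the notes) assumes X is standard normal, whose mgf is $ e^{s^2/2} $, and recovers its first four moments.

```python
# Moments of N(0,1) via the moment theorem: mu_n = d^n/ds^n mgf(s) at s = 0.
import sympy as sp

s = sp.symbols('s')
mgf = sp.exp(s**2 / 2)        # mgf of a standard normal

for n in range(1, 5):
    mu_n = sp.diff(mgf, s, n).subs(s, 0)
    print(f"E[X^{n}] = {mu_n}")
# Prints 0, 1, 0, 3 -- the first four moments of N(0,1).
```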


Example $ \qquad $ Let X be an exponential random variable with mean $ \mu $. We can show that

$ \Phi_X(\omega) = \frac{1/\mu}{1/\mu - i\omega} $

since

$ f_X(x) = \frac{1}{\mu}e^{-\frac{x}{\mu}}u(x) $

Now,

$ \Phi_X'(\omega) = \frac{(1/\mu)\;i}{(1/\mu-i\omega)^2} $

and

$ \Phi_X''(\omega) = \frac{-2/\mu}{(1/\mu-i\omega)^3} $

So,

$ E[X]=\frac{1}{i}(\frac{(1/\mu)\;i}{(1/\mu)^2})=\mu $

and

$ E[X^2]=\frac{1}{i^2}(\frac{-2/\mu}{(1/\mu)^3})=2\mu^2 $

Then

$ Var(X) = E[X^2]-(E[X])^2 = 2\mu^2-\mu^2 =\mu^2 \ $
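The derivatives in this example can be reproduced symbolically; the sketch below (an added check, not part of the lecture) differentiates $ \Phi_X $ and confirms $ E[X]=\mu $, $ E[X^2]=2\mu^2 $, and $ Var(X)=\mu^2 $.

```python
# Symbolic check of the exponential example using the moment theorem
# mu_n = Phi_X^(n)(0) / i^n, with Phi_X(w) = (1/mu)/(1/mu - i*w).
import sympy as sp

w = sp.symbols('omega', real=True)
mu = sp.symbols('mu', positive=True)
Phi = (1 / mu) / (1 / mu - sp.I * w)

EX = sp.simplify(sp.diff(Phi, w).subs(w, 0) / sp.I)          # mu
EX2 = sp.simplify(sp.diff(Phi, w, 2).subs(w, 0) / sp.I**2)   # 2*mu**2
print(EX, EX2, sp.simplify(EX2 - EX**2))                     # variance = mu**2
```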






