Theorem

If $ X_1, X_2, \ldots, X_n $ are $ n $ independent random variables with moment generating functions $ M_{X_i}(t) $, $ i = 1, 2, \ldots, n $, and if $ Y $ is a linear combination of the $ X_i $s such that
$ Y = \sum_{i=1}^n a_iX_i, \quad a_i \in \mathbf{R}, $

then the moment generating function of $ Y $, $ M_Y(t) $, is the product of the individual $ M_{X_i}(a_it) $s, i.e.
$ M_Y(t) = \prod_{i=1}^n M_{X_i}(a_it) $
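As a quick illustration (not part of the theorem statement itself), suppose each $ X_i \sim N(\mu_i, \sigma_i^2) $, whose moment generating function is the standard $ M_{X_i}(t) = e^{\mu_i t + \sigma_i^2 t^2/2} $. Then the theorem gives
$ \begin{align} M_Y(t) &= \prod_{i=1}^n e^{\mu_i(a_it) + \sigma_i^2(a_it)^2/2} \\ &= e^{\left(\sum_{i=1}^n a_i\mu_i\right)t + \left(\sum_{i=1}^n a_i^2\sigma_i^2\right)t^2/2}, \end{align} $
which is the moment generating function of a $ N\left(\sum_{i=1}^n a_i\mu_i,\ \sum_{i=1}^n a_i^2\sigma_i^2\right) $ random variable, so a linear combination of independent normal random variables is again normal.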



Proof

From the definition of the moment generating function and of $ Y $, we have that
$ \begin{align} M_Y(t) &= E[e^{tY}] \\ &= E[e^{t(a_1X_1+a_2X_2+\cdots+a_nX_n)}] \\ &= E[e^{t(a_1X_1)}e^{t(a_2X_2)}\cdots e^{t(a_nX_n)}] \end{align} $

Since the $ X_i $ are independent, the expectation of a product of functions of the $ X_i $ equals the product of the expectations of those functions (proof).
$ \begin{align} \Rightarrow M_Y(t) &= E[e^{t(a_1X_1)}]E[e^{t(a_2X_2)}]\cdots E[e^{t(a_nX_n)}] \\ &= M_{X_1}(a_1t)M_{X_2}(a_2t)\cdots M_{X_n}(a_nt) \\ &= \prod_{i=1}^n M_{X_i}(a_it) \quad \blacksquare \end{align} $
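As a further sanity check (not from the original page), the identity can also be verified numerically: draw samples of independent $ X_i $, average $ e^{tY} $ over the samples, and compare against the product of the known moment generating functions evaluated at $ a_it $. Below is a minimal sketch, assuming NumPy is available; the distributions, coefficients, and value of $ t $ are arbitrary illustrative choices.

import numpy as np

# Monte Carlo check of M_Y(t) = prod_i M_{X_i}(a_i t) for Y = a1*X1 + a2*X2,
# with X1 ~ Exponential(rate 1) and X2 ~ Normal(0, 1) (illustrative choices).
rng = np.random.default_rng(0)
n_samples = 1_000_000
a1, a2, t = 0.5, 2.0, 0.3        # need a1*t < 1 so that M_{X1}(a1*t) exists

x1 = rng.exponential(scale=1.0, size=n_samples)
x2 = rng.normal(loc=0.0, scale=1.0, size=n_samples)
y = a1 * x1 + a2 * x2            # Y = a1*X1 + a2*X2

mc_estimate = np.mean(np.exp(t * y))   # E[e^{tY}] estimated by simulation
# M_{X1}(a1*t) = 1/(1 - a1*t) and M_{X2}(a2*t) = exp((a2*t)^2 / 2)
closed_form = (1.0 / (1.0 - a1 * t)) * np.exp((a2 * t) ** 2 / 2.0)

print(f"Monte Carlo E[e^(tY)]: {mc_estimate:.4f}")
print(f"Product of MGFs:       {closed_form:.4f}")

The two printed values should agree to roughly two decimal places at this sample size.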



Back to list of proofs
