Theorem

A linear combination of independent Gaussian random variables is also Gaussian.

$ \Leftrightarrow $

If $ X_1, X_2,...,X_n $ are $ n $ independent Gaussian random variables, then the random variable $ Y $ is also Gaussian, where $ Y $ is a linear combination of the $ X_i $'s, $ i = 1, 2,...,n $, given by
$ \begin{align} Y &= \sum_{i=1}^n a_i X_i \\ a_i &\in \mathbb{R} \end{align} $

$ \Leftrightarrow $

if
$ \begin{align} X_i &\sim N(\mu_i, \sigma_i^2) \\ i &= 1,2,...,n \end{align} $

and if
$ \begin{align} Y &= \sum_{i=1}^n a_i X_i \\ a_i &\in \mathbb{R} \end{align} $

then
$ Y\sim N\left(\sum_{i=1}^n a_i\mu_i,\ \sum_{i=1}^n a_i^2\sigma_i^2\right) $
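For example, with the arbitrarily chosen independent variables $ X_1 \sim N(1, 4) $ and $ X_2 \sim N(2, 9) $ and the combination $ Y = 3X_1 - 2X_2 $, the theorem gives
$ Y \sim N\left(3\cdot 1 + (-2)\cdot 2,\ 3^2\cdot 4 + (-2)^2\cdot 9\right) = N(-1,\ 72) $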


Proof

Let $ M_Y(t) $ be the moment generating function of $ Y $, and let $ M_{X_i}(t) $ be the moment generating function of $ X_i $. Since the $ X_i $ are Gaussian variables with means $ \mu_i $ and variances $ \sigma_i^2 $ respectively, we have that
$ M_{X_i}(t) = E[e^{tX_i}] = e^{\mu_it + \frac{1}{2}\sigma_i^2t^2} $
$ \Rightarrow M_{a_iX_i}(t) = E[e^{t(a_iX_i)}] = M_{X_i}(a_it) = e^{a_i\mu_it + \frac{1}{2}\sigma_i^2a_i^2t^2} $

Since the $ X_i $ are mutually independent, and $ Y $ is a linear combination of them, we know that $ M_Y(t) $ is the product of the individual $ M_{a_iX_i}(t) $ (this factorization is expanded below), i.e.
$ \begin{align} Y &= \sum_{i=1}^n a_i X_i \\ a_i &\in \mathbb{R} \end{align} $
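Expanding $ M_Y(t) $ directly, and using independence to pull the expectation inside the product, gives the factorization used in the next step:
$ \begin{align} M_Y(t) &= E[e^{tY}] = E\left[e^{t\sum_{i=1}^n a_iX_i}\right] \\ &= E\left[\prod_{i=1}^n e^{ta_iX_i}\right] = \prod_{i=1}^n E\left[e^{ta_iX_i}\right] \end{align} $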
$ \begin{align} \Rightarrow M_Y(t) &= \prod_{i=1}^n M_{a_iX_i}(t) \\ &= \prod_{i=1}^n e^{a_i\mu_it + \frac{1}{2}\sigma_i^2a_i^2t^2} \\ &= e^{\left(\sum_{i=1}^n a_i\mu_i\right)t + \frac{1}{2}\left(\sum_{i=1}^n a_i^2\sigma_i^2\right)t^2} \end{align} $

Note that $ M_Y(t) $ is the moment generating function of a Gaussian variable with mean $ \mu $ and variance $ \sigma^2 $ where
$ \begin{align} \mu &= \sum_{i=1}^n a_i\mu_i \\ \sigma^2 &= \sum_{i=1}^n a_i^2\sigma_i^2 \end{align} $

Thus, since the moment generating function uniquely determines the distribution, we have that
$ Y\sim N\left(\sum_{i=1}^n a_i\mu_i,\ \sum_{i=1}^n a_i^2\sigma_i^2\right)_{\blacksquare} $
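As a sanity check (not part of the proof), the worked example above can be verified numerically. The following sketch assumes NumPy is available; the parameters are the arbitrary ones chosen earlier.

import numpy as np

# Example parameters from above: X1 ~ N(1, 4), X2 ~ N(2, 9), Y = 3*X1 - 2*X2.
# The theorem predicts Y ~ N(3*1 - 2*2, 9*4 + 4*9) = N(-1, 72).
rng = np.random.default_rng(0)
n_samples = 1_000_000

x1 = rng.normal(loc=1.0, scale=2.0, size=n_samples)  # sigma_1 = 2, variance 4
x2 = rng.normal(loc=2.0, scale=3.0, size=n_samples)  # sigma_2 = 3, variance 9
y = 3.0 * x1 - 2.0 * x2

print("sample mean:    ", y.mean())  # expected approximately -1
print("sample variance:", y.var())   # expected approximately 72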



