1.10 Two Random Variables

From the ECE600 Pre-requisites notes of Sangchun Han, ECE PhD student.


Fact

Let $ \mathbf{X} $ and $ \mathbf{Y} $ be two jointly-distributed, statistically independent random variables, having pdfs $ f_{\mathbf{X}}\left(x\right) $ and $ f_{\mathbf{Y}}\left(y\right) $ . Then the pdf of $ \mathbf{Z}=\mathbf{X}+\mathbf{Y} $ is

$ f_{\mathbf{Z}}\left(z\right)=\left(f_{\mathbf{X}}*f_{\mathbf{Y}}\right)\left(z\right)=\int_{-\infty}^{\infty}f_{\mathbf{X}}\left(x\right)f_{\mathbf{Y}}\left(z-x\right)dx=\int_{-\infty}^{\infty}f_{\mathbf{Y}}\left(y\right)f_{\mathbf{X}}\left(z-y\right)dy $.

The discrete version of the above equation, for integer-valued $ \mathbf{X} $ and $ \mathbf{Y} $, is

$ P\left(\mathbf{Z}=z\right)=\sum_{k=-\infty}^{\infty}P\left(\mathbf{X}=k\right)P\left(\mathbf{Y}=z-k\right)=\sum_{k=-\infty}^{\infty}P\left(\mathbf{X}=z-k\right)P\left(\mathbf{Y}=k\right) $.
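The discrete convolution formula can be checked numerically. A short NumPy sketch (the two-dice example is illustrative, not from the notes): the pmf of the sum of two fair dice is the convolution of the individual pmfs.

```python
import numpy as np

# pmf of a fair six-sided die on the values 1..6
p_die = np.full(6, 1.0 / 6.0)

# pmf of Z = X + Y via discrete convolution; entry k corresponds to Z = k + 2
p_sum = np.convolve(p_die, p_die)

print(round(p_sum.sum(), 12))  # 1.0 -- a valid pmf
print(p_sum[7 - 2])            # P(Z = 7) = 6/36, the most likely sum
```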

Example. Sum of two exponential random variables

Let $ \mathbf{X} $ and $ \mathbf{Y} $ be two independent random variables, each exponentially distributed with mean $ \mu $ . Let $ \mathbf{Z}=\mathbf{X}+\mathbf{Y} $ . Find $ f_{\mathbf{Z}}\left(z\right) $ .

For $ z\geq0 $,

$ f_{\mathbf{Z}}\left(z\right)=\int_{-\infty}^{\infty}f_{\mathbf{X}}\left(x\right)f_{\mathbf{Y}}\left(z-x\right)dx=\int_{0}^{z}\frac{1}{\mu}e^{-x/\mu}\cdot\frac{1}{\mu}e^{-\left(z-x\right)/\mu}dx=\frac{1}{\mu^{2}}e^{-z/\mu}\int_{0}^{z}dx=\frac{z}{\mu^{2}}e^{-z/\mu} $,

and $ f_{\mathbf{Z}}\left(z\right)=0 $ for $ z<0 $. (This is an Erlang/Gamma pdf with shape parameter $ 2 $.)
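The sum of two independent exponentials with mean $ \mu $ has pdf $ \frac{z}{\mu^{2}}e^{-z/\mu} $ for $ z\geq0 $. A quick numerical check of this, using a discretized convolution (the value $ \mu=2 $ is just an illustrative choice):

```python
import numpy as np

mu = 2.0                                  # assumed mean, illustrative
dz = 0.01
x = np.arange(0, 40, dz)
f = np.exp(-x / mu) / mu                  # exponential pdf with mean mu

# numeric convolution approximates f_Z = f_X * f_Y on the same grid
f_z = np.convolve(f, f)[: len(x)] * dz

# closed form: f_Z(z) = (z / mu^2) e^{-z/mu}, z >= 0
f_z_exact = (x / mu**2) * np.exp(-x / mu)

# the two agree to within the discretization error O(dz)
print(np.max(np.abs(f_z - f_z_exact)) < 0.01)   # True
```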

Joint Gaussian pdf

$ f_{\mathbf{XY}}(x,y)=\frac{1}{2\pi\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}\sqrt{1-r^{2}}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\right)}\left[\frac{\left(x-\mu_{\mathbf{X}}\right)^{2}}{\sigma_{\mathbf{X}}^{2}}-\frac{2r\left(x-\mu_{\mathbf{X}}\right)\left(y-\mu_{\mathbf{Y}}\right)}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}+\frac{\left(y-\mu_{\mathbf{Y}}\right)^{2}}{\sigma_{\mathbf{Y}}^{2}}\right]\right\} $

If $ r=0 $ ($ \mathbf{X} $ and $ \mathbf{Y} $ are uncorrelated),

$ f_{\mathbf{XY}}(x,y)=\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{X}}}\exp\left\{ -\frac{\left(x-\mu_{\mathbf{X}}\right)^{2}}{2\sigma_{\mathbf{X}}^{2}}\right\} \cdot\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{Y}}}\exp\left\{ -\frac{\left(y-\mu_{\mathbf{Y}}\right)^{2}}{2\sigma_{\mathbf{Y}}^{2}}\right\} =f_{\mathbf{X}}(x)\cdot f_{\mathbf{Y}}(y) $

$ \Longrightarrow\mathbf{X} $ and $ \mathbf{Y} $ are independent.
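The parameterization of the joint Gaussian pdf can be explored numerically: a pair with correlation coefficient $ r $ can be built from two independent standard normals. A sketch with assumed illustrative parameter values:

```python
import numpy as np

# assumed illustrative parameters for the joint Gaussian pdf above
mu_x, mu_y = 1.0, -2.0
sig_x, sig_y = 2.0, 0.5
r = 0.7

rng = np.random.default_rng(0)
u = rng.standard_normal((2, 500_000))    # independent standard normals

# build the correlated pair (a Cholesky-style construction)
x = mu_x + sig_x * u[0]
y = mu_y + sig_y * (r * u[0] + np.sqrt(1 - r**2) * u[1])

# the sample correlation coefficient should be close to r
print(abs(np.corrcoef(x, y)[0, 1] - r) < 0.01)   # True
```

Setting `r = 0.0` in this construction makes `y` depend only on `u[1]`, i.e. the two outputs become independent, mirroring the factorization above.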

If two jointly Gaussian RVs $ \mathbf{X} $ and $ \mathbf{Y} $ are uncorrelated, then $ \mathbf{X} $ and $ \mathbf{Y} $ are independent. This is the exceptional case in which the one-way implication "$ \mathbf{X} $ and $ \mathbf{Y} $ are independent $ \left(\nLeftarrow\right)\Rightarrow \mathbf{X} $ and $ \mathbf{Y} $ are uncorrelated" can be reversed.
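The jointly Gaussian case really is special: outside it, uncorrelated does not imply independent. A quick numerical illustration (this counterexample is not from the notes): take $ \mathbf{X} $ standard normal and $ \mathbf{Y}=\mathbf{X}^{2} $, so $ \mathbf{Y} $ is completely determined by $ \mathbf{X} $, yet $ Cov\left(\mathbf{X},\mathbf{Y}\right)=E\left[\mathbf{X}^{3}\right]=0 $.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = x**2                        # Y is a deterministic function of X: clearly dependent

# Cov(X, Y) = E[X^3] - E[X] E[X^2] = 0 for a standard normal X
print(abs(np.cov(x, y)[0, 1]) < 0.05)   # True: sample covariance near zero
```

Note that the pair $ \left(\mathbf{X},\mathbf{Y}\right) $ here is not jointly Gaussian ($ \mathbf{Y} $ is not even Gaussian), so this does not contradict the fact above.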

Definition. Uncorrelatedness

Two RVs $ \mathbf{X} $ and $ \mathbf{Y} $ are uncorrelated if their covariance is equal to zero. This holds if and only if any of the following three equivalent conditions is true:

1. $ Cov\left(\mathbf{X},\mathbf{Y}\right)=0 $

2. $ r_{\mathbf{XY}}=0 $

3. $ E\left[\mathbf{XY}\right]=E\left[\mathbf{X}\right]E\left[\mathbf{Y}\right] $

Note

$ \mathbf{X} $ and $ \mathbf{Y} $ are uncorrelated $ \Longleftrightarrow E\left[\mathbf{XY}\right]=E\left[\mathbf{X}\right]\cdot E\left[\mathbf{Y}\right]\Longleftrightarrow Cov\left(\mathbf{X},\mathbf{Y}\right)=0 $

$ \mathbf{X} $ and $ \mathbf{Y} $ are independent $ \Longleftrightarrow f_{\mathbf{XY}}(x,y)=f_{\mathbf{X}}(x)\cdot f_{\mathbf{Y}}(y) $

$ \mathbf{X} $ and $ \mathbf{Y} $ are independent $ \left(\nLeftarrow\right)\Longrightarrow \mathbf{X} $ and $ \mathbf{Y} $ are uncorrelated

Definition. Orthogonality

Two RVs $ \mathbf{X} $ and $ \mathbf{Y} $ are orthogonal if $ E\left[\mathbf{XY}\right]=0 $ .

Fact

If $ E\left[\mathbf{X}^{2}\right]<\infty $ and $ E\left[\mathbf{Y}^{2}\right]<\infty $, then $ \left|E\left[\mathbf{XY}\right]\right|\leq\sqrt{E\left[\mathbf{X}^{2}\right]E\left[\mathbf{Y}^{2}\right]} $ with equality iff $ \mathbf{Y}=a_{0}\mathbf{X} $ for some constant $ a_{0} $ .
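This Cauchy–Schwarz bound also holds exactly for sample averages, so it is easy to verify numerically. A sketch (the coefficients $ 0.5 $ and $ 3.0 $ are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = 0.5 * x + rng.standard_normal(100_000)   # correlated with, but not proportional to, x

lhs = abs(np.mean(x * y))
rhs = np.sqrt(np.mean(x**2) * np.mean(y**2))
print(lhs < rhs)               # True: strict inequality since y is not a multiple of x

# equality case: y2 is exactly proportional to x, i.e. Y = a0 * X with a0 = 3
y2 = 3.0 * x
lhs2 = abs(np.mean(x * y2))
rhs2 = np.sqrt(np.mean(x**2) * np.mean(y2**2))
print(abs(lhs2 - rhs2) < 1e-9) # True: the bound is met with equality
```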

Recall

For a quadratic equation $ ax^{2}+bx+c=0,\; a\neq0 $ , the discriminant is $ b^{2}-4ac $ ; the equation has no real roots if and only if the discriminant is negative.

Proof

$ E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]\geq0\Longrightarrow a^{2}E\left[\mathbf{X}^{2}\right]-2aE\left[\mathbf{XY}\right]+E\left[\mathbf{Y}^{2}\right]\geq0. $

n.b. The left-hand side is a quadratic in $ a $ .

Let's consider two cases: (i) $ E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]>0 $ for all $ a $ , and (ii) $ E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]=0 $ for some $ a $ .

(i) $ 0<E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]=a^{2}E\left[\mathbf{X}^{2}\right]-2aE\left[\mathbf{XY}\right]+E\left[\mathbf{Y}^{2}\right] $ for all $ a $ $ \Longrightarrow $ the quadratic in $ a $ has no real roots $ \Longrightarrow $ the discriminant of this quadratic is negative:

$ 4E\left[\mathbf{XY}\right]^{2}-4E\left[\mathbf{X}^{2}\right]E\left[\mathbf{Y}^{2}\right]<0\Longrightarrow\left|E\left[\mathbf{XY}\right]\right|<\sqrt{E\left[\mathbf{X}^{2}\right]E\left[\mathbf{Y}^{2}\right]} $ .

(ii) $ 0=E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]=a^{2}E\left[\mathbf{X}^{2}\right]-2aE\left[\mathbf{XY}\right]+E\left[\mathbf{Y}^{2}\right] $ for some $ a=a_{0} $ .

n.b. In this case the discriminant equals $ 0 $ , so $ a_{0} $ is a double root and $ E\left[\left(a_{0}\mathbf{X}-\mathbf{Y}\right)^{2}\right]=0 $ , which forces $ \mathbf{Y}=a_{0}\mathbf{X} $ (with probability $ 1 $); this is exactly the equality case of the bound.

