

Question from ECE QE January 2001

Let $ \mathbf{X}_{1},\mathbf{X}_{2},\cdots $ be a sequence of random variables that converges in mean square to the random variable $ \mathbf{X} $. Does the sequence also converge to $ \mathbf{X} $ in probability? (A simple yes or no answer is not acceptable, you must derive the result.)


Share and discuss your solutions below.


Solution 1 (retrieved from here)


Since the sequence converges to $ \mathbf{X} $ in mean square, we know that $ E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]\rightarrow0 $ as $ n\rightarrow\infty $.

By Chebyshev's inequality, for any $ \epsilon>0 $,

$ \lim_{n\rightarrow\infty}P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)\leq\lim_{n\rightarrow\infty}\left(\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}\right)=\frac{\lim_{n\rightarrow\infty}E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}=0. $

$ \therefore $ A sequence of random variables that converges in the mean-square sense to the random variable $ \mathbf{X} $ also converges in probability to $ \mathbf{X} $.
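
As a purely illustrative check (not part of the required derivation, and the specific sequence below is an assumption made only for this sketch), the following Python snippet takes $ \mathbf{X}\sim N(0,1) $ and $ \mathbf{X}_{n}=\mathbf{X}+\mathbf{Z}_{n} $ with $ \mathbf{Z}_{n}\sim N(0,1/n) $, so that $ E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]=1/n\rightarrow0 $, and estimates $ P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right) $ by Monte Carlo alongside the Chebyshev bound used above.

```python
import numpy as np

# Illustrative sketch only: X ~ N(0,1) and X_n = X + Z_n with Z_n ~ N(0, 1/n)
# is an assumed example sequence, so E[|X - X_n|^2] = 1/n -> 0 (mean-square
# convergence). We estimate P(|X - X_n| >= eps) and compare it with the
# Chebyshev bound E[|X - X_n|^2] / eps^2 from Solution 1.

rng = np.random.default_rng(0)
num_samples = 100_000   # Monte Carlo samples per n (illustrative choice)
eps = 0.1               # the epsilon in the convergence-in-probability statement

X = rng.standard_normal(num_samples)

for n in [1, 10, 100, 1000]:
    X_n = X + rng.normal(0.0, np.sqrt(1.0 / n), num_samples)

    mse = np.mean((X - X_n) ** 2)            # estimate of E[|X - X_n|^2]
    prob = np.mean(np.abs(X - X_n) >= eps)   # estimate of P(|X - X_n| >= eps)
    bound = mse / eps ** 2                   # Chebyshev bound from the derivation

    print(f"n={n:5d}  E|X-Xn|^2 ~ {mse:.5f}  P(|X-Xn|>=eps) ~ {prob:.5f}  bound {bound:.3f}")
```

Both the estimated mean-square error and the estimated probability shrink as $ n $ grows, and the estimated probability never exceeds the estimated bound, since the underlying inequality holds sample by sample.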

Question: Should we prove the Chebyshev inequality to get full credit?
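
A possible answer: the bound used in Solution 1 follows in one line from Markov's inequality applied to $ \left|\mathbf{X}-\mathbf{X}_{n}\right|^{2} $, so a short sketch along the following lines should suffice. For any $ \epsilon>0 $,

$ P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)=P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\geq\epsilon^{2}\right)=E\left[\mathbf{1}_{\left\{\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\geq\epsilon^{2}\right\}}\right]\leq E\left[\frac{\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}}{\epsilon^{2}}\mathbf{1}_{\left\{\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\geq\epsilon^{2}\right\}}\right]\leq\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}, $

where the first inequality holds because the ratio is at least 1 on the event in the indicator, and the second because dropping the indicator can only increase the nonnegative integrand.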

Solution 2

Write it here.


Back to QE CS question 1, January 2001

Back to ECE Qualifying Exams (QE) page
