  
 
==Question from [[ECE_PhD_QE_CNSIP_Jan_2001_Problem1|ECE QE January 2001]]==  
 
Let <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots</math> be a sequence of random variables that converges in mean square to the random variable <math class="inline">\mathbf{X}</math>. Does the sequence also converge to <math class="inline">\mathbf{X}</math> in probability? (A simple yes or no answer is not acceptable; you must derive the result.)
 
----
 
 
==Share and discuss your solutions below.==
 
 
----
 
 
=Solution 1 (retrieved from [[ECE600_QE_2000_August|here]])=
 
Let <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots</math> be a sequence of random variables that converges in mean square to the random variable <math class="inline">\mathbf{X}</math>. Does the sequence also converge to <math class="inline">\mathbf{X}</math> in probability? (A simple yes or no answer is not acceptable; you must derive the result.)
 
We know that <math class="inline">E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]\rightarrow0</math> as <math class="inline">n\rightarrow\infty</math>. To establish convergence in probability, we must show that <math class="inline">P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)\rightarrow0</math> as <math class="inline">n\rightarrow\infty</math> for every <math class="inline">\epsilon>0</math>.
 
By using the [[ECE 600 Chebyshev Inequality|Chebyshev Inequality]], for any <math class="inline">\epsilon>0</math>,
 
<math class="inline">\lim_{n\rightarrow\infty}P\left(\left\{ \mathbf{X}-\mathbf{X}_{n}\right\} \geq\epsilon\right)\leq\lim_{n\rightarrow\infty}\left(\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}\right)=\frac{\lim_{n\rightarrow\infty}E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}=0.</math>
 
<math class="inline">\therefore</math>  A sequence of random variable that converge in mean square sense to the random variable <math class="inline">\mathbf{X}</math> , also converges in probability to <math class="inline">\mathbf{X}</math> .
::<span style="color:green">Question: Should we prove Chebyshev Inequality to get full credit?</span>
 
----
 
 
==Solution 2==
 
Write it here.