[[Category:problem solving]]
[[Category:random variables]]
[[Category:probability]]
  
==Question from [[ECE_PhD_QE_CNSIP_Jan_2001_Problem1|ECE QE January 2001]]==
<center>
<font size= 4>
[[ECE_PhD_Qualifying_Exams|ECE Ph.D. Qualifying Exam]]
</font size>

<font size= 4>
Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes
</font size>

January 2001
</center>
----
----
=Part 3=
Let <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots</math> be a sequence of random variables that converges in mean square to the random variable <math class="inline">\mathbf{X}</math>. Does the sequence also converge to <math class="inline">\mathbf{X}</math> in probability? (A simple yes or no answer is not acceptable; you must derive the result.)
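For intuition only (the exam requires the derivation given in the solutions below), here is a small numerical sketch in Python. The sequence <math class="inline">\mathbf{X}_{n}=\mathbf{X}+\mathbf{Z}_{n}/n</math>, with <math class="inline">\mathbf{Z}_{n}</math> i.i.d. standard normal and independent of <math class="inline">\mathbf{X}</math>, is an assumed example chosen so that <math class="inline">E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]=1/n^{2}\rightarrow0</math>; the Monte Carlo estimates of <math class="inline">P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)</math> shrink toward zero as well.

<pre>
import numpy as np

# Hypothetical example sequence (not from the exam): X_n = X + Z_n / n with
# Z_n ~ N(0,1) independent of X, so E[|X - X_n|^2] = 1/n^2 -> 0 (mean-square convergence).
rng = np.random.default_rng(0)
num_samples = 100_000   # Monte Carlo sample size
epsilon = 0.05          # threshold in the definition of convergence in probability

X = rng.normal(size=num_samples)   # samples of the limit random variable X

for n in (1, 10, 100, 1000):
    Z_n = rng.normal(size=num_samples)
    X_n = X + Z_n / n   # samples of the n-th random variable in the sequence

    mse = np.mean(np.abs(X - X_n) ** 2)          # estimate of E[|X - X_n|^2]
    prob = np.mean(np.abs(X - X_n) >= epsilon)   # estimate of P(|X - X_n| >= epsilon)

    print(f"n = {n:4d}   E[|X - X_n|^2] ~ {mse:.2e}   "
          f"P(|X - X_n| >= {epsilon}) ~ {prob:.4f}")
</pre>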
 
----
 
----
 
==Share and discuss your solutions below.==
 
----
 
----
 
=Solution 1 (retrieved from [[ECE600_QE_2000_August|here]])=
Let <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots</math> be a sequence of random variables that converges in mean square to the random variable <math class="inline">\mathbf{X}</math>. Does the sequence also converge to <math class="inline">\mathbf{X}</math> in probability? (A simple yes or no answer is not acceptable; you must derive the result.)

We know that <math class="inline">E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]\rightarrow0</math> as <math class="inline">n\rightarrow\infty</math>.

By the [[ECE 600 Chebyshev Inequality|Chebyshev inequality]], for every <math class="inline">\epsilon>0</math>,

<math class="inline">\lim_{n\rightarrow\infty}P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)\leq\lim_{n\rightarrow\infty}\left(\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}\right)=\frac{\lim_{n\rightarrow\infty}E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}=0.</math>

<math class="inline">\therefore</math> A sequence of random variables that converges in mean square to the random variable <math class="inline">\mathbf{X}</math> also converges in probability to <math class="inline">\mathbf{X}</math>.

::<span style="color:green">Question: Should we prove the Chebyshev inequality to get full credit?</span>
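:::<span style="color:green">For reference, here is a minimal sketch of the bound used above, namely Markov's inequality applied to <math class="inline">\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}</math> (whether a proof is required for full credit is up to the graders):</span>

:::For any non-negative random variable <math class="inline">\mathbf{Y}</math> and any <math class="inline">a>0</math>,

:::<math class="inline">E\left[\mathbf{Y}\right]\geq E\left[\mathbf{Y}\cdot\mathbf{1}_{\left\{ \mathbf{Y}\geq a\right\} }\right]\geq a\, P\left(\mathbf{Y}\geq a\right).</math>

:::Taking <math class="inline">\mathbf{Y}=\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}</math> and <math class="inline">a=\epsilon^{2}</math> gives

:::<math class="inline">P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)=P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\geq\epsilon^{2}\right)\leq\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}},</math>

:::which is the inequality applied in the limit above.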
 
----
 
----
 
==Solution 2==

Write it here.
----
----
[[ECE_PhD_QE_CNSIP_Jan_2001_Problem1|Back to QE CS question 1, January 2001]]

[[ECE_PhD_Qualifying_Exams|Back to ECE Qualifying Exams (QE) page]]
