

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2012



Question

Part 1. 25 pts


State and prove the Chebyshev inequality for random variable $ \mathbf{X} $ with mean $ \mu $ and variance $ \sigma^2 $. In constructing your proof, keep in mind that $ \mathbf{X} $ may be either a discrete or continuous random variable.
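
As an informal sanity check (not a substitute for the requested proof), the bound $ P\left(|\mathbf{X}-\mu| \geq k\sigma\right) \leq \frac{1}{k^{2}} $ can be explored numerically. The sketch below assumes Python with NumPy; the exponential distribution is an arbitrary illustrative choice, not part of the exam statement.

import numpy as np

# Empirical check of the Chebyshev inequality:
#   P(|X - mu| >= k*sigma) <= 1/k^2.
# X ~ Exponential(1) is an arbitrary illustrative choice,
# with mean mu = 1 and variance sigma^2 = 1.
rng = np.random.default_rng(0)
n_samples = 1_000_000
x = rng.exponential(scale=1.0, size=n_samples)

mu, sigma = 1.0, 1.0
for k in (1.5, 2.0, 3.0):
    empirical = np.mean(np.abs(x - mu) >= k * sigma)
    bound = 1.0 / k**2
    print(f"k={k}: empirical tail = {empirical:.4f} <= bound = {bound:.4f}")

For every k the empirical tail probability stays below the 1/k^2 bound, as the inequality guarantees for any distribution with finite variance.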


Click here to view student answers and discussions

Part 2. 25 pts


Let $ \mathbf{X}_{1}, \dots, \mathbf{X}_{n}, \dots $ be a sequence of independent, identically distributed random variables, each uniformly distributed on the interval [0, 1], and hence having pdf
$ f_{X}\left(x\right)=\begin{cases} 1, & \text{for } 0 \leq x \leq 1,\\ 0, & \text{elsewhere.} \end{cases} $

Let $ \mathbf{Y}_{n} $ be a new random variable defined by

$ \mathbf{Y}_{n} = \min \left\{ \mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n \right\} $


(a) Find the pdf of $ \mathbf{Y}_{n} $.

(b) Does the sequence $ \mathbf{Y}_{n} $ converge in probability?

(c) Does the sequence $ \mathbf{Y}_{n} $ converge in distribution? If yes, specify the cumulative distribution function of the random variable to which it converges.
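
The following minimal simulation sketch (assuming Python with NumPy) illustrates how $ \mathbf{Y}_{n} $ behaves as n grows; it is only a numerical illustration of the convergence questions above, not an analytical answer.

import numpy as np

# Simulate Y_n = min(X_1, ..., X_n) for i.i.d. X_i ~ Uniform(0, 1)
# and observe how Y_n concentrates near 0 as n grows, which is the
# behavior probed in parts (b) and (c).
rng = np.random.default_rng(0)
n_trials = 100_000
eps = 0.05  # illustrative threshold

for n in (1, 5, 20, 100):
    x = rng.uniform(0.0, 1.0, size=(n_trials, n))
    y_n = x.min(axis=1)
    print(f"n={n:4d}: E[Y_n] ~= {y_n.mean():.4f}, "
          f"P(Y_n > {eps}) ~= {np.mean(y_n > eps):.4f}")

As n increases, both the sample mean of $ \mathbf{Y}_{n} $ and the empirical probability that $ \mathbf{Y}_{n} $ exceeds the threshold shrink toward 0.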


Click here to view student answers and discussions

Back to ECE Qualifying Exams (QE) page
