 
[[Category:ECE]]
[[Category:QE]]
[[Category:AC]]
[[Category:problem solving]]
===Solution 2===
Let <math>Z=X+Y</math>. Then

<math>P_Z(k)=P(Z=k)=P(X+Y=k)\\
=\sum_{i=0}^{k}P(X=i)P(Y=k-i)=\sum_{i=0}^{k}\frac{\lambda_1^i e^{-\lambda_1}}{i!}\cdot\frac{\lambda_2^{k-i}e^{-\lambda_2}}{(k-i)!}\\
=e^{-\lambda_1-\lambda_2}\cdot \frac{(\lambda_1+\lambda_2)^k}{k!}.
</math>
Here we used the binomial identity <math>(a+b)^k=\sum_{i=0}^{k}\frac{k!}{i!(k-i)!}\,a^i b^{k-i}</math>.

Therefore,

<math>
P(X=k\mid Z=n)=\frac{P(X=k,Z=n)}{P(Z=n)}=\frac{P(X=k,X+Y=n)}{P(Z=n)}=\frac{P(X=k,Y=n-k)}{P(Z=n)}\\
=\frac{\lambda_1^k e^{-\lambda_1}}{k!}\cdot\frac{\lambda_2^{n-k} e^{-\lambda_2}}{(n-k)!}\cdot\frac{n!}{e^{-\lambda_1}e^{-\lambda_2}(\lambda_1+\lambda_2)^n}\\
= \frac{\lambda_1^k \lambda_2^{n-k}}{(\lambda_1+\lambda_2)^n}\cdot\frac{n!}{k!(n-k)!}.
</math>
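The convolution step can be checked numerically. The following sketch uses illustrative rates <math>\lambda_1=2</math> and <math>\lambda_2=3</math> (these values are not part of the problem) and compares the convolution sum against the closed-form Poisson pmf:

```python
import math

# Illustrative rates, assumed for this check only (not given in the problem).
lam1, lam2 = 2.0, 3.0

def poisson_pmf(lam, k):
    """P(N = k) for N ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Convolution sum for P(Z = k) with Z = X + Y, versus the closed form
# e^{-(lam1+lam2)} (lam1+lam2)^k / k! derived above.
for k in range(10):
    conv = sum(poisson_pmf(lam1, i) * poisson_pmf(lam2, k - i)
               for i in range(k + 1))
    closed = poisson_pmf(lam1 + lam2, k)
    assert abs(conv - closed) < 1e-12
```

Any pair of positive rates works here; the assertion passing for every <math>k</math> reflects the identity derived above.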
===Solution 3===
We will view this problem through the lens of Bayes' Theorem. As such, we can write the conditional distribution as
<math>
P(X = x | X+Y = n) = \frac{P(X = x, X+Y = n)}{P(X+Y = n)} = \frac{P(X = x, Y = n - x)}{\sum^n_{k = 0}P(X = k, Y = n - k)}
</math>.

Since <math>X</math> and <math>Y</math> are independent, we can further write

<math>
P(X = x | X+Y = n) = \frac{P(X = x)P(Y = n - x)}{\sum^n_{k = 0}P(X=k)P(Y = n-k)}
</math>.
Now let us separate the above expression into numerator and denominator. Recalling that <math>X</math> and <math>Y</math> are independent Poisson r.v.s, the numerator is given by

<math>
P(X = x)P(Y = n - x) = \frac{e^{-\lambda_1}\lambda_1^x}{x!}\cdot\frac{e^{-\lambda_2}\lambda_2^{n-x}}{(n-x)!}
</math>.

Multiplying the above by <math>\frac{n!}{n!}</math> gives

<math>
P(X = x)P(Y = n - x) = \frac{e^{-(\lambda_1 + \lambda_2)}}{n!}{n\choose x}\lambda_1^x\lambda_2^{n-x}
</math>.
Now let us examine the denominator. Again, we make use of the fact that <math>X</math> and <math>Y</math> are independent Poisson r.v.s to write

<math>
\sum^n_{k = 0}P(X=k)P(Y = n-k) = \sum^n_{k = 0}\frac{e^{-\lambda_1}\lambda_1^k}{k!}\cdot\frac{e^{-\lambda_2}\lambda_2^{n-k}}{(n-k)!}
</math>.

Again, we multiply by <math>\frac{n!}{n!}</math> to obtain

<math>
\sum^n_{k = 0}P(X=k)P(Y = n-k) = \frac{e^{-(\lambda_1 + \lambda_2)}}{n!}\sum^n_{k = 0}{n\choose k}\lambda_1^k\lambda_2^{n-k}
</math>.
We can make use of the binomial formula to simplify this expression. Recall that the binomial formula is given by

<math>
(a+b)^n = \sum^n_{k = 0}{n\choose k}a^k b^{n - k}
</math>.

Applying it with <math>a = \lambda_1</math> and <math>b = \lambda_2</math>, we can write

<math>
\sum^n_{k = 0}P(X=k)P(Y = n-k) = \frac{e^{-(\lambda_1 + \lambda_2)}}{n!}(\lambda_1 + \lambda_2)^n
</math>.
Putting this all together, we can finally write

<math>
P(X = x | X+Y = n) = \frac{\frac{e^{-(\lambda_1 + \lambda_2)}}{n!}{n\choose x}\lambda_1^x\lambda_2^{n-x}}{\frac{e^{-(\lambda_1 + \lambda_2)}}{n!}(\lambda_1 + \lambda_2)^n} = {n\choose x}\frac{\lambda_1^x\lambda_2^{n-x}}{(\lambda_1 + \lambda_2)^n},
</math>

which is the binomial p.m.f. with <math>n</math> trials and success probability <math>p = \frac{\lambda_1}{\lambda_1 + \lambda_2}</math>, and we are done.
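The final identity can also be verified numerically. This sketch uses illustrative values <math>\lambda_1=2</math>, <math>\lambda_2=3</math>, and <math>n=6</math> (assumptions for the check, not part of the problem), and compares the conditional pmf computed from the joint distribution against the binomial pmf:

```python
import math

# Illustrative values, assumed for this check only.
lam1, lam2, n = 2.0, 3.0, 6

def poisson_pmf(lam, k):
    """P(N = k) for N ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Conditional pmf P(X = x | X + Y = n) computed from the joint distribution.
joint = [poisson_pmf(lam1, x) * poisson_pmf(lam2, n - x) for x in range(n + 1)]
cond = [j / sum(joint) for j in joint]

# Binomial(n, p) pmf with p = lam1 / (lam1 + lam2), as derived above.
p = lam1 / (lam1 + lam2)
binom = [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

assert all(abs(c - b) < 1e-12 for c, b in zip(cond, binom))
```

Note that <math>{n\choose x}\frac{\lambda_1^x\lambda_2^{n-x}}{(\lambda_1+\lambda_2)^n} = {n\choose x}p^x(1-p)^{n-x}</math> with <math>p = \frac{\lambda_1}{\lambda_1+\lambda_2}</math>, which is what the comparison exercises.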
----
 
===Similar Problem===
 
If <math>X</math> and <math>Y</math> are independent binomial random variables with success probabilities <math>p</math> and <math>q</math> respectively, find the probability mass function of <math>X</math> given that <math>X + Y = k</math>. In addition, investigate what happens to this p.m.f. when <math>p = q</math>.

[https://engineering.purdue.edu/ECE/Academics/Graduates/Archived_QE_August_2015/AC-3?dl=1 2015 QE AC3 Prob1]<br>
[https://engineering.purdue.edu/ECE/Academics/Graduates/Archived_QE_August_2015/AC-3?dl=1 2015 QE AC3 Prob3]<br>
[https://engineering.purdue.edu/ECE/Academics/Graduates/Archived_QE_August_2014/AC-3.pdf?dl=1 2014 QE AC3 Prob2]<br>
 
----
[[ECE-QE_CS1-2015|Back to QE CS question 1, August 2015]]
[[QE2016_AC-3_ECE580|Back to QE AC question 3, August 2016]]
  
 
[[ECE_PhD_Qualifying_Exams|Back to ECE Qualifying Exams (QE) page]]
 

Latest revision as of 11:37, 25 February 2019


ECE Ph.D. Qualifying Exam

Automatic Control (AC)

Question 3: Optimization

August 2016 Problem 1


Solution

The problem is equivalent to:
Minimize $ 2x_1+x_2 $
Subject to $ \begin{align*} &x_1+3x_2-x_3=6\\ &2x_1+x_2-x_4=4\\ &x_1+x_2+x_5=3\\ &x_1, x_2, x_3, x_4, x_5 \geq 0 \end{align*} $
so that $ A= \begin{bmatrix} 1 & 3 & -1 & 0 & 0 \\ 2 & 1 & 0 & -1 & 0 \\ 1 & 1 & 0 & 0 & 1 \end{bmatrix} $.
We take the basis formed by the first, second, and fifth columns of $ A $: $ B= \begin{bmatrix} 1 & 3 & 0 \\ 2 & 1 & 0 \\ 1 & 1 & 1 \end{bmatrix} \Rightarrow B\begin{bmatrix} x_1 \\ x_2 \\ x_5 \end{bmatrix} =b \Rightarrow \begin{bmatrix} x_1 \\ x_2 \\ x_5 \end{bmatrix} = \begin{bmatrix} 1 & 3 & 0 \\ 2 & 1 & 0 \\ 1 & 1 & 1 \end{bmatrix}^{-1} \begin{bmatrix} 6\\ 4\\ 3 \end{bmatrix} = \begin{bmatrix} \dfrac{6}{5} \\ \dfrac{8}{5} \\ \dfrac{1}{5} \end{bmatrix} $
Such that $ x^T=\left[\dfrac{6}{5}, \dfrac{8}{5}, 0, 0, \dfrac{1}{5}\right] $ is a basic feasible solution.
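This basic solution can be checked numerically. A minimal sketch using NumPy; note that $ B $ consists of columns 1, 2, and 5 of $ A $, so the basic variables are $ x_1, x_2, x_5 $ and the nonbasic variables $ x_3, x_4 $ are zero:

```python
import numpy as np

# Standard-form data from the problem.
A = np.array([[1, 3, -1,  0, 0],
              [2, 1,  0, -1, 0],
              [1, 1,  0,  0, 1]], dtype=float)
b = np.array([6, 4, 3], dtype=float)

# B is formed from the columns of A for x1, x2, x5.
basis = [0, 1, 4]
x_B = np.linalg.solve(A[:, basis], b)  # approximately [6/5, 8/5, 1/5]

# Assemble the full solution; nonbasic variables x3 = x4 = 0.
x = np.zeros(5)
x[basis] = x_B

assert np.allclose(A @ x, b) and (x >= 0).all()  # a basic feasible solution
```

Solving the 3-by-3 basis system rather than inverting $ B $ explicitly is numerically preferable but gives the same result.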




