 
<center>
<font size= 4>
'''ECE Ph.D. Qualifying Exam'''

'''Automatic Control (AC)'''

'''Question 3: Optimization'''
</font size>

August 2017

</center>
 
----
 
----
 
1. (20 pts) Consider the following linear program, <br/>
<center> minimize <math>2x_{1} + x_{2}</math>, </center> <br/>
<center> subject to <math>x_{1} + 3x_{2} \geq 6 </math> </center> <br/>
<center> <math>2x_{1} + x_{2} \geq 4</math> </center> <br/>
<center> <math> x_{1} + x_{2} \leq 3 </math> </center> <br/>
<center> <math> x_{1} \geq 0 </math>, <math> x_{2} \geq 0 </math>. </center> <br/>
Convert the above linear program into standard form and find an initial basic feasible solution for the program in standard form. <br/>
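
A quick numerical sanity check of this linear program, a sketch only, assuming NumPy and SciPy are available; <code>scipy.optimize.linprog</code> minimizes over "less-than-or-equal" rows with nonnegative variables by default, so the two "greater-than-or-equal" constraints are negated:

<pre>
import numpy as np
from scipy.optimize import linprog

# minimize 2*x1 + x2
c = np.array([2.0, 1.0])

# x1 + 3*x2 >= 6 and 2*x1 + x2 >= 4 are multiplied by -1 to become <= rows;
# x1 + x2 <= 3 stays as is; the default bounds already enforce x1, x2 >= 0.
A_ub = np.array([[-1.0, -3.0],
                 [-2.0, -1.0],
                 [ 1.0,  1.0]])
b_ub = np.array([-6.0, -4.0, 3.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub)
print(res.x, res.fun)   # the optimal objective value should come out as 4.0
</pre>

The check only confirms that the program is feasible and bounded; the conversion to standard form and the choice of an initial basic feasible solution still have to be carried out by hand.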
 
 
----
 
----
2. (20 pts)
*(15 pts) Find the largest range of the step size, <math> \alpha </math>, for which the fixed-step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function <br/>
<center> <math> f = \frac{1}{2} x^{T}Qx - b^{T}x </math> </center> <br/>
starting from an arbitrary initial condition <math> x^{(0)} \in \mathbb{R}^{n} </math>, where <math> x \in \mathbb{R}^{n} </math> and <math>Q = Q^{T} > 0</math>. <br/>
*(5 pts) Find the largest range of the step size, <math>\alpha</math>, for which the fixed-step gradient descent algorithm is guaranteed to converge to the minimizer of the quadratic function <br/>
<center><math> f= 6x_{1}^{2}+2x_{2}^{2}-5 </math></center> <br/>
starting from an arbitrary initial condition <math>x^{(0)} \in \mathbb{R}^{2}</math>.
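
For quadratics of this type, the standard fixed-step result is that <math>x^{(k+1)} = x^{(k)} - \alpha \nabla f(x^{(k)})</math> converges to the minimizer from every initial condition exactly when <math> 0 < \alpha < 2/\lambda_{\max}(Q)</math>. A minimal numerical sketch of that threshold for the second function, whose Hessian is diag(12, 4), assuming NumPy is available:

<pre>
import numpy as np

Q = np.diag([12.0, 4.0])       # Hessian of f = 6*x1**2 + 2*x2**2 - 5
x0 = np.array([3.0, -2.0])     # an arbitrary starting point

def dist_after_gd(alpha, steps=200):
    # fixed-step gradient descent; grad f(x) = Q @ x, minimizer x* = 0
    x = x0.copy()
    for _ in range(steps):
        x = x - alpha * (Q @ x)
    return np.linalg.norm(x)

print(dist_after_gd(0.16))     # step size below 2/12: the iterates approach x*
print(dist_after_gd(0.18))     # step size above 2/12: the iterates blow up
</pre>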
  
  
 
----
 
----
3. (20 pts) Is the function <br/>
<center><math> f(x_{1}, x_{2})=\frac{1}{(x_{1}-2)^{2} + (x_{2}+1)^{2}+3} </math></center> <br/>
locally convex, concave, or neither in the neighborhood of the point <math> [2, -1]^{T} </math>? Justify your answer by giving all the details of your argument.
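
Local convexity or concavity near a point is determined by the Hessian of <math>f</math> there; a symbolic check of that Hessian, a sketch only, assuming SymPy is available:

<pre>
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
f = 1 / ((x1 - 2)**2 + (x2 + 1)**2 + 3)

# Hessian of f evaluated at the point [2, -1]^T
H = sp.hessian(f, (x1, x2)).subs({x1: 2, x2: -1})
print(H)   # expected: a diagonal matrix with entries -2/9, i.e. negative definite
</pre>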

----
4. (20 pts) Solve the following optimization problem: <br/>
<center>optimize <math> x_{1}x_{2} </math> </center> <br/>
<center>subject to <math> x_{1}+x_{2}+x_{3}=1 </math> </center> <br/>
<center><math> x_{1}+x_{2}-x_{3}=0 </math></center> <br/>
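
The first-order (Lagrange) conditions for this equality-constrained problem can be checked symbolically; a sketch assuming SymPy is available, with <code>lambda1</code> and <code>lambda2</code> standing for the multipliers of the two constraints:

<pre>
import sympy as sp

x1, x2, x3, lam1, lam2 = sp.symbols('x1 x2 x3 lambda1 lambda2', real=True)

# Lagrangian for: optimize x1*x2  s.t.  x1 + x2 + x3 = 1  and  x1 + x2 - x3 = 0
L = x1*x2 + lam1*(x1 + x2 + x3 - 1) + lam2*(x1 + x2 - x3)
stationarity = [sp.diff(L, v) for v in (x1, x2, x3, lam1, lam2)]

# solves the stationarity and feasibility equations simultaneously
print(sp.solve(stationarity, [x1, x2, x3, lam1, lam2], dict=True))
</pre>

The single candidate this produces is <math>x = (1/4, 1/4, 1/2)</math> with objective value <math>1/16</math>; whether it is a maximizer or a minimizer still has to be settled by a second-order (or direct) argument on the feasible set.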
 
----
5. (20 pts) Solve the following optimization problem: <br/>
<center>maximize <math> 14x_{1}-x_{1}^{2}+6x_{2}-x_{2}^{2}+7 </math></center> <br/>
<center>subject to <math>x_{1}+x_{2} \leq 2</math></center> <br/>
<center><math> x_{1}+2x_{2} \leq 3 </math></center>
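
A quick numerical cross-check of this problem, a sketch only, assuming NumPy and SciPy are available; the maximization is recast as minimizing <math>-f</math> subject to the two linear inequality constraints:

<pre>
import numpy as np
from scipy.optimize import minimize

def neg_f(x):
    # negative of the objective, since scipy.optimize.minimize minimizes
    return -(14*x[0] - x[0]**2 + 6*x[1] - x[1]**2 + 7)

constraints = [
    {'type': 'ineq', 'fun': lambda x: 2 - x[0] - x[1]},    # x1 + x2 <= 2
    {'type': 'ineq', 'fun': lambda x: 3 - x[0] - 2*x[1]},  # x1 + 2*x2 <= 3
]

res = minimize(neg_f, x0=np.zeros(2), constraints=constraints)
print(res.x, -res.fun)   # numerically this lands near [3, -1] with value 33
</pre>

A full answer would, of course, verify the corresponding KKT conditions by hand rather than rely on a solver run.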
 
----

----

[[ECE_PhD_Qualifying_Exams|Back to ECE QE page]]
