==Problem 1==
# Calculate an expression for <math>\lambda_n^c</math>, the X-ray energy corrected for the dark current.

a) <math>\lambda_n^c=\lambda_n^b-\lambda_n^d</math>

# Calculate an expression for <math>G_n</math>, the X-ray attenuation due to the object's presence.

b) <math>G_n = \frac{d\lambda_n^c}{dx}=-\mu (x,y_0+n\Delta d)\lambda_n^c</math>

# Calculate an expression for <math>\hat{P}_n</math>, an estimate of the integral intensity in terms of <math>\lambda_n</math>, <math>\lambda_n^b</math>, and <math>\lambda_n^d</math>.

c) <math>\lambda_n = \lambda_n^c e^{-\int_{0}^{x}\mu(t)dt} \Longrightarrow \hat{P}_n = \int_{0}^{x}\mu(t)dt= -\ln\left(\frac{\lambda_n}{\lambda_n^c}\right) = -\ln\left(\frac{\lambda_n}{\lambda_n^b-\lambda_n^d}\right)</math>

# For this part, assume that the object is of constant density with <math>\mu(x,y)=\mu_0</math>. Then sketch a plot of <math>\hat{P}_n</math> versus the object thickness, <math>T_n</math>, in <math>mm</math>, for the <math>n^{th}</math> detector. Label key features of the curve such as its slope and intersection.

d) <math>\hat{P}_n = \int_{0}^{T_n}\mu_0 dt = \mu_0 T_n</math>

The plot is a straight line through the origin with slope <math>\mu_0</math>.
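The dark-current correction in part a) and the log-transform in part c) can be checked numerically. The sketch below is a minimal NumPy example; the count levels, <math>\mu_0</math>, and thicknesses are made-up illustration values, not from the exam.

```python
import numpy as np

mu0 = 0.02                             # assumed attenuation coefficient, 1/mm
T = np.array([0.0, 5.0, 10.0, 20.0])   # hypothetical object thicknesses, mm

lam_b = 1000.0                         # blank (no-object) measurement
lam_d = 50.0                           # dark-current measurement
lam_c = lam_b - lam_d                  # part a): dark-current-corrected energy

# Beer's law for a constant-density object of thickness T
lam = lam_c * np.exp(-mu0 * T)

# part c): the projection estimate recovers the line integral of mu
P_hat = -np.log(lam / (lam_b - lam_d))

# part d): P_hat is linear in T with slope mu0 and zero intercept
assert np.allclose(P_hat, mu0 * T)
```

The final assertion is exactly the straight-line claim of part d): plotting `P_hat` against `T` gives slope <math>\mu_0</math> and intercept zero.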
 
==Problem 2==
# Specify the size of <math>YY^t</math> and <math>Y^tY</math>. Which matrix is smaller?

a) Since <math>Y = U \Sigma V^t</math> is of size <math>p \times N</math>, <math>U</math> is <math>p \times N</math> and <math>\Sigma</math> and <math>V</math> are <math>N \times N</math>.

<math>YY^t = (p\times N)(N\times p) = p \times p</math>

<math>Y^tY = (N\times p)(p\times N) = N \times N</math>

<math>Y^tY</math> is the smaller matrix, since <math>N \ll p</math>.

# Prove that <math>YY^t</math> and <math>Y^tY</math> are both symmetric and positive semi-definite matrices.

b) <math>YY^t = U \Sigma V^t V \Sigma U^t = U\Sigma^2 U^t</math> and <math>(YY^t)^t = (U\Sigma^2 U^t)^t = U\Sigma^2 U^t = YY^t</math>. Therefore, <math>YY^t</math> is symmetric.

For an arbitrary vector <math>x</math>, <math>x^tYY^tx = x^t U\Sigma \Sigma U^t x=(\Sigma U^t x)^t\Sigma U^t x=\|\Sigma U^t x\|^2 \geq 0</math>. Therefore, <math>YY^t</math> is positive semi-definite.

Similarly, <math>Y^tY = V \Sigma^2 V^t</math> and <math>(Y^tY)^t = (V \Sigma^2 V^t)^t = V \Sigma^2 V^t = Y^tY</math>, so <math>Y^tY</math> is symmetric.

For an arbitrary vector <math>x</math>, <math>x^tY^tYx = x^t V\Sigma \Sigma V^t x=(\Sigma V^t x)^t\Sigma V^t x=\|\Sigma V^t x\|^2 \geq 0</math>. Therefore, <math>Y^tY</math> is positive semi-definite.

# Derive expressions for <math>V</math> and <math>\Sigma</math> in terms of <math>T</math> and <math>D</math>.

c) From part b), <math>Y^tY = V\Sigma^2 V^t</math>, and the eigendecomposition gives <math>Y^tY = TDT^t</math>. Therefore <math>V = T</math> and <math>\Sigma = D^{1/2}</math>.

# Derive expressions for <math>U</math> in terms of <math>Y</math>, <math>T</math>, and <math>D</math>.

d) <math>U\Sigma V^t=Y \Longrightarrow U = Y(\Sigma V^t)^{-1} = Y(D^{1/2}T^t)^{-1}</math>

# Derive expressions for <math>E</math> in terms of <math>Y</math>, <math>T</math>, and <math>D</math>.

e) <math>YY^t = U\Sigma^2 U^t = E\Gamma E^t</math>, therefore

<math>E = U = Y(D^{1/2}T^t)^{-1}</math>

# If the columns of <math>Y</math> are images from a training database, then what name do we give to the columns of <math>U</math>?

f) They are called '''eigenimages'''.
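The chain of identities in parts a) through f) can be verified numerically. The sketch below is a minimal NumPy example with made-up sizes (<math>p=100</math>, <math>N=5</math>) and a random training matrix: it forms the small <math>N\times N</math> matrix <math>Y^tY</math>, eigendecomposes it as <math>TDT^t</math>, and recovers the eigenimages <math>U = Y(D^{1/2}T^t)^{-1}</math> without ever decomposing the large <math>p\times p</math> matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
p, N = 100, 5                        # assumed sizes: N training images, p pixels each
Y = rng.standard_normal((p, N))      # hypothetical training matrix

# a) Y^t Y is the smaller (N x N) matrix
S = Y.T @ Y
assert S.shape == (N, N) and (Y @ Y.T).shape == (p, p)

# b) Y^t Y is symmetric and positive semi-definite
assert np.allclose(S, S.T)
eigvals, T_mat = np.linalg.eigh(S)   # eigendecomposition Y^t Y = T D T^t
assert np.all(eigvals >= -1e-10)

# c) Sigma = D^{1/2}
Sigma = np.diag(np.sqrt(eigvals))

# d)/e) U = E = Y (D^{1/2} T^t)^{-1}; its columns are the eigenimages of f)
U = Y @ np.linalg.inv(Sigma @ T_mat.T)

# U has orthonormal columns and Y = U Sigma T^t, as required by the SVD
assert np.allclose(U.T @ U, np.eye(N), atol=1e-6)
assert np.allclose(U @ Sigma @ T_mat.T, Y)
```

This is the standard computational shortcut behind eigenimage/eigenface methods: since <math>N \ll p</math>, eigendecomposing <math>Y^tY</math> is far cheaper than eigendecomposing <math>YY^t</math>.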
Revision as of 19:34, 9 July 2019


ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 5: Image Processing

August 2016 (Published in Jul 2019)

