<object width="425" height="355"><param name="movie" value="http://www.youtube.com/v/wzJkaATyitA&rel=1"></param><param name="wmode" value="transparent"></param><embed src="http://www.youtube.com/v/wzJkaATyitA&rel=1" type="application/x-shockwave-flash" wmode="transparent" width="425" height="355"></embed></object>


The video demonstrates the Bayes decision rule on 2D feature data from two classes. We visualize the decision hypersurface as a red "wall" cutting through the bimodal distribution of the data and observe how it changes with the parameters of the two classes' Gaussian distributions. Note that if the covariance matrices and priors of the two classes are identical, the decision surface cuts directly between the two modes. If the prior of one class increases, the decision surface is "pushed away" from that class's mode, biasing the classifier in favour of the more likely class.

The code used to make this video is available here: <a href="BayesDecisionSurface.tar.gz">BayesDecisionSurface.tar.gz</a>
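
For readers who want to experiment with the decision rule itself (separately from the video-generation code linked above), here is a minimal Python sketch using NumPy and SciPy. The means, covariance, and priors are made-up values chosen only to illustrate the two effects described above; this is not the code in BayesDecisionSurface.tar.gz.

<pre>
# Minimal sketch of the Bayes decision rule for two 2D Gaussian classes.
# All parameter values below are illustrative assumptions, not taken from the video.
import numpy as np
from scipy.stats import multivariate_normal

# Class-conditional densities p(x | class i): two Gaussian modes with identical covariance.
mu1, mu2 = np.array([0.0, 0.0]), np.array([3.0, 0.0])
cov = np.eye(2)
rv1 = multivariate_normal(mu1, cov)
rv2 = multivariate_normal(mu2, cov)

def bayes_decide(x, prior1, prior2):
    """Pick the class with the larger posterior; p(class i | x) is proportional to p(x | class i) * prior i."""
    g1 = rv1.pdf(x) * prior1
    g2 = rv2.pdf(x) * prior2
    return 1 if g1 >= g2 else 2

# Equal priors and identical covariances: the boundary is the perpendicular
# bisector of the two means, here the line x1 = 1.5.
print(bayes_decide([1.4, 0.0], 0.5, 0.5))   # -> 1
print(bayes_decide([1.6, 0.0], 0.5, 0.5))   # -> 2

# Raising the prior of class 1 pushes the boundary away from class 1's mode
# (to roughly x1 = 1.96 for these priors), so the same point is now assigned to class 1.
print(bayes_decide([1.6, 0.0], 0.8, 0.2))   # -> 1
</pre>

With identical covariances and equal priors the boundary is the perpendicular bisector of the two means; raising one class's prior shifts the boundary toward the other class's mode, exactly the behaviour shown in the video.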

For more on Bayes' decision rule, see [Lecture 6]
