

Lecture 11 Blog, ECE662 Spring 2012, Prof. Boutin

Tuesday February 14, 2012 (Week 6)


Quick link to lecture blogs: 1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|16|17|18|19|20|21|22|23|24|25|26|27|28|29|30


Today we studied a different way to formulate the Bayes error in the two-category case. The key idea in this formulation is to view the discriminant function as a random variable. By changing the variable of integration from the feature vector to the value of the discriminant function, we end up computing two 1D integrals instead of two n-dimensional integrals, each over a complicated decision region. We illustrated how this formulation can be used to obtain an explicit analytical expression for the Bayes error when the feature vectors of both classes are normally distributed with the same covariance matrix.
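The following is a minimal sketch of the equal-covariance computation, assuming the discriminant is taken to be the log-likelihood ratio and writing \Phi for the standard normal CDF (the notation is illustrative and not taken verbatim from the lecture). With class densities N(\mu_1,\Sigma) and N(\mu_2,\Sigma) and priors P(\omega_1), P(\omega_2),

g(\mathbf{x}) = \ln\frac{p(\mathbf{x}|\omega_1)\,P(\omega_1)}{p(\mathbf{x}|\omega_2)\,P(\omega_2)} = (\mu_1-\mu_2)^T\Sigma^{-1}\mathbf{x} - \tfrac{1}{2}\left(\mu_1^T\Sigma^{-1}\mu_1 - \mu_2^T\Sigma^{-1}\mu_2\right) + \ln\frac{P(\omega_1)}{P(\omega_2)}

is affine in \mathbf{x}, so its class-conditional distributions are one-dimensional Gaussians:

E[g\,|\,\omega_1] = \tfrac{d^2}{2} + \ln\tfrac{P(\omega_1)}{P(\omega_2)}, \qquad E[g\,|\,\omega_2] = -\tfrac{d^2}{2} + \ln\tfrac{P(\omega_1)}{P(\omega_2)}, \qquad \operatorname{Var}[g\,|\,\omega_i] = d^2,

where d^2 = (\mu_1-\mu_2)^T\Sigma^{-1}(\mu_1-\mu_2) is the squared Mahalanobis distance between the class means. The Bayes error then reduces to two 1D Gaussian tail integrals:

P(\text{error}) = P(\omega_1)\,P(g<0\,|\,\omega_1) + P(\omega_2)\,P(g>0\,|\,\omega_2) = P(\omega_1)\,\Phi\!\left(-\tfrac{d}{2} - \tfrac{1}{d}\ln\tfrac{P(\omega_1)}{P(\omega_2)}\right) + P(\omega_2)\,\Phi\!\left(-\tfrac{d}{2} + \tfrac{1}{d}\ln\tfrac{P(\omega_1)}{P(\omega_2)}\right),

which for equal priors simplifies to

P(\text{error}) = \Phi\!\left(-\tfrac{d}{2}\right) = \frac{1}{\sqrt{2\pi}}\int_{d/2}^{\infty} e^{-u^2/2}\,du.

The short Python sketch below checks this closed-form expression against a Monte Carlo estimate; the means, covariance, and sample size are made-up illustrative values, not taken from the lecture.

# Hypothetical numerical check of the closed-form Bayes error sketched above
# (parameter values are illustrative, not from the lecture).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Two Gaussian classes with the same covariance matrix and equal priors
mu1 = np.array([0.0, 0.0])
mu2 = np.array([2.0, 1.0])
Sigma = np.array([[1.0, 0.3], [0.3, 1.0]])

# Mahalanobis distance between the class means
d = np.sqrt((mu1 - mu2) @ np.linalg.solve(Sigma, mu1 - mu2))

# Analytical Bayes error for equal priors: Phi(-d/2)
analytical = norm.cdf(-d / 2)

# Monte Carlo estimate: classify with the log-likelihood-ratio discriminant g(x)
n = 200_000
w = np.linalg.solve(Sigma, mu1 - mu2)        # weight vector of g
w0 = -0.5 * (mu1 + mu2) @ w                  # bias term (equal priors)
x1 = rng.multivariate_normal(mu1, Sigma, n)  # samples from class 1
x2 = rng.multivariate_normal(mu2, Sigma, n)  # samples from class 2
err1 = np.mean(x1 @ w + w0 < 0)              # class 1 samples misclassified
err2 = np.mean(x2 @ w + w0 > 0)              # class 2 samples misclassified
empirical = 0.5 * (err1 + err2)

print(f"analytical Bayes error: {analytical:.4f}, Monte Carlo: {empirical:.4f}")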

Action items


Previous: Lecture 10

Next: Lecture 12


Comments

Please write your comments and questions below.


Back to ECE662 Spring 2012
