

Lecture 6 Blog, ECE662 Spring 2012, Prof. Boutin

Thursday January 26, 2012 (Week 3)


Quick links to lecture blogs: 1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|16|17|18|19|20|21|22|23|24|25|26|27|28|29|30


Today we began talking about an important subject in decision theory: Bayes rule for normally distributed feature vectors. We proposed a simple discriminant function for this special case and noted its geometric meaning. To better understand this geometric meaning, we first considered the special case where the class density has the identity matrix as its covariance matrix. We noticed that in this case the value of the discriminant function is constant along circles centered at the mean of the class density, and that the closer the feature vector is to the mean of the class density (in the usual, Euclidean sense), the larger the value of the discriminant function $ g_i(x) $ for that class.
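For reference, here is the standard form of this discriminant function from Bayes decision theory (the notation used in lecture may differ slightly). Taking $ g_i(x) = \ln p(x|\omega_i) + \ln P(\omega_i) $ for a class $ \omega_i $ with density $ N(\mu_i, \Sigma_i) $ in $ \mathbb{R}^d $ gives

$ g_i(x) = -\frac{1}{2}(x-\mu_i)^T \Sigma_i^{-1} (x-\mu_i) - \frac{d}{2}\ln 2\pi - \frac{1}{2}\ln |\Sigma_i| + \ln P(\omega_i). $

In the special case $ \Sigma_i = I $, after dropping the terms that are the same for every class, this reduces to

$ g_i(x) = -\frac{1}{2}\|x-\mu_i\|^2 + \ln P(\omega_i), $

which is constant on circles (spheres, in higher dimensions) centered at $ \mu_i $ and increases as $ x $ gets closer to $ \mu_i $ in the Euclidean sense.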

We also spent a lot of time discussing the first homework. Hopefully you are all beginning to think about possible questions to investigate and how you are going to attack them.

Previous: Lecture 5

Next: Lecture 7


Comments

Please write your comments and questions below.


Back to ECE662 Spring 2012
