[[Category:ECE662Spring2012Boutin]]
[[Category:blog]]
=Lecture 11 Blog, [[ECE662]] Spring 2012, [[user:mboutin|Prof. Boutin]]=
Tuesday February 14, 2012 (Week 6)  
 
----
Quick link to lecture blogs: [[Lecture1ECE662S12|1]]|[[Lecture2ECE662S12|2]]|[[Lecture3ECE662S12|3]]|[[Lecture4ECE662S12|4]]|[[Lecture5ECE662S12|5]]|[[Lecture6ECE662S12|6]]|[[Lecture7ECE662S12|7]]|[[Lecture8ECE662S12|8]]| [[Lecture9ECE662S12|9]]|[[Lecture10ECE662S12|10]]|[[Lecture11ECE662S12|11]]|[[Lecture12ECE662S12|12]]|[[Lecture13ECE662S12|13]]|[[Lecture14ECE662S12|14]]|[[Lecture15ECE662S12|15]]|[[Lecture16ECE662S12|16]]|[[Lecture17ECE662S12|17]]|[[Lecture18ECE662S12|18]]|[[Lecture19ECE662S12|19]]|[[Lecture20ECE662S12|20]]|[[Lecture21ECE662S12|21]]|[[Lecture22ECE662S12|22]]|[[Lecture23ECE662S12|23]]|[[Lecture24ECE662S12|24]]|[[Lecture25ECE662S12|25]]|[[Lecture26ECE662S12|26]]|[[Lecture27ECE662S12|27]]|[[Lecture28ECE662S12|28]]|[[Lecture29ECE662S12|29]]|[[Lecture30ECE662S12|30]]
 
----
 
Today we studied a different way to formulate the Bayes error in the two-category case. The key idea in this new formulation is to view the discriminant function as a random variable. By changing the integration variable from the feature vector to the discriminant function, we end up having to compute two 1D integrals, as opposed to two n-dimensional integrals, each over a complicated domain. We then illustrated how this formulation can be used to obtain an explicit analytical expression for the Bayes error when the feature vectors are normally distributed with the same covariance matrix for both classes.
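As a sketch of where the change of variables leads in that equal-covariance Gaussian case (the notation below, <math>g</math> for the discriminant, <math>r</math> for the Mahalanobis distance between the class means, and <math>\Phi</math> for the standard normal cdf, is assumed here for illustration and is not fixed by the blog itself): taking the discriminant to be the log-likelihood ratio plus the log-prior ratio, under each class <math>g(\mathbf{x})</math> is a one-dimensional Gaussian,

<math>g(\mathbf{x})\,|\,\omega_1 \sim N\!\left(\frac{r^2}{2}+t,\;r^2\right), \qquad g(\mathbf{x})\,|\,\omega_2 \sim N\!\left(-\frac{r^2}{2}+t,\;r^2\right), \qquad t=\ln\frac{P(\omega_1)}{P(\omega_2)}, \qquad r^2=(\mu_1-\mu_2)^T\Sigma^{-1}(\mu_1-\mu_2),</math>

so the two 1D integrals reduce to normal tail probabilities:

<math>\text{Err}_{\text{Bayes}} \;=\; P(\omega_1)\,\Phi\!\left(\frac{-t-r^2/2}{r}\right) \;+\; P(\omega_2)\,\Phi\!\left(\frac{t-r^2/2}{r}\right).</math>

With equal priors (<math>t=0</math>) this collapses to <math>\Phi(-r/2)</math>, so the Bayes error depends on the two means and the common covariance matrix only through <math>r</math>.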


==Action items==


Previous: [[Lecture10ECE662S12|Lecture 10]]

Next: [[Lecture12ECE662S12|Lecture 12]]


==Comments==

Please write your comments and questions below.

*Write a comment here
*Write another comment here.

Back to ECE662 Spring 2012
