 
[[Category:2010 Spring ECE 662 mboutin]]

=Details of Lecture 22, [[ECE662]] Spring 2010, [[User:mboutin|Prof. Boutin]]=
 
In Lecture 22, we continued our discussion of [[Fisher Linear Discriminant|Fisher's linear discriminant]]. We began by answering the question: why not use

<math>J(\vec{w})=\frac{\| \tilde{m}_1-\tilde{m}_2\|^2}{\|\vec{w} \|^2}</math> instead of <math>J(\vec{w})=\frac{\| \tilde{m}_1-\tilde{m}_2 \|^2}{\tilde{s}_1^2+\tilde{s}_2^2}</math>?
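
One way to see the answer (spelling out a step the summary leaves implicit): since <math>\tilde{m}_i = \vec{w} \cdot \vec{m}_i</math>, the Cauchy-Schwarz inequality gives

<math>\frac{\| \tilde{m}_1-\tilde{m}_2\|^2}{\|\vec{w}\|^2} = \frac{\left( \vec{w} \cdot (\vec{m}_1-\vec{m}_2) \right)^2}{\|\vec{w}\|^2} \leq \| \vec{m}_1-\vec{m}_2 \|^2,</math>

with equality exactly when <math>\vec{w}</math> is parallel to <math>\vec{m}_1-\vec{m}_2</math>. So the first criterion always picks the direction joining the two sample means, no matter how the classes are scattered, and elongated classes can still project on top of each other. Normalizing by the within-class scatter <math>\tilde{s}_1^2+\tilde{s}_2^2</math> instead penalizes directions along which the projected classes spread out.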
 
We then presented the analytic expression for <math>\vec{w}_0</math>, the argmax of <math>J(\vec{w})</math>, and related <math>\vec{w}_0</math> to the least-squares solution of <math>Y \vec{c}=b</math>. We noted the relationship between [[Fisher Linear Discriminant|Fisher's linear discriminant]] and [[Feature Extraction|feature extraction]].
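
For concreteness, the standard closed form is <math>\vec{w}_0 \propto S_W^{-1}(\vec{m}_1-\vec{m}_2)</math>, where <math>S_W</math> is the within-class scatter matrix. Below is a minimal NumPy sketch, on made-up Gaussian data rather than anything from the lecture, that computes this direction and checks it against the least-squares solution of <math>Y \vec{c}=b</math> in the usual Duda-Hart-Stork setup (class-2 rows of <math>Y</math> negated, margins <math>b_i = n/n_i</math>); the two directions agree up to scale.

<pre>
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic Gaussian classes in R^2 (illustrative only)
X1 = rng.normal(loc=[0.0, 0.0], scale=[1.0, 3.0], size=(100, 2))
X2 = rng.normal(loc=[4.0, 2.0], scale=[1.0, 3.0], size=(120, 2))
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter matrix S_W = S_1 + S_2
SW = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# Analytic Fisher direction: w_0 proportional to S_W^{-1} (m_1 - m_2)
w0 = np.linalg.solve(SW, m1 - m2)

# Least-squares route: Y c = b with class-2 samples negated and
# margin vector b_i = n/n_i (Duda-Hart-Stork choice)
n1, n2 = len(X1), len(X2)
n = n1 + n2
Y = np.vstack([np.hstack([np.ones((n1, 1)), X1]),
               -np.hstack([np.ones((n2, 1)), X2])])
b = np.concatenate([np.full(n1, n / n1), np.full(n2, n / n2)])
c = np.linalg.lstsq(Y, b, rcond=None)[0]
w_ls = c[1:]  # drop the bias term; only the direction matters

# Both normalized directions should print the same vector
print(w0 / np.linalg.norm(w0))
print(w_ls / np.linalg.norm(w_ls))
</pre>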
 
Finally, we began Section 9 of the course on [[Support Vector Machines]] by introducing the idea of extending the feature vector space into a space spanned by monomials.
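
As a quick illustration of the monomial idea (a toy example, not from the lecture): the XOR configuration below is not linearly separable in the original plane, but after mapping each point to its degree-2 monomials the single coordinate <math>x_1 x_2</math> already separates the two classes.

<pre>
import numpy as np

def monomial_features(X):
    """Map (x1, x2) to the monomials (1, x1, x2, x1^2, x1*x2, x2^2)."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2,
                            x1**2, x1 * x2, x2**2])

# XOR-style points: not linearly separable in the original plane...
X = np.array([[1.0, 1.0], [-1.0, -1.0], [1.0, -1.0], [-1.0, 1.0]])
y = np.array([1, 1, -1, -1])

Z = monomial_features(X)
# ...but in monomial space the x1*x2 coordinate (column 4) equals the
# class label, so a linear classifier in Z separates the data perfectly.
print(Z[:, 4] * y)  # [1. 1. 1. 1.] -> all on the correct side
</pre>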
 
==Useful Links==
For more info, you may look at these students' pages on Fisher's linear discriminant:
* [[Derivation_of_Fisher's_Linear_Discriminant_OldKiwi|Definition of Fisher's linear discriminant]]
* [[Fisher_Linear_Discriminant_OldKiwi|Fisher's linear discriminant in brief]]
Previous: [[Lecture21ECE662S10|Lecture 21]] - Next: [[Lecture23ECE662S10|Lecture 23]]

Back to [[ECE662]]