[http://balthier.ecn.purdue.edu/index.php/ECE662#Course_Topics Course Topics]
 
 
 
Discriminant functions are one way to represent classifiers. Given classes w1, w2, ..., wk and a feature vector x in n-dimensional feature space, the discriminant functions g1(x), g2(x), ..., gk(x), each of which maps the n-dimensional feature space to the real numbers, are used to make decisions: decide class wi if gi(x) >= gj(x) for all j.
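
Below is a minimal sketch of this decision rule in Python, using two illustrative linear discriminants; the particular functions, weights, and the sample point are placeholders chosen for the example, not values from the course notes.

<pre>
import numpy as np

def decide(x, discriminants):
    """Return the index i of the class wi that maximizes gi(x)."""
    scores = [g(x) for g in discriminants]
    return int(np.argmax(scores))

# Illustrative linear discriminants gi(x) = wi . x + bi (placeholder weights)
g1 = lambda x: np.dot([1.0, -0.5], x) + 0.2
g2 = lambda x: np.dot([-0.3, 1.0], x) - 0.1

x = np.array([0.8, 0.4])
print("decide w%d" % (decide(x, [g1, g2]) + 1))  # class with the largest gi(x)
</pre>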
 
Discriminant functions are used to define [[Decision Surfaces_OldKiwi]].

== See also ==
 
[[Linear Discriminant Functions_OldKiwi]]
 
[[Category:ECE662]]
