Course Outline, ECE662 Spring 2010, Prof. Mimi

Note: This is an approximate outline that is subject to change throughout the semester.


{| class="wikitable"
|-
! Lecture !! Topic
|-
| 1
| 1. Introduction
|-
| 1
| 2. What is Pattern Recognition?
|-
| 2,3
| 3. Finite vs. Infinite feature spaces
|-
| 4,5
| 4. Bayes Rule
|-
| 6-10
|
5. Discriminant functions
* Definition
* Application to normally distributed features
* Error analysis
|-
| [[Lecture11ECE662S10|11]],12,13
|
6. Parametric Density Estimation
* Maximum likelihood estimation
* Bayesian parameter estimation
|-
| 13-19
|
7. Non-parametric Density Estimation
* Parzen windows
* K-nearest neighbors
* The nearest neighbor classification rule
|-
| 19,20,21,22
| 8. Linear Discriminants
|-
| [[Lecture22ECE662S10|22]], [[Lecture23ECE662S10|23]], [[Lecture24ECE662S10|24]], [[Lecture25ECE662S10|25]], [[Lecture26ECE662S10|26]]
|
9. Non-Linear Discriminant functions
* Support Vector Machines
* Artificial Neural Networks
|-
| 27,28,29,30
| 10. Clustering and decision trees
|}



Back to [[2010 Spring ECE 662 mboutin]]
