==0. Foreword by Professor Boutin==
 
==1. Background Material==
*Probability and Statistics
**Whitening and Coloring Transforms
***[[ECE662 Whitening and Coloring Transforms S14 MH|Text slecture in English]], by [[User:Mhossain|Maliha Hossain]] <span style="color:GREEN">Very clear!</span>
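The whitening transform listed above can be illustrated with a short NumPy sketch (an illustrative example, not taken from the linked slecture; the covariance matrix and sample size are invented for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw correlated 2-D Gaussian samples (rows = observations).
Sigma = np.array([[4.0, 1.5],
                  [1.5, 1.0]])
X = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=5000)

# Whitening: Y = Lambda^(-1/2) Phi^T (X - mu), where the sample
# covariance factors as Sigma_hat = Phi Lambda Phi^T.
mu = X.mean(axis=0)
evals, Phi = np.linalg.eigh(np.cov(X.T))
W = np.diag(evals ** -0.5) @ Phi.T
Y = (X - mu) @ W.T

# The whitened samples have identity sample covariance by construction.
print(np.round(np.cov(Y.T), 2))
```

Coloring is the inverse operation: multiplying white samples by <code>Phi @ np.diag(evals ** 0.5)</code> reintroduces the desired covariance.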
**How to generate random n-dimensional data from two categories with different priors (use these methods to generate data for homework)
***[[Generating random data with controlled prior probabilities slecture ECE662S14 Gheith|Video slecture in English]] by Alex Gheith
***[[How to generate random n dimensional data from two categories with different priors slecture Minwoong Kim ECE662 Spring 2014|Video slecture in Korean]], by Minwoong Kim
***[[How to generate random n dimensional data from two categories with different priors slecture Minwoong Cho ECE662 Spring 2014|Video slecture in Korean]], by Hyun Dok Cho
***[[The principles for how to generate random samples from a Gaussian distribution|Text slecture in English]] by Joonsoo Kim
***[[Generation of N-dimensional normally distributed random numbers from two categories with different priors|Text slecture in English]] by Jonghoon Jin
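The standard recipe the slectures above cover can be sketched in a few lines of NumPy: sample each point's class label from the prior, then draw the point from that class's conditional Gaussian (the means, covariances, and prior below are invented example values, not homework parameters):

```python
import numpy as np

rng = np.random.default_rng(1)

def generate_data(n, prior1, mu1, cov1, mu2, cov2):
    """Draw n samples from a two-class Gaussian mixture with
    P(class 1) = prior1. Labels are sampled first (Bernoulli with the
    prior), then each point comes from its class-conditional density."""
    labels = (rng.random(n) >= prior1).astype(int)  # 0 -> class 1, 1 -> class 2
    X = np.empty((n, len(mu1)))
    for c, (mu, cov) in enumerate([(mu1, cov1), (mu2, cov2)]):
        idx = labels == c
        X[idx] = rng.multivariate_normal(mu, cov, size=idx.sum())
    return X, labels

X, y = generate_data(10000, 0.7,
                     mu1=[0, 0], cov1=np.eye(2),
                     mu2=[3, 3], cov2=0.5 * np.eye(2))
print((y == 0).mean())  # empirical prior of class 1, close to 0.7
```

The same two-step scheme extends to any number of dimensions and classes by lengthening the mean vectors and the class list.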
 
*Principal Component Analysis (PCA)
**[[PCA|Text slecture in English]], by [http://web.ics.purdue.edu/~zhou338/ Tian Zhou] <span style="color:GREEN">OK</span>
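PCA in its eigendecomposition form fits in a few lines; the sketch below (illustrative only, with synthetic data that mostly varies along one invented direction) projects onto the leading principal component:

```python
import numpy as np

rng = np.random.default_rng(2)

# 3-D data that mostly varies along one direction, plus small noise.
t = rng.normal(size=(500, 1))
X = t @ np.array([[2.0, 1.0, 0.5]]) + 0.1 * rng.normal(size=(500, 3))

# PCA via eigendecomposition of the sample covariance matrix.
Xc = X - X.mean(axis=0)
evals, evecs = np.linalg.eigh(np.cov(Xc.T))
order = np.argsort(evals)[::-1]          # sort components by variance
evals, evecs = evals[order], evecs[:, order]

explained = evals / evals.sum()
Z = Xc @ evecs[:, :1]                    # project onto the first component
print(explained[0])                      # fraction of variance captured
```

Because the data were built around a single direction, the first component captures nearly all of the variance; keeping only <code>Z</code> reduces the dimension from 3 to 1 with little loss.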
***[[Bayersian Parameter Estimation: Gaussian Case|Text slecture in English]], by Shaobo Fang
***[[Bayes Parameter Estimation with examples|Text slecture in English]] by Yu Wang
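For the Gaussian case treated in the slectures above, Bayesian parameter estimation of a mean with known variance has a closed-form conjugate update; the sketch below uses invented prior and data parameters purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Bayesian estimation of a Gaussian mean (known variance sigma2),
# with conjugate Gaussian prior mu ~ N(mu0, s0sq).
true_mu, sigma2 = 2.0, 1.0
x = rng.normal(true_mu, np.sqrt(sigma2), size=100)

mu0, s0sq = 0.0, 10.0                    # broad prior belief about the mean
n, xbar = len(x), x.mean()

# Posterior is Gaussian with precision = data precision + prior precision.
post_var = 1.0 / (n / sigma2 + 1.0 / s0sq)
post_mean = post_var * (n * xbar / sigma2 + mu0 / s0sq)
print(post_mean, post_var)
```

With 100 samples the data term dominates the broad prior, so the posterior mean sits very close to the sample mean and the posterior variance shrinks roughly like sigma2/n.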
==4. Local ("non-parametric") Density Estimation Methods==
 
*Introduction to Local Density Estimation Techniques (so-called "non-parametric")
**[[Introduction to local density estimation methods|Text slecture in English]] by Yu Liu <span style="color:GREEN">OK</span>
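A representative local method is the Parzen-window (kernel) density estimate: average a window function centred on every sample. The sketch below is illustrative only; the window width and the N(0, 1) test data are invented:

```python
import numpy as np

rng = np.random.default_rng(4)

# Parzen-window density estimate with a Gaussian window of width h.
samples = rng.normal(0.0, 1.0, size=2000)

def parzen(x, data, h):
    """Estimate p(x) as the average of Gaussian windows of width h
    centred on each sample point."""
    u = (x - data[:, None]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=0) / (len(data) * h * np.sqrt(2 * np.pi))

xs = np.array([0.0, 1.0, 3.0])
p = parzen(xs, samples, h=0.3)
print(p)   # estimates should track the N(0, 1) density shape
```

Smaller <code>h</code> gives a spikier, lower-bias estimate; larger <code>h</code> smooths more. Choosing the window width is the central practical difficulty of the method.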
 
**[[Estimation Using Nearest Neighbor|Text slecture in English]] by Sang Ho Yoon
**[[Slecture from KNN to nearest neighbor|Text slecture in English]] by Jonathan Manring <span style="color:GREEN">OK</span>
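The k-nearest-neighbour rule these slectures discuss reduces to a distance sort and a majority vote; a minimal sketch (the two Gaussian classes and the choice k = 5 are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

# Two well-separated 2-D Gaussian classes as training data.
X = np.vstack([rng.normal([0, 0], 1.0, size=(100, 2)),   # class 0
               rng.normal([4, 4], 1.0, size=(100, 2))])  # class 1
y = np.array([0] * 100 + [1] * 100)

def knn_predict(x, X, y, k=5):
    """Label x by majority vote among its k nearest training points."""
    d = np.linalg.norm(X - x, axis=1)
    nearest = y[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()

print(knn_predict(np.array([0.5, 0.5]), X, y))  # near the class-0 mean
print(knn_predict(np.array([3.5, 4.0]), X, y))  # near the class-1 mean
```

The same neighbour counts also yield a density estimate: with k neighbours inside a ball of volume V around x, p(x) is approximated by k / (nV), which is the bridge from k-NN density estimation to the nearest-neighbour classifier.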
==5. Linear Classifiers==
 
*Linear classifiers, projective coordinates, and Fisher linear discriminant
**[[JMSLinearClassifierSlecture|Text slecture in English]] by John Mulcahy-Stanislawczyk
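Fisher's linear discriminant has the closed form w = S_w^{-1}(m1 - m2): the projection direction that maximizes between-class scatter relative to within-class scatter. A sketch with invented class parameters:

```python
import numpy as np

rng = np.random.default_rng(6)

# Two Gaussian classes sharing a covariance, with different means.
X1 = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=200)
X2 = rng.multivariate_normal([2, 2], [[1, 0.5], [0.5, 1]], size=200)

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter = sum of the two class scatter matrices.
Sw = np.cov(X1.T) * (len(X1) - 1) + np.cov(X2.T) * (len(X2) - 1)
w = np.linalg.solve(Sw, m1 - m2)         # Fisher direction

# Classify by thresholding at the midpoint of the projected class means.
threshold = 0.5 * (m1 @ w + m2 @ w)
acc = ((X1 @ w > threshold).mean() + (X2 @ w <= threshold).mean()) / 2
print(acc)
```

Because the classes here share a covariance, this projected-midpoint rule coincides with the Bayes-optimal linear boundary up to the prior term, which is why the accuracy comes out well above chance.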
 
**[[Least_Squares_Support_Vector_Machine_and_its_Applications_in_Solving_Linear_Regression_Problems|Text slecture in English]] by Xing Liu
**[[Support Vector Machine|Video slecture in English]] by Tao Jiang
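As a rough companion to the SVM slectures above, a soft-margin linear SVM can be trained by subgradient descent on the regularized hinge loss. This is a minimal sketch, not a production solver; the data, learning rate, and regularization constant are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two well-separated Gaussian classes with labels in {-1, +1}.
X = np.vstack([rng.normal([0, 0], 1.0, size=(100, 2)),
               rng.normal([4, 4], 1.0, size=(100, 2))])
y = np.array([-1] * 100 + [1] * 100)

# Minimize lam/2 ||w||^2 + mean hinge loss by full-batch subgradient descent.
w, b = np.zeros(2), 0.0
lam, lr = 0.01, 0.05
for _ in range(300):
    margins = y * (X @ w + b)
    viol = margins < 1                   # points violating the margin
    grad_w = lam * w - (y[viol][:, None] * X[viol]).sum(axis=0) / len(X)
    grad_b = -y[viol].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

acc = (np.sign(X @ w + b) == y).mean()
print(acc)
```

Only the margin-violating points contribute to the gradient, which is the subgradient counterpart of the dual picture in which only support vectors carry nonzero weight.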
==6. Supplementary Material==
 
*Clustering Algorithms
**[[SlectureDavidRunyanCS662Spring14|Text slecture in English]] by David Runyan
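A canonical clustering algorithm the topic above covers is k-means (Lloyd's algorithm): alternate between assigning points to their nearest centre and moving each centre to the mean of its points. A minimal sketch with two invented, well-separated clusters:

```python
import numpy as np

rng = np.random.default_rng(8)

# Two tight clusters around (0, 0) and (5, 5).
X = np.vstack([rng.normal([0, 0], 0.5, size=(150, 2)),
               rng.normal([5, 5], 0.5, size=(150, 2))])

k = 2
centers = X[rng.choice(len(X), size=k, replace=False)]  # init from data points
for _ in range(50):
    # Assignment step: nearest centre for every point.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    assign = d.argmin(axis=1)
    # Update step: recompute each centre as the mean of its points.
    new_centers = np.array([X[assign == j].mean(axis=0) for j in range(k)])
    if np.allclose(new_centers, centers):
        break
    centers = new_centers

print(np.round(centers, 1))  # should land near the true cluster means
```

Each iteration can only decrease the total within-cluster squared distance, so the loop terminates; the result can still depend on the random initialization, which is why practical implementations restart several times.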

''The Boutin Lectures on Statistical Pattern Recognition: Multilingual Slectures by Students in the Spring 2014 Class of ECE662''

''Revision as of 07:07, 15 May 2014''