=K Nearest Neighbors=
 
K-nearest-neighbor (KNN) classifiers do not fit a model to the data; they are purely memory-based. KNN predicts the value of a new query from its neighborhood in the stored training set. Its advantages are a nonparametric architecture that is simple and powerful and requires no training time; its disadvantages are that it is memory intensive and that classification and estimation are slow. Please refer to the KNN tutorial websites listed below.
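
As a concrete illustration of the memory-based idea described above, here is a minimal KNN classification sketch in plain Python/NumPy using Euclidean distance and a majority vote. The toy data, the choice of k, and the function name are illustrative assumptions, not taken from the tutorials referenced below.

<pre>
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x_query, k=3):
    """Classify x_query by a majority vote among its k nearest
    training points (Euclidean distance). No model is fit: the whole
    training set stays in memory and is scanned for every query,
    which is why KNN is memory intensive and slow at prediction time."""
    # Distance from the query to every stored training point
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest training points
    nearest = np.argsort(dists)[:k]
    # Majority vote over the neighbors' labels
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy two-class data in 2-D (made up for illustration)
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [0.9, 1.1],
                    [4.0, 4.2], [4.1, 3.9], [3.8, 4.0]])
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_classify(X_train, y_train, np.array([1.1, 1.0])))  # -> 0
print(knn_classify(X_train, y_train, np.array([4.0, 4.0])))  # -> 1
</pre>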
 
#[http://people.revoledu.com/kardi/tutorial/KNN KNN Tutorial] : Contents are below
#*How K-Nearest Neighbor (KNN) Algorithm works?
#*Numerical Example (hand computation)
#*KNN for Smoothing and Prediction
#*How do we use the spreadsheet for KNN?
#*Strength and Weakness of K-Nearest Neighbor Algorithm
#*Resources for K Nearest Neighbors Algorithm
#[http://www.nlp.org.cn/docs/20020903/36/kNN.pdf KNN]
#[http://www.chem.agilent.com/cag/bsp/products/gsgx/Downloads/pdf/class_prediction.pdf Class Prediction using KNN]
#[http://en.wikipedia.org/wiki/Nearest_neighbor_(pattern_recognition) WIKIPEDIA]
----
[[ECE662:Glossary_Old_Kiwi|Back to "Decision Theory" Glossary]]

[[ECE662:BoutinSpring08_Old_Kiwi|Back to ECE662 Spring 2008 Prof. Boutin]]