=K Nearest Neighbors (KNN)=
Page created in the context of the course [[ECE662]].
----
K nearest neighbor (KNN) classifiers do not fit a model to the data; they rely entirely on the stored training examples (memory). To classify a new query point, KNN uses the labels of the query's nearest neighbors in the training set as the basis for its prediction. Its advantages are a nonparametric architecture, simplicity, effectiveness, and no training time; its disadvantages are that it is memory intensive and that classification and estimation are slow.
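
Below is a minimal sketch of such a classifier in Python (an illustration only; the function and variable names are hypothetical and not taken from the references on this page). It stores all training points and classifies a query by majority vote among its k nearest neighbors under the Euclidean distance; note that every query requires a distance computation against the entire training set, which is exactly the memory and speed cost described above.

<pre>
# Minimal KNN classifier sketch (illustrative; names are hypothetical).
from collections import Counter
import math

def knn_classify(query, training_points, training_labels, k=3):
    """Predict a label for `query` by majority vote among the k
    nearest training points (Euclidean distance)."""
    # Distance from the query to every stored example -- this full
    # scan is why KNN is memory intensive and slow at query time.
    distances = [
        (math.dist(query, x), label)
        for x, label in zip(training_points, training_labels)
    ]
    distances.sort(key=lambda pair: pair[0])  # nearest first
    nearest_labels = [label for _, label in distances[:k]]
    # Majority vote among the k nearest neighbors.
    return Counter(nearest_labels).most_common(1)[0][0]

# Toy usage: two well-separated classes in the plane.
X = [(1, 1), (1, 2), (2, 1), (6, 6), (6, 7), (7, 6)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_classify((2, 2), X, y, k=3))  # -> "a"
</pre>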
  
Related Rhea pages:
*A [[KNN_algorithm_OldKiwi|tutorial]] written by an [[ECE662]] student.
*[[ECE662]] lecture notes, [[ECE662:BoutinSpring08_Old_Kiwi|Spring 2008, Prof. Boutin]]:
**[[Lecture_16_-_Parzen_Window_Method_and_K-nearest_Neighbor_Density_Estimate_Old_Kiwi|Lecture 16: Parzen Windows and KNN density estimates]]
**[[Lecture_17_-_Nearest_Neighbors_Clarification_Rule_and_Metrics_Old_Kiwi|Lecture 17: Nearest neighbor classification]]
**[[Lecture_18_-_Nearest_Neighbors_Clarification_Rule_and_Metrics%28Continued%29_Old_Kiwi|Lecture 18: Nearest neighbor classification and metrics]]
**[[Lecture_19_-_Nearest_Neighbor_Error_Rates_Old_Kiwi|Lecture 19: Nearest neighbor error rate]]
  
Other references:
 
*A [http://people.revoledu.com/kardi/tutorial/KNN KNN Tutorial website]; its contents are listed below:
:*How the K-Nearest Neighbor (KNN) algorithm works
 
:*Numerical Example (hand computation)
:*KNN for Smoothing and Prediction
:*How do we use the spreadsheet for KNN?
:*Strengths and weaknesses of the K-Nearest Neighbor algorithm
:*Resources for K Nearest Neighbors Algorithm
*[http://www.nlp.org.cn/docs/20020903/36/kNN.pdf KNN]
*[http://www.chem.agilent.com/cag/bsp/products/gsgx/Downloads/pdf/class_prediction.pdf Class Prediction using KNN]
*[http://en.wikipedia.org/wiki/Nearest_neighbor_(pattern_recognition) Wikipedia: Nearest neighbor (pattern recognition)]
----
[[ECE662|Back to ECE662]]
