
K Nearest Neighbors (KNN)

Page created in the context of the course ECE662.


K nearest neighbor (KNN) classifiers are memory-based: they fit no explicit model to the data. To classify a new query point, KNN takes a majority vote among the labels of the query's k nearest training points (or, for estimation, averages their values). Its advantages are a nonparametric architecture, simplicity and power, and no training time; its disadvantages are that it is memory intensive and that classification and estimation are slow, since every query must be compared against the stored training set.
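The classification rule described above can be sketched in a few lines of Python. This is a minimal illustration, not course-provided code: the function name, the toy data, and the choice of Euclidean distance are assumptions for the example.

```python
from collections import Counter
import math

def knn_classify(train_points, train_labels, query, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Distance from the query to every stored training point (no model is fit)
    dists = [
        (math.dist(query, p), label)
        for p, label in zip(train_points, train_labels)
    ]
    # Keep the k closest points and vote over their labels
    dists.sort(key=lambda t: t[0])
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy 2-D example: two clusters labeled 'A' and 'B' (hypothetical data)
points = [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]
labels = ['A', 'A', 'A', 'B', 'B', 'B']
print(knn_classify(points, labels, (0.5, 0.5), k=3))  # -> A
```

Note how the "memory intensive, slow" trade-off shows up directly: all training points are kept, and every query scans the whole list.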

Related Rhea pages:

Other references:

  • How K-Nearest Neighbor (KNN) Algorithm works?
  • Numerical Example (hand computation)
  • KNN for Smoothing and Prediction
  • How do we use the spreadsheet for KNN?
  • Strength and Weakness of K-Nearest Neighbor Algorithm
  • Resources for K Nearest Neighbors Algorithm

Back to ECE662
