 
<br>

<center><font size="4"></font>
<font size="4">Review on KNN to [[Slecture_from_KNN_to_nearest_neighbor|Nearest Neighbor Slecture by Jonathan Manring]] </font>
  
 
A [https://www.projectrhea.org/learning/slectures.php slecture] by graduate student Jonathan Manring  
 
 
</center>  
 
----
----
== Comments/Feedback ==
  
----
This slecture was reviewed by Ben Foster.
  
----
This slecture provides a brief overview of the KNN classification method and transitions from KNN to a brief description of the nearest neighbor classification method. A few comments/suggestions:
  
*It is fine that this slecture does not present an abundance of theoretical background for KNN, but it does seem like there could be a bit more, especially since the idea of the slecture is to use KNN to set the stage for the nearest neighbor method. With this in mind, the author could make a more structured transition from KNN to the nearest neighbor method.
*There could also be more discussion of the unbiased nature of KNN as an estimator of the local density <math>\rho(\vec{X_0})</math>. Why is this a point of interest? Are other local estimators of <math>\rho(\vec{X_0})</math> biased? Those who have taken the class are likely familiar with the answers to these questions, but this may be a point of confusion for a reader who is just becoming familiar with the material.
*The discussion of distance metrics is similarly sparse. Perhaps the author could add links/references to sites where more information on these topics can be found.
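To make the density-estimation point above concrete, here is a minimal sketch of a k-NN local density estimate. It is not taken from the slecture: the function name is my own, and the <math>(k-1)/(NV_k)</math> form is one commonly cited as the unbiased variant of the naive <math>k/(NV_k)</math> estimator, which is the distinction the review suggests expanding on.

```python
from math import gamma, pi

import numpy as np


def knn_density_estimate(x0, data, k):
    """Sketch of a k-NN estimate of the local density rho(x0).

    Uses the (k-1)/(N * V_k) form, where V_k is the volume of the
    smallest ball around x0 containing the k nearest samples; the
    more familiar k/(N * V_k) estimator is biased, which is the
    point of interest raised in the review.
    """
    N, d = data.shape
    # Sorted distances from x0 to every sample in the data set.
    dists = np.sort(np.linalg.norm(data - x0, axis=1))
    r_k = dists[k - 1]  # radius reaching the k-th nearest neighbor
    # Volume of a d-dimensional ball of radius r_k.
    V_k = (pi ** (d / 2) / gamma(d / 2 + 1)) * r_k ** d
    return (k - 1) / (N * V_k)


# Uniform samples on the unit square have true density 1 away from
# the edges, so the estimate at the center should be close to 1.
rng = np.random.default_rng(0)
samples = rng.uniform(size=(5000, 2))
est = knn_density_estimate(np.array([0.5, 0.5]), samples, k=100)
```

The uniform-distribution check at the end is a quick sanity test of the estimator, not a proof of unbiasedness.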
  
 
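On the distance-metric point, a short illustration of how the choice of metric changes the notion of "nearest" may help a new reader; this is my own sketch (the Minkowski family is a standard example, not something drawn from the slecture):

```python
import numpy as np


def minkowski(x, y, p):
    """Minkowski distance of order p between two points.

    p = 1 gives the Manhattan (city-block) metric and p = 2 the
    Euclidean metric, so a KNN classifier's neighbor sets can
    change with p even on the same data.
    """
    diff = np.abs(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    return float(np.sum(diff ** p) ** (1.0 / p))


a, b = [0.0, 0.0], [3.0, 4.0]
d1 = minkowski(a, b, 1)  # Manhattan distance: 7.0
d2 = minkowski(a, b, 2)  # Euclidean distance: 5.0
```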
----
 
  
[[Slecture_from_KNN_to_nearest_neighbor|Back to KNN to Nearest Neighbor Slecture]]
  
 
[[Category:Slecture]] [[Category:ECE662Spring2014Boutin]] [[Category:ECE]] [[Category:ECE662]] [[Category:Pattern_recognition]]
 

Latest revision as of 11:20, 7 May 2014