----
[https://www.projectrhea.org/rhea/index.php/2014_Spring_ECE_662_Boutin Back to ECE662, Spring 2014]
<center>


<font size="4">Questions and Comments for: '''[[K-Nearest_Neighbors_Density_Estimation| K-Nearest Neighbors Density Estimation ]]''' </font>

A [https://www.projectrhea.org/learning/slectures.php slecture] by student Qi Wang
</center>
----

Please leave me a comment below if you have any questions, if you notice any errors, or if you would like to discuss a topic further.
+ | |||
+ | ---- | ||
+ | ---- | ||
+ | ==Question/comment == | ||
+ | |||
+ | ---- | ||
+ | ==Review== | ||
+ | A review by Dan Barrett: | ||
+ | |||
+ | This video slecture includes a good general description of how K Nearest Neighbors works, then goes through the proof that KNN is an unbiased density estimate, and finally talks about metrics and gives some examples. | ||
+ | |||
A couple of improvements I might suggest:
* Draw a clearer link between the example at the beginning and the discussion of metrics, describing how you might use any of these metrics as the distance function in the example.
* Discuss the two different KNN methods described in class and how they relate to each other (you discuss just the second one).
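
To make the k/(nV) estimate and the role of the distance metric concrete, here is a minimal Python sketch of k-NN density estimation (my own illustration, not code from the slecture; the function names and the Euclidean metric are just assumptions for the example):

<pre>
# Minimal sketch of k-NN density estimation: p(x) is estimated as k / (n * V),
# where V is the volume of the smallest ball around x that contains its k
# nearest training samples. The distance metric is a pluggable function.
from math import gamma, pi
import numpy as np

def euclidean(a, b):
    return float(np.sqrt(np.sum((a - b) ** 2)))

def knn_density(x, samples, k, metric=euclidean):
    n, d = samples.shape
    distances = sorted(metric(x, s) for s in samples)
    r = distances[k - 1]  # distance to the k-th nearest neighbor
    # Volume of a d-dimensional Euclidean ball of radius r; a different metric
    # (e.g., Manhattan) would need the volume of its own ball instead.
    volume = (pi ** (d / 2) / gamma(d / 2 + 1)) * r ** d
    return k / (n * volume)

# Usage: for 2-D standard-normal data the estimate at the origin should be
# close to the true density 1 / (2*pi), roughly 0.159.
rng = np.random.default_rng(0)
data = rng.standard_normal((1000, 2))
print(knn_density(np.zeros(2), data, k=10))
</pre>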
+ | |||
+ | ---- | ||
+ | [[2014_Spring_ECE_662_Boutin|Back to ECE 662 2014 course wiki]] | ||
+ | |||
+ | [[ECE662|Back to ECE 662 course page]] |