<center><font size="4">
Questions and Comments for: '''[[Introduction to local density estimation methods|Introduction to local (nonparametric) density estimation methods]]'''
</font>

A [https://www.projectrhea.org/learning/slectures.php slecture] by Yu Liu  
 
</center>
----

Please leave me a comment below if you have any questions, if you notice any errors, or if you would like to discuss a topic further.

----
----

= Questions and Comments =

* [Lu Wang Review 1 - summary:] The slecture introduces two classic nonparametric density estimation methods: Parzen window density estimation and K-nearest neighbor (KNN) density estimation. The general principles of both methods are carefully explained. It also shows, through examples, the importance of the window size (or the value of k in KNN) in density estimation.
* [Lu Wang Review 2 - strengths:] This is a great slecture and serves as a valuable supplement to the lecture notes from class. It makes the logic connecting each step of the derivation very clear. It emphasizes how to choose the window size and explains in detail the principle behind picking that value, which is very important for the audience's understanding of the methods. I also especially enjoy the way the k-NN decision boundary plots are rendered (Figures 6 to 9).
* [Lu Wang Review 2 - suggestions:] This slecture is well worth reading as an introduction to nonparametric methods. However, I think it would improve the reading experience if the author typed the formulas in LaTeX instead of using screenshots. I would also appreciate more comparisons between the Parzen window and KNN: according to Section 4, KNN seems to be more accurate than the Parzen window, and including some examples to support this conclusion would be appealing.
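
To illustrate the window-size / k trade-off that the reviews above emphasize, here is a minimal Python sketch. It is not part of the original slecture; the toy bimodal data and the helper names parzen_gaussian and knn_density are assumptions made only for illustration. It estimates the same one-dimensional density with a Gaussian Parzen window and with k-NN, so that varying h or k exposes the noisy-versus-oversmoothed behavior described in the summary.

<pre>
import numpy as np

# Toy, hypothetical example (not from the slecture): compare a Parzen-window
# estimate and a k-NN estimate of the same 1-D density.

rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(-2.0, 0.5, 200),
                          rng.normal(1.5, 1.0, 200)])   # bimodal toy data
grid = np.linspace(-5.0, 5.0, 400)

def parzen_gaussian(x, data, h):
    """Parzen-window density estimate with a Gaussian kernel of bandwidth h."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(data) * h * np.sqrt(2.0 * np.pi))

def knn_density(x, data, k):
    """k-NN density estimate: p(x) ~ k / (n * V), where V is the length of the
    smallest interval around x containing k samples (V = 2 * r_k in 1-D)."""
    dists = np.sort(np.abs(x[:, None] - data[None, :]), axis=1)
    r_k = dists[:, k - 1]                    # distance to the k-th nearest sample
    return k / (len(data) * 2.0 * r_k)

for h in (0.1, 0.5, 1.5):                    # small h -> spiky, large h -> oversmoothed
    print(f"Parzen h={h}: peak {parzen_gaussian(grid, samples, h).max():.3f}")
for k in (5, 30, 100):                       # small k -> noisy, large k -> oversmoothed
    print(f"k-NN   k={k}: peak {knn_density(grid, samples, k).max():.3f}")
</pre>

Running the script prints the peak of each estimate; plotting the returned curves over grid makes the smoothing effect of h and k directly visible.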
