Question and Comments for: Classification using Bayes Rule in 1-dimensional and N-dimensional feature spaces

A slecture by graduate student Jihwan Lee
----
----
==Review==
=Review By Anonymous7=
[Review by Anonymous7] This slecture is about classification using Bayes rule in 1-dimensional and N-dimensional feature spaces. In the beginning, the author introduced Bayes' theorem and gave two examples that use it. The author then discussed classification using Bayes rule and derived the formula for the classification error when classifying a 1-dimensional Gaussian distribution. After that, the author derived the discriminant functions for classifying 1-dimensional and N-dimensional Gaussian distributions.

<br>[Review by Anonymous7] Overall, the slecture is very well written and the flow is smooth. However, to make the slecture better, the following improvements are suggested:

* The 1-dimensional classification error is not written correctly. Because of the author's choice of classes <math> \omega_1 </math> and <math> \omega_2 </math>, the 1-dimensional classification error should be (a numerical check of this expression is sketched after the list):
<center><math>
E(error) = \int_{-\infty}^{t}\rho(x|\omega_1)P(\omega_1)dx + \int_{t}^{\infty}\rho(x|\omega_2)P(\omega_2)dx
</math></center>

* In Bayes rule example 1, it was written that P(W | L) = 0.75. It should be P(L | W) = 0.75 instead (the Bayes rule identity that distinguishes the two is recalled after the list).

* When comparing the original probability with the probability obtained by applying Bayes rule in examples 1 and 2, it should be explained why the probability changes.

* There should be a derivation of the discriminant function for the case of a general <math> \Sigma_i </math> in the N-dimensional feature space (the standard form such a derivation should reach is recalled after the list).
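
To make the first point concrete, the short script below is a minimal numerical sanity check of the error expression. All of the class parameters, priors, and the threshold <math>t</math> are hypothetical choices (they are not taken from the slecture), and the decision rule is assumed to assign <math>\omega_2</math> when <math>x < t</math> and <math>\omega_1</math> otherwise, which is the direction implied by the two integrals above.

<pre>
# Minimal sanity check of
#   E(error) = int_{-inf}^{t} p(x|w1) P(w1) dx + int_{t}^{inf} p(x|w2) P(w2) dx
# All numbers below are assumed for illustration only.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu1, sigma1 = 2.0, 1.0   # assumed class-conditional N(mu1, sigma1^2) for omega_1
mu2, sigma2 = 0.0, 1.0   # assumed class-conditional N(mu2, sigma2^2) for omega_2
P1, P2 = 0.5, 0.5        # assumed equal priors
t = 1.0                  # assumed threshold: decide omega_2 if x < t, else omega_1

# The two integrals of the error formula, evaluated numerically.
err1, _ = quad(lambda x: norm.pdf(x, mu1, sigma1) * P1, -np.inf, t)
err2, _ = quad(lambda x: norm.pdf(x, mu2, sigma2) * P2, t, np.inf)
print("error from the formula :", err1 + err2)

# Monte Carlo estimate with the same decision rule, for comparison.
rng = np.random.default_rng(0)
n = 200_000
is_w2 = rng.random(n) < P2                  # True -> sample drawn from omega_2
x = np.where(is_w2,
             rng.normal(mu2, sigma2, n),
             rng.normal(mu1, sigma1, n))
decide_w2 = x < t                           # the assumed decision rule
print("error from simulation  :", np.mean(decide_w2 != is_w2))
</pre>

With these assumed numbers both estimates come out close to 0.16, which is what one expects for two unit-variance Gaussians two units apart with equal priors.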
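
As a reference for the second point, with W and L denoting the events of example 1, Bayes rule relates the two conditional probabilities as
<center><math>
P(W|L) = \frac{P(L|W)P(W)}{P(L)}
</math></center>
so stating 0.75 for <math>P(W|L)</math> and stating it for <math>P(L|W)</math> are two different claims; per the point above, the quantity actually given in the example is <math>P(L|W)</math>.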
 
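
For the last point, the form such a derivation should reach is the standard Gaussian discriminant with an arbitrary covariance <math>\Sigma_i</math>. Assuming the usual choice <math>g_i(\mathbf{x}) = \ln \rho(\mathbf{x}|\omega_i) + \ln P(\omega_i)</math> and writing <math>d</math> for the dimension of the feature space, taking the logarithm of the N-dimensional Gaussian density gives
<center><math>
g_i(\mathbf{x}) = -\frac{1}{2}(\mathbf{x}-\mu_i)^T\Sigma_i^{-1}(\mathbf{x}-\mu_i) - \frac{d}{2}\ln 2\pi - \frac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i),
</math></center>
which, unlike in the equal-covariance case, stays quadratic in <math>\mathbf{x}</math> because the <math>\Sigma_i</math>-dependent terms differ across classes.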
 
  
== This slecture to be reviewed by Anonymous7 ==

Write other comment/question here.
----
 
----
