Details of Lecture 11, ECE662 Spring 2010
In Lecture 11, we continued our discussion of Parametric Density Estimation techniques. We discussed the Maximum Likelihood Estimation (MLE) method and looked at a couple of one-dimensional examples for the case where the feature in the dataset follows a Gaussian distribution. First, we looked at the case where the mean parameter was unknown but the variance parameter was known. Then we followed with another example where both the mean and the variance were unknown. Finally, we looked at the slight "bias" problem that arises when estimating the variance.
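For reference, a brief recap of the standard results discussed above (the notation below, with N samples <math>x_1,\dots,x_N</math>, is chosen for illustration and is not copied from the lecture notes). For a one-dimensional Gaussian, the ML estimates of the mean and variance are

<math>
\hat{\mu}_{ML} = \frac{1}{N}\sum_{i=1}^{N} x_i, \qquad
\hat{\sigma}^2_{ML} = \frac{1}{N}\sum_{i=1}^{N} \left( x_i - \hat{\mu}_{ML} \right)^2.
</math>

The variance estimate is slightly biased, since

<math>
E\left[\hat{\sigma}^2_{ML}\right] = \frac{N-1}{N}\,\sigma^2,
</math>

which is the "bias" problem mentioned above; scaling by <math>\frac{N}{N-1}</math> (i.e., dividing by <math>N-1</math> instead of <math>N</math>) gives the unbiased sample variance.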
Notes for this lecture can be found here.
Previous: Lecture 10
Next: Lecture 12