Revision as of 20:59, 5 May 2014
Expected Value of MLE estimate over standard deviation and expected deviation
A slecture by ECE student Zhenpeng Zhao
Partly based on the ECE662 Spring 2014 lecture material of Prof. Mireille Boutin.
=== 1. Motivation ===
*MLE estimates typically converge to the true parameters as the number of training samples increases.
*Simpler than alternative methods such as Bayesian techniques.
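The convergence claim above can be checked numerically. The sketch below (not part of the original slecture; function names are my own) computes the Gaussian MLE estimates <math>\hat{\mu} = \frac{1}{n}\sum_i x_i</math> and <math>\hat{\sigma}^2 = \frac{1}{n}\sum_i (x_i - \hat{\mu})^2</math>, then uses Monte Carlo averaging to show that the expected value of the MLE variance estimate is <math>\frac{n-1}{n}\sigma^2</math>, a bias that vanishes as the number of samples grows.

```python
# Sketch (assumed names, not from the slecture): Monte Carlo check that the
# Gaussian MLE variance estimate is biased by the factor (n-1)/n, and that
# the bias disappears as the sample size n increases.
import random

def mle_gaussian(samples):
    """Return the MLE estimates (mu_hat, sigma2_hat) for a 1-D Gaussian."""
    n = len(samples)
    mu_hat = sum(samples) / n
    # The MLE divides by n (not n-1), which is what makes it biased.
    sigma2_hat = sum((x - mu_hat) ** 2 for x in samples) / n
    return mu_hat, sigma2_hat

def expected_sigma2(n, trials=20000, mu=0.0, sigma=1.0, seed=0):
    """Estimate E[sigma2_hat] by averaging over many samples of size n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        data = [rng.gauss(mu, sigma) for _ in range(n)]
        total += mle_gaussian(data)[1]
    return total / trials

if __name__ == "__main__":
    # For true sigma^2 = 1, E[sigma2_hat] should be close to (n-1)/n.
    for n in (2, 5, 50):
        print(n, round(expected_sigma2(n), 3), "theory:", round((n - 1) / n, 3))
```

For n = 2 the average MLE variance estimate comes out near 0.5 rather than the true value 1, while for n = 50 it is already close to 0.98, illustrating both the small-sample bias and the convergence as the training set grows.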
=== 2. Motivation ===

*Statistical Density Theory Context
**Given c classes + some knowledge about features <math>x \in \mathbb{R}^n</math> (or some other space)

[[Image:Zhenpeng_Selecture_1.png]]

[[Image:Zhenpeng_Selecture_2.png]]

[[Image:Zhenpeng_Selecture_3.png]]

[[Image:Zhenpeng_Selecture_4.png]]

[[Image:Zhenpeng_Selecture_5.png]]

----
== [[ECE662Selecture_ZHenpengMLE_Ques|Questions and comments]] ==

If you have any questions, comments, etc. please post them on [[ECE662Selecture_ZHenpengMLE_Ques|https://kiwi.ecn.purdue.edu/rhea/index.php/ECE662Selecture_ZHenpengMLE_Ques]].

----
[[2014 Spring ECE 662 Boutin|Back to ECE662, Spring 2014]]

[[Category:Slecture]]
[[Category:ECE662Spring2014Boutin]]
[[Category:ECE]]
[[Category:ECE662]]
[[Category:Pattern_recognition]]