Revision as of 09:58, 23 April 2012 by Mboutin (Talk | contribs)



Maximum Likelihood Estimation (MLE) example: Bernoulli Distribution

Link to other examples: Exponential and geometric distributions


Observations: $ {x}_{1}, {x}_{2}, \ldots, {x}_{n} $, each 0 or 1, with k successes in n Bernoulli trials.

Each observation is a single Bernoulli trial, so its pmf is

$ f(x)={p}^{x}{\left(1-p \right)}^{1-x}, \quad x \in \{0,1\} $

$ L(p)=\prod_{i=1}^{n}f({x}_{i})=\prod_{i=1}^{n}{p}^{{x}_{i}}{\left(1-p \right)}^{1-{x}_{i}} $

$ L(p)={p}^{\sum_{i=1}^{n}{x}_{i}}{\left(1-p \right)}^{n-\sum_{i=1}^{n}{x}_{i}} $

$ \ln L(p)=\sum_{i=1}^{n}{x}_{i}\ln(p)+\left(n-\sum_{i=1}^{n}{x}_{i} \right)\ln\left(1-p \right) $

$ \frac{d\ln L(p)}{dp}=\frac{1}{p}\sum_{i=1}^{n}{x}_{i}-\frac{1}{1-p}\left(n-\sum_{i=1}^{n}{x}_{i} \right)=0 $

$ \left(1-\hat{p}\right)\sum_{i=1}^{n}{x}_{i}-\hat{p}\left(n-\sum_{i=1}^{n}{x}_{i} \right)=0 $

$ \hat{p}=\frac{\sum_{i=1}^{n}{x}_{i}}{n}=\frac{k}{n} $
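As a numerical sanity check (not part of the original page), a brute-force grid maximization of the Bernoulli log-likelihood recovers the closed form $ \hat{p}=k/n $. The data below are hypothetical.

```python
import math

# Hypothetical Bernoulli observations (0 = failure, 1 = success).
x = [1, 0, 1, 1, 0, 1, 0, 1]
n, k = len(x), sum(x)

def log_likelihood(p):
    # ln L(p) = sum(x_i) ln(p) + (n - sum(x_i)) ln(1 - p)
    return k * math.log(p) + (n - k) * math.log(1 - p)

# Maximize over a fine grid of p values in (0, 1).
grid = [i / 10000 for i in range(1, 10000)]
p_numeric = max(grid, key=log_likelihood)

print(p_numeric)  # should agree with the closed form k/n
print(k / n)
```

The grid maximizer lands on the same value as the analytic estimator, up to the grid resolution.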


Poisson Distribution

Observations: $ {X}_{1}, {X}_{2}, \ldots, {X}_{n} $

$ f(x)=\frac{{\lambda}^{x}{e}^{-\lambda}}{x!}, \quad x=0, 1, 2, \ldots $

$ L(\lambda)=\prod_{i=1}^{n}\frac{{\lambda}^{{x}_{i}}{e}^{-\lambda}}{{x}_{i}!} = {e}^{-n\lambda} \frac{{\lambda}^{\sum_{i=1}^{n}{x}_{i}}}{\prod_{i=1}^{n}{x}_{i}!} $

$ \ln L(\lambda)=-n\lambda+\sum_{i=1}^{n}{x}_{i}\ln(\lambda)-\ln\left(\prod_{i=1}^{n}{x}_{i}!\right) $

$ \frac{d\ln L(\lambda)}{d\lambda}=-n+\frac{1}{\lambda}\sum_{i=1}^{n}{x}_{i}=0 $

$ \hat{\lambda}=\frac{\sum_{i=1}^{n}{x}_{i}}{n} $
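The Poisson result can be checked the same way (a sketch, not part of the original page): grid-maximizing the log-likelihood over $ \lambda $ returns the sample mean. The counts below are hypothetical.

```python
import math

# Hypothetical Poisson count observations.
x = [2, 4, 1, 3, 0, 2, 5, 3]
n = len(x)
s = sum(x)

def log_likelihood(lam):
    # ln L(lambda) = -n*lambda + sum(x_i) ln(lambda) - ln(prod x_i!)
    const = sum(math.log(math.factorial(xi)) for xi in x)
    return -n * lam + s * math.log(lam) - const

# Maximize over a grid of lambda values in (0, 10].
grid = [i / 1000 for i in range(1, 10001)]
lam_numeric = max(grid, key=log_likelihood)

print(lam_numeric)  # should agree with the sample mean
print(s / n)
```

Note the constant term $ \ln\prod x_i! $ does not depend on $ \lambda $, so it shifts the log-likelihood without moving its maximizer.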


Back to ECE662
