Revision as of 17:37, 4 April 2013

Discriminant Functions For The Normal Density


Let's begin with the continuous univariate normal, or Gaussian, density.

$ f_x = \frac{1}{\sqrt{2 \pi} \sigma} \exp \left [- \frac{1}{2} \left ( \frac{x - \mu}{\sigma} \right)^2 \right ] $
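As a quick sketch of this formula (the function name is illustrative, not from the original), the univariate density can be evaluated directly:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Univariate normal density f_x = exp(-((x-mu)/sigma)^2 / 2) / (sqrt(2*pi)*sigma)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (math.sqrt(2 * math.pi) * sigma)

# The density peaks at x = mu with value 1 / (sqrt(2*pi) * sigma).
peak = gaussian_pdf(0.0, 0.0, 1.0)  # ≈ 0.3989
```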


for which the expected value of x is

$ \mu = \mathcal{E}[x] =\int\limits_{-\infty}^{\infty} xp(x)\, dx $

and where the expected squared deviation or variance is

$ \sigma^2 = \mathcal{E}[(x- \mu)^2] =\int\limits_{-\infty}^{\infty} (x- \mu)^2 p(x)\, dx $
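These two expectations can be checked numerically. The sketch below (helper names are assumptions, not from the original) approximates the integrals with the trapezoidal rule over a wide interval around the mean, recovering mu and sigma squared:

```python
import math

def gaussian_pdf(x, mu, sigma):
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (math.sqrt(2 * math.pi) * sigma)

def expectation(g, mu, sigma, half_width=10.0, n=50_000):
    """Approximate E[g(x)] = integral of g(x) p(x) dx by the trapezoidal rule."""
    lo, hi = mu - half_width, mu + half_width
    h = (hi - lo) / n
    total = 0.5 * (g(lo) * gaussian_pdf(lo, mu, sigma)
                   + g(hi) * gaussian_pdf(hi, mu, sigma))
    for i in range(1, n):
        x = lo + i * h
        total += g(x) * gaussian_pdf(x, mu, sigma)
    return total * h

mu, sigma = 2.0, 1.5
mean = expectation(lambda x: x, mu, sigma)                # ≈ mu = 2.0
var = expectation(lambda x: (x - mu) ** 2, mu, sigma)     # ≈ sigma^2 = 2.25
```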

For the multivariate normal density in d dimensions, f_x is written as

$ f_x = \frac{1}{(2 \pi)^ \frac{d}{2} |\boldsymbol{\Sigma}|^\frac{1}{2}} \exp [- \frac{1}{2} ('''x''' -\boldsymbol{\mu})^t\boldsymbol{\Sigma}^-1 ('''x''' -\boldsymbol{\mu})] $
