# Introduction to Fisher Information

Fisher information is a way of measuring the amount of information that a random variable X carries about an unknown parameter θ, and it is denoted by $I(θ)$.

In order to understand what that means, though, we first need to understand what the word "information" means in statistics. Information is not the same thing as data. Data is a fact or a collection of facts; on its own, data does not tell you much, and no conclusion can be drawn from it alone. Information is the result of analyzing the data and the conclusions that can be drawn from it in context. For example, if students in a class took a test, their scores would be the data, and the class average would be information. Oftentimes the words "statistics" and "information" are used interchangeably, and in this context they mean the same thing.

The formal definition of Fisher information is:

"For a random variable X with likelihood function $L(θ,X)$, the score function (with respect to the parameter θ) is $s(θ;X) = \nabla [\ln(L(θ,X))]$, and the Fisher information is the variance of the score" (Rothman).

In more understandable terms, this just means that for some probability density function, the amount of information stored in the parameter θ, the Fisher information $I(θ)$, is equal to the variance of the score.

$I(θ) = Var(s(θ;X)) = E[(s(θ;X))^2] = -E[\nabla^2 \ln(L(θ,X))]$

The last equality, expressing $I(θ)$ as the negative expected second derivative of the log-likelihood, holds under the usual regularity conditions.
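As a sketch, these three equivalent expressions can be checked numerically. The example below (an assumption for illustration, not from the source) uses a Bernoulli(θ) model, whose Fisher information has the known closed form $1/(θ(1-θ))$; the value θ = 0.3 and the sample size are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.3  # illustrative true parameter of a Bernoulli(theta) model

# For Bernoulli: ln L(theta; x) = x*ln(theta) + (1-x)*ln(1-theta)
# Score:         s(theta; x)    = x/theta - (1-x)/(1-theta)
x = rng.binomial(1, theta, size=1_000_000)
score = x / theta - (1 - x) / (1 - theta)

# Fisher information three ways (they agree up to sampling noise):
var_score = score.var()              # Var(s(theta; X))
second_moment = (score ** 2).mean()  # E[s^2]  (E[s] = 0 at the true theta)
# -E[d^2/dtheta^2 ln L] = E[x/theta^2 + (1-x)/(1-theta)^2]
neg_hessian = (x / theta**2 + (1 - x) / (1 - theta) ** 2).mean()

analytic = 1 / (theta * (1 - theta))  # closed form for Bernoulli
print(var_score, second_moment, neg_hessian, analytic)
```

All three Monte Carlo estimates should land close to the analytic value $1/(0.3 \cdot 0.7) \approx 4.76$.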