Latest revision as of 22:58, 6 December 2020

Things to Know before we start

Before we go into the proofs, here are a few definitions and concepts to know so you don't get confused when we talk about the proof of this formula. I am giving fairly simple explanations here, just enough so you can understand what each one is. If you want to learn more, you can find resources about each topic in the "More Sources" page.

Probability density function: a function that models the probability of a random variable taking a certain value, usually denoted as $ P(X=A) $ where X is the random variable and A is the outcome we are looking for. The integral of any probability density function over its entire domain is equal to 1.
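We can check that "the integral equals 1" property numerically. Here is a small illustrative sketch (the function names are my own, not from the page), integrating the standard normal density with a simple midpoint rule:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of the normal distribution at x.
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def integrate(f, a, b, n=100_000):
    # Simple midpoint-rule numerical integration of f over [a, b].
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# The tails beyond +/-10 standard deviations are negligible,
# so this should come out very close to 1.
total = integrate(normal_pdf, -10, 10)
print(total)  # ≈ 1.0
```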

Standard Deviation: measures how spread out the numbers in a sample are. Usually denoted by σ. The standard deviation is the square root of the variance.

Variance: the average of the squared differences from the mean. Usually denoted by σ². Variance is the square of the standard deviation. It is used in calculations more often because variance is much easier to manipulate algebraically without loss of information. It is also used because it weighs outliers more heavily than the standard deviation does, which is important to investors and stock traders.
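The two definitions above fit together: compute the variance as the average squared deviation from the mean, then take its square root to get the standard deviation. A small sketch (the sample data is an assumed example):

```python
import math

def variance(xs):
    # Population variance: average of the squared deviations from the mean.
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

data = [2, 4, 4, 4, 5, 5, 7, 9]  # mean is 5
var = variance(data)             # 4.0
std = math.sqrt(var)             # 2.0, the square root of the variance
print(var, std)  # 4.0 2.0
```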

Theta (θ): is used in statistics to represent any unknown parameter of interest. In a continuous probability function, it can stand for the probability that a particular event occurs.
$ P(X=A)=θ $
Here θ denotes the probability that event A occurs under the probability function $ P(X) $.

Expected Value: the weighted average of the outcomes of a random variable, with each outcome weighted by its probability. Denoted as $ E(X) $. In simple terms, it is the long-run average value of the variable, which is not necessarily the single most likely outcome.
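A quick illustration of the weighted average (a fair six-sided die, an assumed example): the expected value is 3.5, which is not even a possible roll, so it is an average rather than the most likely outcome.

```python
# Expected value as the probability-weighted average of outcomes.
outcomes = [1, 2, 3, 4, 5, 6]   # faces of a fair die
probs = [1 / 6] * 6             # each face equally likely
expected = sum(x * p for x, p in zip(outcomes, probs))
print(expected)  # ≈ 3.5
```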

Likelihood Function: measures how well a statistical model with parameter θ explains the observed data. Denoted as $ L(θ) $.

Score: the gradient, or vector of partial derivatives, with respect to θ of the natural log of $ L(θ) $, where $ L(θ) $ is a likelihood function of some parameter θ.
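As a sketch of the score in the simplest case (a Bernoulli likelihood with k successes in n trials, an assumed example, so the gradient is a single derivative), we can write the log-likelihood, its closed-form derivative, and check the derivative numerically. Note that the score is zero at θ = k/n, the maximum-likelihood estimate:

```python
import math

def log_likelihood(theta, k, n):
    # ln L(theta) for k successes in n Bernoulli trials.
    return k * math.log(theta) + (n - k) * math.log(1 - theta)

def score(theta, k, n):
    # d/d(theta) of ln L(theta), worked out by hand.
    return k / theta - (n - k) / (1 - theta)

# Numerical check of the derivative at theta = 0.3 with k = 4, n = 10.
h = 1e-6
theta = 0.3
numeric = (log_likelihood(theta + h, 4, 10) - log_likelihood(theta - h, 4, 10)) / (2 * h)
print(score(theta, 4, 10), numeric)  # the two values should agree closely

# The score vanishes at the maximum-likelihood estimate theta = k/n = 0.4.
print(score(0.4, 4, 10))  # ≈ 0.0
```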

Back To Fisher information
