
Probability is ordinarily used to describe an attitude of mind towards some proposition of whose truth we are not certain. The proposition of interest is usually of the form "Will a specific event occur?" The attitude of mind is of the form "How certain are we that the event will occur?" The certainty we adopt can be described in terms of a numerical measure and this number, between 0 and 1, we call probability. The higher the probability of an event, the more certain we are that the event will occur. Thus, probability in an applied sense is a measure of the confidence a person has that a (random) event will occur.


Discrete Probability

Discrete probability is restricted to experiments that have finitely many, equally likely outcomes.  There are several terms to become familiar with before discussing discrete probability, listed below (a brief illustration follows the list):

  • Experiment- A procedure that yields one of a given set of possible outcomes
  • Sample space- The set of possible outcomes
  • Event- A subset of the sample space
  • Complement of an event E (Ē)- If E is an event in the sample space S, the complementary event Ē is the set of outcomes in S that are not in E: Ē = S - E
  • Union of two events A and B (A u B)- The set of outcomes that belong to A, to B, or to both
  • Intersection of two events A and B (A n B)- The set of outcomes that belong to both A and B
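
As a quick illustration of these terms, consider rolling a six-sided die: the sample space is S = {1, 2, 3, 4, 5, 6}. If E = {2, 4, 6} (an even number is rolled), A = {1, 2}, and B = {2, 3}, then Ē = S - E = {1, 3, 5}, A u B = {1, 2, 3}, and A n B = {2}.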


Now that these fundamental definitions are in place, probability itself can be defined:

If E is an event and S is a finite, nonempty sample space of equally likely outcomes, then the probability of E, written p(E), is |E| / |S|:

                                                                                p(E) = |E| / |S|

 

Examples:
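
For instance, suppose a fair six-sided die is rolled and E is the event that an even number comes up. Then S = {1, 2, 3, 4, 5, 6}, E = {2, 4, 6}, and p(E) = |E| / |S| = 3/6 = 1/2.

The same computation can be written as a short Python sketch (the sets and names below are illustrative, not part of the original page):

    S = {1, 2, 3, 4, 5, 6}            # sample space of a fair die
    E = {s for s in S if s % 2 == 0}  # event: an even number is rolled
    print(len(E) / len(S))            # p(E) = |E| / |S| = 0.5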


The probability of the complementary event Ē is given by the following equation:

                                                                               p(Ē) = 1 - p(E)

Example:
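
For instance, if a fair die is rolled and E = {6} is the event that a 6 comes up, then p(E) = 1/6, so the probability that a 6 does not come up is p(Ē) = 1 - 1/6 = 5/6.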


The probability of the union of two events A and B is given by the formula below:

                                                                               p(A u B) = p(A) + p(B) - p(A n B)

Example:
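
For instance, if a fair die is rolled, let A = {2, 4, 6} (an even number comes up) and B = {3, 6} (a multiple of 3 comes up). Then A n B = {6}, so p(A u B) = p(A) + p(B) - p(A n B) = 3/6 + 2/6 - 1/6 = 4/6 = 2/3.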

Probability Theory

In this section, the outcomes of an experiment need not all have the same probability, so probabilities may have to be assigned to individual outcomes.  One example of outcomes with unequal probabilities is a loaded die, where one number has a larger probability of appearing than the others.


Assigning Probabilities

If S is the sample space of an experiment with a finite number of outcomes, then a probability p(s) is assigned to each outcome s.

Two conditions must be met when assigning probabilities:

  1. 0 ≤ p(s) ≤ 1 for each s in S
  2. The sum of p(s) over all s in S must equal 1


Example:
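
For instance, suppose a die is loaded so that a 3 appears twice as often as any other number, while the other five numbers are equally likely. If each of the other outcomes has probability x, then 5x + 2x = 1, so x = 1/7. The assignment p(3) = 2/7 and p(s) = 1/7 for every other s satisfies both conditions above.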


Uniform distribution- If S is a set with n elements, the uniform distribution assigns the probability 1/n to each element of S

The probability of an event A- The sum of the probabilities of the outcomes in A; that is, p(A) is the sum of p(s) over all s in A
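
For instance, with the loaded die above, the probability that an odd number comes up is p({1, 3, 5}) = 1/7 + 2/7 + 1/7 = 4/7. A short Python sketch of this computation (the dictionary and names below are illustrative, not part of the original page):

    p = {1: 1/7, 2: 1/7, 3: 2/7, 4: 1/7, 5: 1/7, 6: 1/7}  # loaded-die assignment
    A = {1, 3, 5}                                          # event: an odd number is rolled
    print(sum(p[s] for s in A))                            # p(A) = 4/7 ≈ 0.571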


Probabilities of Complements and Unions of Events
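
Under any assignment of probabilities satisfying the two conditions above, the earlier formulas carry over unchanged: p(Ē) = 1 - p(E) and p(A u B) = p(A) + p(B) - p(A n B).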


[Category:MA375Spring2012Walther]
