[[Category:ECE302Fall2008_ProfSanghavi]]
[[Category:probabilities]]
[[Category:ECE302]]
[[Category:cheat sheet]]

=[[ECE302]] Cheat Sheet number 1=
 
You can get/put ideas for what should be on the cheat sheet here. <b> DO NOT SIGN YOUR NAME </b>
 
  
 
'''Sample Space, Axioms of probability (finite spaces, infinite spaces)'''  
 
  
1. <math> P(A) \geq 0 </math> for all events A

2. <math>P(\Omega)=1</math>

3. If A & B are disjoint then <math>P(A\cup B)=P(A)+P(B)</math>
  
 
'''Properties of Probability laws'''
 
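Some standard consequences of the axioms that typically go in this section:

<math>P(A^c)=1-P(A)\!</math>

If <math>A \subseteq B</math> then <math>P(A) \le P(B)</math>

<math>P(A\cup B)=P(A)+P(B)-P(A\cap B)\!</math>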

'''Definition of conditional probability, and properties thereof'''
 
<math>P(A|B) = \frac{P(A \cap B)}{P(B)}</math>
 
  
Properties:
  
 
1) <math>P(A|B) \ge 0</math>
 
  
2) <math>P( \Omega |B) = 1\!</math>
  
3) if <math>A_1</math> and <math>A_2</math> are disjoint, <math>P(A_1 \cup A_2|B) = P(A_1|B) + P(A_2|B)</math>
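
For example, roll a fair die and let A = "roll a 2", B = "roll an even number": <math>P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{1/6}{1/2} = \frac{1}{3}</math>.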
  
 
'''Bayes rule and total probability'''
 
  
Total probability, where <math>A_1, A_2, ..., A_n</math> partition the sample space:

<math>P(B)=P(B\cap A_1) + P(B \cap A_2) +...+P(B\cap A_n)= P(B|A_1)P(A_1)+P(B|A_2)P(A_2)+...+P(B|A_n)P(A_n)</math>
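
Bayes' rule then follows from the definition of conditional probability:

<math>P(A_i|B)=\frac{P(B|A_i)P(A_i)}{P(B)}=\frac{P(B|A_i)P(A_i)}{\sum_{j=1}^{n} P(B|A_j)P(A_j)}</math>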
  
 
'''Definitions of Independence and Conditional independence'''
 
  
Independence:

A & B are independent if <math>P(A\cap B)=P(A)P(B)</math>

side note: if A & B are independent (and P(B)>0) then <math>P(A|B)=P(A)\!</math>


Conditional Independence:

A & B are conditionally independent given C if <math>P(A\cap B|C)=P(A|C)P(B|C)=\frac{P(A\cap C)}{P(C)} \frac{P(B\cap C)}{P(C)}</math>
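
The side note above follows directly from the definition of conditional probability:

<math>P(A|B)=\frac{P(A\cap B)}{P(B)}=\frac{P(A)P(B)}{P(B)}=P(A)</math>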
  
 
'''Definition and basic concepts of random variables, PMFs'''
 
  
Random Variable: a map/function from outcomes to real values

Probability Mass Function (PMF): <math>P_X (x) = P(X=x)</math>


'''The common random variables:''' Bernoulli, binomial, geometric, and how they come about in problems. Also their PMFs.
  
Geometric RV:

where X is the # of independent trials (each with success probability p) until the first success

<math>P(X=k) = p(1-p)^{k-1}</math> for <math>k \ge 1</math>

<math> E[X] = 1/p \!</math>

<math>Var(X)=\frac{1-p}{p^2}</math>
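
For example, the chance the first success comes on exactly the third trial is <math>P(X=3)=p(1-p)^2</math>; with p = 1/2 this is 1/8.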

Binomial R.V. "many biased coins"

with parameters n and p, where n is the number of trials.

X is the # of successes in n trials, i.e. the sum of n independent, identically distributed Bernoulli trials.

<math>P(X=k) = {n \choose k} p^k (1-p)^{n-k}</math> for k = 0, 1, 2, ..., n

<math>E[X]=np \!</math>, <math>Var(X)=np(1-p) \!</math>


Bernoulli R.V. "one biased coin"

with parameter p

X=1 if A occurs and X=0 otherwise

<math>P_X(1)=p \!</math>, <math>P_X(0)=1-p \!</math>

<math>E[X]=p \!</math>

<math>Var(X)=p(1-p) \!</math>
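
For example, the probability of exactly 2 heads in 3 flips of a fair coin is <math>{3 \choose 2}\left(\tfrac{1}{2}\right)^2\left(\tfrac{1}{2}\right)^1 = \tfrac{3}{8}</math>.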
  
 
'''Definition of expectation and variance''' and their properties
 
  
<math>E[X]=\sum_x x P_X (x)</math>

<math>E[aX+b]=aE[X]+b</math> where a and b are constants

<math> Var(X) = E[X^2] - (E[X])^2 \!</math>

<math>Var(aX+b)=a^2 Var(X) \!</math>
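
For example, for a Bernoulli RV with parameter p, <math>E[X^2]=p</math>, so <math>Var(X)=p-p^2=p(1-p)</math>, matching the value listed above.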
  
  
 
'''Joint PMFs of more than one random variable'''
 
Joint Probability Mass Function:

<math>P_{XY}(x,y)=P(\{X=x\}\cap \{Y=y\})</math>

Marginal PMFs: <math>P_X(x)=\sum_y P_{XY}(x,y)</math> and <math>P_Y(y)=\sum_x P_{XY}(x,y)</math>

<math>E[g(X,Y)]=\sum_{x,y} g(x,y)P_{XY}(x,y)</math>

<math>E[aX+bY]=aE[X]+bE[Y]</math>
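
For example (made-up numbers): if <math>P_{XY}(0,0)=0.4</math>, <math>P_{XY}(0,1)=0.2</math>, <math>P_{XY}(1,0)=0.1</math>, <math>P_{XY}(1,1)=0.3</math>, then the marginal is <math>P_X(0)=0.4+0.2=0.6</math> and <math>P_X(1)=0.1+0.3=0.4</math>.
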
----
[[Main_Page_ECE302Fall2008sanghavi|Back to ECE302 Fall 2008 Prof. Sanghavi]]
