Introduction

On this page, we will derive Bayes’ rule for both discrete and continuous random variables.

Derivation of Bayes' Rule

We will consider the derivation of Bayes’ rule in both the discrete and the continuous case.

Discrete Random Variables

In the discrete case, we have the conditional probability formula,

$ P(x|y)=\frac{P(x \cap y)}{P(y)} (1) $

Now, we can rewrite this equation as

$ P(x \cap y)= P(x|y) \cdot P(y) (2) $

Now, because intersection is commutative, we can write $ P(x \cap y) $ as,

$ P(x \cap y)= P(y \cap x) (3) $

Now, using the conditional probability definition, we can write equation (3) as

$ P(x|y) \cdot P(y)= P(y|x) \cdot P(x) (4) $

Now, dividing both sides of equation (4) by P(y), we obtain Bayes' rule for discrete random variables,

$ P(x|y) = \frac{P(y|x) \cdot P(x)}{P(y)} (5) $
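
As a quick numerical sanity check (not part of the original derivation), the following Python sketch builds a small, made-up joint probability table for two binary random variables and verifies that the right-hand side of equation (5) reproduces the conditional probability computed directly from the definition in equation (1).

from fractions import Fraction as F

# A made-up joint pmf P(x, y) over x in {0, 1} and y in {0, 1}; the entries sum to 1.
joint = {(0, 0): F(1, 8), (0, 1): F(3, 8),
         (1, 0): F(1, 4), (1, 1): F(1, 4)}

def marginal_x(x):
    return sum(p for (xi, _), p in joint.items() if xi == x)

def marginal_y(y):
    return sum(p for (_, yi), p in joint.items() if yi == y)

def cond_x_given_y(x, y):          # P(x|y) from the definition, equation (1)
    return joint[(x, y)] / marginal_y(y)

def cond_y_given_x(y, x):          # P(y|x), the same definition with the roles swapped
    return joint[(x, y)] / marginal_x(x)

for x in (0, 1):
    for y in (0, 1):
        lhs = cond_x_given_y(x, y)                                   # left-hand side of (5)
        rhs = cond_y_given_x(y, x) * marginal_x(x) / marginal_y(y)   # right-hand side of (5)
        assert lhs == rhs
print("Equation (5) holds for every (x, y) in the table.")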


Continuous Random Variables

Now, we can consider Bayes' rule when we have continuous random variables. We know that the conditional density for continuous random variables is,

$ f_{X|Y} (x|y) = \frac{f_{X,Y} (x,y)}{f_{Y} (y)} (6) $

Now, we can write another equation similar to equation (6) as follows,

$ f_{Y|X} (y|x) = \frac{f_{Y,X} (y,x)}{f_{X} (x)} (7) $

But because $ f_{Y,X}(y,x) $ is the same as $ f_{X,Y}(x,y) $, we can rewrite equation (7) as follows,

$ f_{Y|X} (y|x) = \frac{f_{X,Y} (x,y)}{f_{X} (x)} (8) $

Now, by rearranging equations (6) and (8), we can write,

$ f_{X,Y} (x,y)= f_{X|Y} (x|y) \times f_{Y} (y) (9) $


$ f_{X,Y} (x,y)= f_{Y|X} (y|x) \times f_{X} (x) (10) $

Now, by equating equations (9) and (10), we get Bayes' rule for continuous random variables,

$ f_{X|Y} (x|y) = \frac{ f_{Y|X} (y|x) \times f_{X} (x) } { f_{Y} (y) } (11) $
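
The same identity can be checked numerically in the continuous case. The sketch below is our own illustration (the joint density f_{X,Y}(x, y) = x + y on the unit square is an assumed example, not from the text above): it approximates the marginal densities with a midpoint-rule sum and confirms that the right-hand side of equation (11) matches the conditional density f_{X|Y}(x|y) = (x + y) / (y + 1/2) worked out by hand from equation (6).

# Joint density on the unit square [0, 1] x [0, 1]; it integrates to 1.
def f_joint(x, y):
    return x + y

N = 10_000                 # resolution of the midpoint-rule integration
h = 1.0 / N

def f_X(x):                # marginal f_X(x) = integral of f_joint(x, y) dy over [0, 1]
    return sum(f_joint(x, (k + 0.5) * h) for k in range(N)) * h

def f_Y(y):                # marginal f_Y(y) = integral of f_joint(x, y) dx over [0, 1]
    return sum(f_joint((k + 0.5) * h, y) for k in range(N)) * h

x, y = 0.3, 0.7
f_y_given_x = f_joint(x, y) / f_X(x)          # f_{Y|X}(y|x), equation (8)
rhs_bayes   = f_y_given_x * f_X(x) / f_Y(y)   # right-hand side of equation (11)
exact       = (x + y) / (y + 0.5)             # f_{X|Y}(x|y) computed by hand
print(rhs_bayes, exact)                       # both are approximately 0.8333
assert abs(rhs_bayes - exact) < 1e-9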


Example 1

Let us suppose that a certain school has 200 students, 45 of whom are members of a certain club. At the beginning of the academic year, the school offered a seminar that was attended by 70 students, 25 of whom are members of the club.

Now, let’s suppose that the probability that a student is a member of the club is P(A) and the probability that a student attended the seminar is P(B). Let P(A^c) represent the probability that a student is not a member of the club and P(B^c) represent the probability that a student did not attend the seminar. We can easily see that P(A) = 45/200, P(A^c) = 155/200, P(B) = 70/200, and P(B^c) = 130/200.
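
These four probabilities follow directly from the counts, as the short Python check below illustrates (the variable names are ours, introduced only for this illustration; the fractions print in reduced form).

from fractions import Fraction as F

total, club, seminar = 200, 45, 70

P_A  = F(club, total)          # P(A):   member of the club,       45/200
P_Ac = 1 - P_A                 # P(A^c): not a member,            155/200
P_B  = F(seminar, total)       # P(B):   attended the seminar,     70/200
P_Bc = 1 - P_B                 # P(B^c): did not attend,          130/200

print(P_A, P_Ac, P_B, P_Bc)    # 9/40 31/40 7/20 13/20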

Now, let’s try to answer the following question: What is the probability that a student who attended the seminar is a member of the club? We can answer this question easily by using Bayes' rule. We know that $ P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)} $

So, since $ P(B|A) = \frac{25}{45} $ (25 of the 45 club members attended the seminar), we can calculate $ P(A|B) = \frac{ \frac{25}{45} \cdot \frac{45}{200} } { \frac{70}{200} } = \frac {5}{14} $

Now, let us try to answer another question: What is the probability that a student who did NOT attend the seminar is a member of the club? Again, we answer this question by using Bayes' rule. We know that $ P(A|B^c) = \frac{P(B^c|A) \cdot P(A)}{P(B^c)} $

So, since $ P(B^c|A) = \frac{20}{45} $ (the other 20 club members did not attend), we can calculate $ P(A|B^c) = \frac{ \frac{20}{45} \cdot \frac{45}{200} } { \frac{130}{200} } = \frac {2}{13} $


Now, let us try to answer one more question: What is the probability that a student who attended the seminar is NOT a member of the club? Again, we answer this question by using Bayes' rule. We know that $ P(A^c|B) = \frac{P(B|A^c) \cdot P(A^c)}{P(B)} $

So, since $ P(B|A^c) = \frac{45}{155} $ (70 - 25 = 45 of the attendees are not club members, out of 155 non-members), we can calculate $ P(A^c|B) = \frac{ \frac{45}{155} \cdot \frac{155}{200} } { \frac{70}{200} } = \frac {45}{70} = \frac{9}{14} $


We can see now that $ \textbf{ P}(A|B) > \textbf{P}(A) $ because attending the seminar is evidence in favor of club membership: the attendance rate among club members, P(B|A) = 25/45 ≈ 0.56, is higher than the overall attendance rate, P(B) = 70/200 = 0.35, so conditioning on B raises the probability of membership from P(A) = 45/200 = 0.225 to P(A|B) = 5/14 ≈ 0.357.
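
The three posterior probabilities, and the comparison with P(A), can be verified with exact fractions in a few lines of Python (again an illustration we add; the variable names are ours):

from fractions import Fraction as F

P_A, P_Ac = F(45, 200), F(155, 200)     # club member / not a member
P_B, P_Bc = F(70, 200), F(130, 200)     # attended / did not attend
P_B_given_A  = F(25, 45)                # 25 of the 45 members attended
P_Bc_given_A = F(20, 45)                # the other 20 members did not attend
P_B_given_Ac = F(45, 155)               # 70 - 25 = 45 non-members attended

P_A_given_B  = P_B_given_A  * P_A / P_B     # = 5/14
P_A_given_Bc = P_Bc_given_A * P_A / P_Bc    # = 2/13
P_Ac_given_B = P_B_given_Ac * P_Ac / P_B    # = 9/14, i.e. 45/70

print(P_A_given_B, P_A_given_Bc, P_Ac_given_B)
assert P_A_given_B > P_A                    # attending the seminar raises P(A)

The assertion on the last line is exactly the comparison made above.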





