= Bayes Decision Theory =

'''Introduction'''
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Bayesian decision theory is a valuable approach to solving pattern classification problems. It quantifies the tradeoffs between various classification decisions using the probabilities of events occurring and the costs that accompany those decisions. Here, we assume that the problem is posed in probabilistic terms and that all relevant probability values are known (it is important to note that in reality this is not always the case).
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Consider a situation where we have a stack of cards in which each card is either a diamond or a spade. We can denote ''x'' = ''x<sub>1</sub>'' for diamonds and ''x'' = ''x<sub>2</sub>'' for spades. Suppose we want to design a system that will predict the next card to come up. We know the prior probability ''P(x<sub>1</sub>)'' that the next card is diamonds and the prior probability ''P(x<sub>2</sub>)'' that it is spades, and the two probabilities sum to 1 (since these are the only two possibilities). We can therefore use the following decision rule: if ''P(x<sub>1</sub>)'' > ''P(x<sub>2</sub>)'', decide the card is diamonds; otherwise, decide it is spades. How well this works depends on how much greater ''P(x<sub>1</sub>)'' is. If it is much greater than ''P(x<sub>2</sub>)'', our decision will favor diamonds and be right most of the time; however, if ''P(x<sub>1</sub>)'' = ''P(x<sub>2</sub>)'', we have only a 50% chance of being right.
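The prior-only decision rule above can be sketched in a few lines of Python (the function name and the example priors are illustrative, not from the article):

```python
# Decide the next card's suit from prior probabilities alone,
# following the rule: pick diamonds if P(x1) > P(x2), else spades.

def decide_from_priors(p_diamond, p_spade):
    """Return the suit with the larger prior probability."""
    assert abs(p_diamond + p_spade - 1.0) < 1e-9, "priors must sum to 1"
    return "diamond" if p_diamond > p_spade else "spade"

print(decide_from_priors(0.7, 0.3))  # a strongly skewed prior favors diamonds
print(decide_from_priors(0.4, 0.6))  # here the rule decides spades
```

Note that this rule always makes the same decision for the same priors; its error rate is simply the smaller of the two prior probabilities.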
  
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; However, in most cases we won't be making decisions with so little information. For example, if we had information about the colors of the shapes on the cards (red for diamonds and black for spades), we could describe this as a variable ''y'' and consider ''y'' to be a random variable whose distribution depends on which suit the card belongs to.
  
  
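Once an observation ''y'' is available, the prior can be updated to a posterior using Bayes' theorem, ''P(x | y)'' = ''P(y | x)P(x) / P(y)'', and the decision rule compares posteriors instead of priors. A minimal sketch follows; the likelihood values ''P(y | x)'' below are illustrative assumptions (a slightly noisy color observation), not values from the article:

```python
# Update the prior with an observed color y via Bayes' theorem,
# then decide the class with the larger posterior probability.

def posterior(prior, likelihood, y):
    """prior: dict class -> P(x); likelihood: dict class -> {y: P(y|x)}."""
    evidence = sum(likelihood[x][y] * prior[x] for x in prior)  # P(y)
    return {x: likelihood[x][y] * prior[x] / evidence for x in prior}

prior = {"diamond": 0.5, "spade": 0.5}
likelihood = {"diamond": {"red": 0.9, "black": 0.1},   # assumed P(y|x)
              "spade":   {"red": 0.2, "black": 0.8}}

post = posterior(prior, likelihood, "red")
decision = max(post, key=post.get)  # diamonds, since P(diamond|red) > P(spade|red)
```

With these assumed numbers, observing "red" raises the posterior for diamonds from 0.5 to 0.45/0.55 &asymp; 0.818, so the rule decides diamonds.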
 
[[Category:Honors_project]] [[Category:ECE302]] [[Category:Pattern_recognition]]

Revision as of 13:48, 15 February 2013

