 
  
 
= Bayes Decision Theory  =
 
 
'''Introduction'''  
 
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Bayesian decision theory is a valuable approach to solving pattern classification problems. It quantifies the tradeoffs between various classification decisions using the probabilities of events occurring and the costs that accompany those decisions. Here, we assume that the problem is posed in probabilistic terms and that all relevant probability values are known (it is important to note that in practice this is not always the case).
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; Consider a situation where we have a stack of cards in which each card is either a diamond or a spade. We denote x = x<sub>1</sub> for diamonds and x = x<sub>2</sub> for spades. Suppose we want to design a system that predicts the next card to come up. We know the prior probability P(x<sub>1</sub>) that the next card is a diamond and the prior probability P(x<sub>2</sub>) that it is a spade, and the two probabilities sum to 1 (since these are the only two possibilities). We can therefore use the following decision rule: if P(x<sub>1</sub>) > P(x<sub>2</sub>), decide the card is a diamond; otherwise, decide it is a spade. How well this works depends on how much greater P(x<sub>1</sub>) is. If it is much greater than P(x<sub>2</sub>), our decision will favor diamonds and be right most of the time; however, if P(x<sub>1</sub>) = P(x<sub>2</sub>), we have only a 50% chance of being right.
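The prior-only decision rule above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original article: the 0.7/0.3 priors and the function names <code>decide</code> and <code>simulate</code> are assumed values chosen for the example.

```python
import random

# Assumed priors for illustration: P(x1) = 0.7 for diamonds, P(x2) = 0.3 for spades.
priors = {"diamonds": 0.7, "spades": 0.3}

def decide(priors):
    """Prior-only rule: always choose the class with the larger prior."""
    return max(priors, key=priors.get)

def simulate(priors, trials=10000, seed=42):
    """Estimate how often the rule is right when cards are drawn from the priors."""
    rng = random.Random(seed)
    labels = list(priors)
    weights = [priors[c] for c in labels]
    guess = decide(priors)  # the rule makes the same guess every time
    hits = sum(rng.choices(labels, weights)[0] == guess for _ in range(trials))
    return hits / trials
```

As the text notes, the accuracy of this rule is simply the larger prior: with the assumed 0.7/0.3 priors it is right about 70% of the time, and with equal priors it drops to about 50%, no better than guessing.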
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; However, we
 
  
 
[[Category:Honors_project]] [[Category:ECE302]] [[Category:Pattern_recognition]]
 

Revision as of 22:29, 14 February 2013

