

Bayes Decision Rule



Presentation Transcript


  1. Bayes Decision Rule Comp328 tutorial 3 Kai Zhang

  2. Outline • Basic notions • Three examples • Minimizing error rate • Decision functions

  3. Prior Probability • w - state of nature, e.g. • w1 the object is a fish, w2 the object is a bird, etc. • w1 the course is good, w2 the course is bad • etc. • A priori probability (or prior) P(wi)

  4. Class-Conditional Probability • Observation x, e.g. • The object has wings • The object’s length is 20 cm • The first lecture is interesting • Class-conditional probability density (mass) function p(x|w)

  5. Bayes Decision Rule Suppose the priors P(wj) and conditional densities p(x|wj) are known. Bayes’ theorem then gives the posterior $P(w_j|x) = \frac{p(x|w_j)\,P(w_j)}{p(x)}$, i.e. posterior = (likelihood × prior) / evidence, where the evidence is $p(x) = \sum_j p(x|w_j)\,P(w_j)$

  6. Example • Bayes Decision Rule • If P(apple | color) > P(peach | color) then choose apple • Note that the evidence p(color) is only necessary for normalization purposes; it does not affect the decision rule
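A minimal sketch of this rule in Python; the priors, the color likelihoods, and the `classify` helper are illustrative assumptions, not values from the slides:

```python
# Hypothetical priors P(class) and class-conditional probabilities P(color | class).
priors = {"apple": 0.6, "peach": 0.4}
likelihood = {
    "apple": {"red": 0.7, "yellow": 0.2, "green": 0.1},
    "peach": {"red": 0.2, "yellow": 0.7, "green": 0.1},
}

def classify(color):
    # Unnormalized posteriors: P(class | color) ∝ P(color | class) * P(class).
    # The evidence p(color) is the same for every class, so it can be dropped.
    scores = {c: likelihood[c][color] * priors[c] for c in priors}
    return max(scores, key=scores.get)

print(classify("red"))     # apple: 0.7*0.6 = 0.42 beats 0.2*0.4 = 0.08
print(classify("yellow"))  # peach: 0.7*0.4 = 0.28 beats 0.2*0.6 = 0.12
```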

  7. Misclassification Error • After observing x, an error occurs when the decision differs from the true class, so for two classes $P(\mathrm{error}|x) = \min[P(w_1|x),\,P(w_2|x)]$ • The average error is $P(\mathrm{error}) = \int P(\mathrm{error}|x)\,p(x)\,dx$ • The Bayes decision rule minimizes the average probability of error, since it forces P(error|x) to its minimum at every x

  8. Bayes decision minimizes the average error rate

  9. Examples • We know the ratio of unqualified products for each of the 4 workers • Given an unqualified product, which worker did it most likely come from? A1, A2, A3, A4: (products of) the four workers; B: the event that a product is unqualified. A sketch of the computation follows below.
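This is Bayes’ theorem combined with the law of total probability. A Python sketch; the workers’ output shares and defect rates below are illustrative assumptions, since the transcript does not capture the slide’s actual figures:

```python
# Assumed figures (the slide's real numbers are not in the transcript):
# share[i]  = P(A_i), fraction of all products made by worker i
# defect[i] = P(B | A_i), that worker's rate of unqualified products
share  = [0.15, 0.20, 0.30, 0.35]
defect = [0.05, 0.04, 0.03, 0.02]

# Law of total probability: P(B) = sum_i P(B | A_i) * P(A_i)
p_b = sum(d * s for d, s in zip(defect, share))

# Bayes' theorem: P(A_i | B) = P(B | A_i) * P(A_i) / P(B)
posterior = [d * s / p_b for d, s in zip(defect, share)]
for i, p in enumerate(posterior, start=1):
    print(f"P(A{i} | B) = {p:.3f}")
print("Most likely source: worker A%d" % (posterior.index(max(posterior)) + 1))
```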

  10. Example • The objects can be classified as either GREEN or RED • Our task is to classify new cases as they arrive, i.e., decide to which class each belongs, based on the currently existing objects

  11. Prior probabilities • In this case the percentages of GREEN and RED objects can be used to predict outcomes before they actually happen: P(GREEN) = 40/60 and P(RED) = 20/60 • Likelihood / class-conditional probability, taken as the fraction of each class falling in the vicinity of the new point x: p(x|GREEN) = 1/40 and p(x|RED) = 3/20

  12. Object classification • Posterior probabilities and decision rule: P(GREEN|x) ∝ p(x|GREEN) P(GREEN) = (1/40)(40/60) = 1/60, while P(RED|x) ∝ p(x|RED) P(RED) = (3/20)(20/60) = 3/60 • Since 3/60 > 1/60, the new object is classified as RED

  13. Discriminant Functions • A discriminant function is one way to represent a pattern classifier; the classifier assigns a feature vector x to class i if gi(x) > gj(x) for all j ≠ i • Bayes classifiers can be represented in this way, e.g. gi(x) = P(wi|x), or equivalently gi(x) = ln p(x|wi) + ln P(wi)
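A sketch of such a classifier in Python, assuming one-dimensional Gaussian class-conditional densities; the means, variances, and priors are illustrative assumptions:

```python
import numpy as np

# Assumed 1-D Gaussian class models (illustrative parameters).
means  = np.array([0.0, 3.0])   # class means mu_i
vars_  = np.array([1.0, 2.0])   # class variances sigma_i^2
priors = np.array([0.5, 0.5])   # class priors P(w_i)

def g(x):
    # Discriminant g_i(x) = ln p(x|w_i) + ln P(w_i), with Gaussian p(x|w_i).
    log_lik = -0.5 * np.log(2 * np.pi * vars_) - (x - means) ** 2 / (2 * vars_)
    return log_lik + np.log(priors)

x = 1.2
scores = g(x)
print(scores, "-> assign class", scores.argmax() + 1)
```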

  14. Decision Boundaries • Discriminant functions can take different forms, but the decision rule they induce is the same • The decision boundaries arise where the largest of the joint probabilities p(x, wi) changes from one class to another, as in the figure above

  15. Discriminant Functions for Normal Probability Density • Case I: equal covariance (spherical Gaussian), Σi = σ²I
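The slide’s formula was not captured in the transcript; for reference, the standard discriminant for spherical Gaussians with $\Sigma_i = \sigma^2 I$ is:

$$g_i(x) = -\frac{\lVert x - \mu_i \rVert^2}{2\sigma^2} + \ln P(w_i)$$

Dropping the terms common to all classes leaves a linear discriminant $g_i(x) = w_i^\top x + w_{i0}$ with $w_i = \mu_i/\sigma^2$ and $w_{i0} = -\mu_i^\top\mu_i/(2\sigma^2) + \ln P(w_i)$, so the decision boundaries are hyperplanes orthogonal to the lines joining the class means.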

  16. Case II: arbitrary & identical covariance
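Again the slide’s formula is missing from the transcript; the standard result when all classes share one arbitrary covariance matrix $\Sigma$ is:

$$g_i(x) = -\tfrac{1}{2}(x-\mu_i)^\top \Sigma^{-1} (x-\mu_i) + \ln P(w_i)$$

Expanding and dropping the class-independent term $x^\top\Sigma^{-1}x$ again yields a linear discriminant, now with $w_i = \Sigma^{-1}\mu_i$ and $w_{i0} = -\tfrac{1}{2}\mu_i^\top\Sigma^{-1}\mu_i + \ln P(w_i)$; the boundaries are still hyperplanes, but in general no longer orthogonal to the line between the means.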

  17. Case III: arbitrary covariance (a different Σi per class)
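For completeness, since the slide’s formula is also missing here, the standard discriminant when each class has its own covariance $\Sigma_i$ is:

$$g_i(x) = -\tfrac{1}{2}(x-\mu_i)^\top \Sigma_i^{-1} (x-\mu_i) - \tfrac{1}{2}\ln\lvert\Sigma_i\rvert + \ln P(w_i)$$

The quadratic term no longer cancels across classes, so the discriminants are quadratic in x and the decision boundaries are hyperquadrics (hyperplanes, ellipsoids, paraboloids, or hyperboloids).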
