
1 Bayes Decision Rule
Comp328 tutorial 3, Kai Zhang

2 Outline
Basic notions; three examples; minimizing the error rate; discriminant (decision) functions

3 Prior Probability
w denotes the state of nature (the class), e.g. w1 = the object is a fish, w2 = the object is a bird; or w1 = the course is good, w2 = the course is bad, etc. The a priori probability (or prior) is P(wi).

4 Class-Conditional Probability
An observation x, e.g. the object has wings, the object's length is 20 cm, or the first lecture is interesting. The class-conditional probability density (mass) function is p(x|w).

5 Bayes Decision Rule
Suppose the priors P(wj) and the class-conditional densities p(x|wj) are known. Bayes' rule gives the posterior:

P(wj|x) = p(x|wj) P(wj) / p(x)

that is, posterior = (likelihood × prior) / evidence, where the evidence is p(x) = Σj p(x|wj) P(wj).

6 Example: Bayes Decision Rule
If P(apple | color) > P(peach | color), then choose apple. Note that the evidence p(color) is needed only for normalization; it does not affect the decision rule.
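
A minimal numeric sketch of this rule (the color likelihoods and priors below are invented for illustration; they are not from the slides):

```python
# Hypothetical numbers: likelihoods p(color | class) and priors P(class).
likelihood = {"apple": 0.8, "peach": 0.3}   # p(color | class), assumed
prior      = {"apple": 0.4, "peach": 0.6}   # P(class), assumed

# Unnormalized posteriors: p(color | class) * P(class).
# The evidence p(color) is a common factor, so it can be skipped.
scores = {c: likelihood[c] * prior[c] for c in likelihood}
decision = max(scores, key=scores.get)
print(scores, "->", decision)   # {'apple': 0.32, 'peach': 0.18} -> apple
```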

7 Misclassification Error
After observing x, an error occurs when the decision differs from the truth: in the two-class case, P(error|x) = P(w1|x) if we decide w2, and P(w2|x) if we decide w1. The average error is

P(error) = ∫ P(error|x) p(x) dx

The Bayes decision rule minimizes the average probability of error, since it forces P(error|x) = min[P(w1|x), P(w2|x)] at every x.

8 Bayes decision minimizes the average error rate

9 Examples
We know the rate of unqualified (defective) products for each of the 4 workers. Given an unqualified product, which worker did it most likely come from? Let A1, A2, A3, A4 be the events that the product comes from worker 1, ..., 4, and B the event that the product is unqualified. Bayes' rule gives P(Ai|B) = P(B|Ai) P(Ai) / Σj P(B|Aj) P(Aj).
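
A short numeric sketch of this example (the production shares and defect rates below are assumptions for illustration; the slide's actual numbers are not in the transcript):

```python
# Assumed production shares P(Ai) and defect rates P(B | Ai) per worker.
share  = [0.25, 0.25, 0.30, 0.20]   # P(A1)..P(A4), assumed
defect = [0.05, 0.02, 0.04, 0.10]   # P(B | Ai), assumed

# Bayes' rule: P(Ai | B) = P(B | Ai) P(Ai) / sum_j P(B | Aj) P(Aj)
joint = [d * s for d, s in zip(defect, share)]
evidence = sum(joint)
posterior = [j / evidence for j in joint]

best = max(range(4), key=lambda i: posterior[i])
print(posterior)                     # posterior probability for each worker
print("most likely worker:", best + 1)
```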

10 Example
The objects can be classified as either GREEN or RED. Our task is to classify new cases as they arrive, i.e., decide to which class they belong, based on the currently existing objects.

11 Priors and Likelihoods
Prior probabilities: the proportions of GREEN and RED objects can be used to predict outcomes before they actually happen, P(GREEN) = 40/60 and P(RED) = 20/60. Likelihood / class-conditional probability of the new case x under each class: p(x|GREEN) = 1/40 and p(x|RED) = 3/20.

12 Object Classification
Posterior probabilities and the decision rule: using the numbers above, P(GREEN|x) ∝ (40/60)(1/40) = 1/60 and P(RED|x) ∝ (20/60)(3/20) = 1/20. Since 1/20 > 1/60, the new case is classified as RED.
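
The same computation in code, using the counts from the previous slide (40 GREEN and 20 RED objects, with likelihoods 1/40 and 3/20 for the new case):

```python
from fractions import Fraction as F

# Counts from the slides: 40 GREEN and 20 RED objects out of 60.
prior = {"GREEN": F(40, 60), "RED": F(20, 60)}
# Class-conditional likelihoods of the new case x.
likelihood = {"GREEN": F(1, 40), "RED": F(3, 20)}

# Unnormalized posteriors; the evidence cancels in the comparison.
score = {c: prior[c] * likelihood[c] for c in prior}
print(score)                                    # GREEN: 1/60, RED: 1/20
print("decision:", max(score, key=score.get))   # RED
```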

13 Discriminant Functions
A discriminant function is one way to represent a pattern classifier: the classifier assigns a feature vector x to class i if gi(x) > gj(x) for all j ≠ i. Bayes classifiers can be represented this way, e.g. gi(x) = P(wi|x), or equivalently gi(x) = ln p(x|wi) + ln P(wi).
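
A minimal sketch of this representation (the 1-D Gaussian likelihoods and parameters below are placeholders; any monotone transform of the posterior yields the same decisions):

```python
import math

def gaussian_loglik(mu, sigma):
    """Log density of a 1-D Gaussian, used as a placeholder likelihood."""
    return lambda x: (-0.5 * ((x - mu) / sigma) ** 2
                      - math.log(sigma * math.sqrt(2 * math.pi)))

# Discriminants g_i(x) = ln p(x | w_i) + ln P(w_i); the evidence ln p(x)
# is common to all classes, so it is dropped.
def classify(x, log_likelihoods, priors):
    scores = [ll(x) + math.log(p) for ll, p in zip(log_likelihoods, priors)]
    return max(range(len(scores)), key=lambda i: scores[i])

# Two hypothetical classes with assumed parameters.
lls = [gaussian_loglik(0.0, 1.0), gaussian_loglik(3.0, 1.0)]
print(classify(1.0, lls, [0.5, 0.5]))   # 0: x=1.0 is closer to class 0's mean
```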

14 Decision Boundaries Discriminant functions can take different forms, but the resulting decision rule is the same. The figure shows the decision boundaries, i.e. the points where the joint probabilities of the competing classes are equal.

15 Discriminant Functions for Normal Probability Density
Case I: equal, spherical covariance (spherical Gaussian), Σi = σ²I. The discriminant reduces to gi(x) = -||x - μi||² / (2σ²) + ln P(wi), which is linear in x (a minimum-distance classifier when the priors are equal).

16 Case II: arbitrary but identical covariance, Σi = Σ. The discriminant is gi(x) = -(1/2)(x - μi)ᵀ Σ⁻¹ (x - μi) + ln P(wi), again linear in x, so the decision boundaries are hyperplanes.

17 Case III: arbitrary covariance Σi per class. The general discriminant gi(x) = -(1/2)(x - μi)ᵀ Σi⁻¹ (x - μi) - (1/2) ln|Σi| + ln P(wi) is quadratic in x, so the decision boundaries can be hyperquadrics.
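
A sketch of the general (Case III) discriminant with NumPy; Cases I and II fall out by passing shared or spherical covariances. The class parameters below are assumptions for illustration:

```python
import numpy as np

def gaussian_discriminant(x, mu, cov, prior):
    """g_i(x) = -0.5 (x-mu)^T cov^{-1} (x-mu) - 0.5 ln|cov| + ln P(w_i)."""
    diff = x - mu
    return (-0.5 * diff @ np.linalg.solve(cov, diff)
            - 0.5 * np.log(np.linalg.det(cov))
            + np.log(prior))

# Two hypothetical classes with assumed means, covariances, and priors.
mus    = [np.array([0.0, 0.0]), np.array([2.0, 2.0])]
covs   = [np.eye(2), np.array([[2.0, 0.3], [0.3, 1.0]])]
priors = [0.6, 0.4]

x = np.array([1.5, 1.0])
scores = [gaussian_discriminant(x, m, c, p)
          for m, c, p in zip(mus, covs, priors)]
print("decide class", int(np.argmax(scores)))
```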

