Tutorial 1: Optimal Bayes Classification
Theory Review
We assume all variables are random variables with known distributions.
Notation:
- Y = {ω_1, ..., ω_c} — a finite set of classes (categories)
- X — the input space (patterns)
- α: X → Y — a classifier, which maps each input pattern to a class
Basic Assumption
The following distributions are known:
- The prior probability P(ω_i) for each class ω_i.
- The class-conditional probability P(x | ω_i) of the input x, given that the class is ω_i. If X is a continuous space, p(x | ω_i) denotes the probability density.
Reminder: Fish Classification
Two classes. The prior probability P(ω_i) can be estimated from the relative frequency of each class in a sample, and the class-conditional probability P(x | ω_i) can be estimated by a frequency histogram of the input feature within each class.
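The estimation step above can be sketched in Python. The class names, sample sizes, and distribution parameters below are invented for illustration; only the recipe (relative frequencies for the prior, normalized histograms for the class-conditionals) comes from the slide:

```python
import numpy as np

# Hypothetical labeled sample for two fish classes (all numbers assumed).
rng = np.random.default_rng(0)
labels = rng.choice(["sea bass", "salmon"], size=1000, p=[0.6, 0.4])
lightness = np.where(labels == "sea bass",
                     rng.normal(7.0, 1.0, size=1000),
                     rng.normal(4.0, 1.0, size=1000))

# Prior: relative frequency of each class in the sample.
priors = {c: np.mean(labels == c) for c in ("sea bass", "salmon")}

# Class-conditional density: normalized frequency histogram per class.
bins = np.linspace(0.0, 12.0, 25)  # 24 bins of width 0.5
cond_hist = {c: np.histogram(lightness[labels == c], bins=bins, density=True)[0]
             for c in ("sea bass", "salmon")}
```

With `density=True`, each histogram integrates to 1 over the bin range, so `cond_hist[c]` is a piecewise-constant estimate of p(x | ω_c).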
Optimal Bayes Classifier
Bayes rule: P(ω_i | x) = P(x | ω_i) P(ω_i) / P(x), where P(x) = Σ_j P(x | ω_j) P(ω_j).
Optimal Bayes classifier: α*(x) = argmax_i P(ω_i | x).
This classifier minimizes both the conditional error probability P(error | x) and the average error probability P(error).
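As a concrete sketch of the rule, here is a toy discrete problem; the prior and conditional tables are assumed numbers, not taken from the slides:

```python
import numpy as np

# Toy 3-class problem over a discrete input space X = {0, 1, 2, 3}
# (all probabilities below are illustrative assumptions).
priors = np.array([0.5, 0.3, 0.2])           # P(w_0), P(w_1), P(w_2)
cond = np.array([[0.4, 0.3, 0.2, 0.1],       # P(x | w_0)
                 [0.1, 0.2, 0.3, 0.4],       # P(x | w_1)
                 [0.25, 0.25, 0.25, 0.25]])  # P(x | w_2)

def bayes_classify(x):
    """Return argmax_i P(w_i | x). The evidence P(x) is the same for
    every class, so maximizing P(x | w_i) P(w_i) gives the same answer
    as maximizing the posterior."""
    return int(np.argmax(cond[:, x] * priors))

print([bayes_classify(x) for x in range(4)])  # → [0, 0, 0, 1]
```

Note that the rule never needs P(x) itself, which is why the classifier is usually written in terms of P(x | ω_i) P(ω_i).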
Exercise 1
It is given that X = R^2 and Y = {ω_1, ω_2}. The prior probability is uniform:
P(ω_1) = P(ω_2) = 1/2,
and the class-conditional probability is Gaussian:
p(x | ω_i) = N(μ_i, Σ),
where μ_1, μ_2 are the class means and Σ is a common covariance matrix.
- What is the Bayes optimal decision rule?
- What are the decision boundaries in the plane?
- Does the decision boundary for the Gaussian case always have the same form? What does it depend on?
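A minimal numeric sketch of the Gaussian case, assuming a uniform prior and equal spherical covariances Σ = σ²I (the means below are invented): under these assumptions the quadratic terms cancel and the optimal rule reduces to picking the nearest mean.

```python
import numpy as np

# Assumed class means for illustration (not from the exercise).
mu1 = np.array([0.0, 0.0])
mu2 = np.array([2.0, 2.0])

def bayes_decide(x):
    """With uniform prior and equal spherical covariances,
    log p(x | w_i) = -||x - mu_i||^2 / (2 sigma^2) + const, and sigma^2
    is shared, so maximizing the posterior means minimizing the
    squared distance to the class mean."""
    d1 = np.sum((x - mu1) ** 2)
    d2 = np.sum((x - mu2) ** 2)
    return 1 if d1 < d2 else 2

print(bayes_decide(np.array([0.5, 0.5])))  # → 1 (closer to mu1)
print(bayes_decide(np.array([3.0, 1.5])))  # → 2 (closer to mu2)
```

Under these assumptions the boundary is the perpendicular bisector of the segment between the means, i.e. a straight line; with unequal covariance matrices the quadratic terms no longer cancel and the boundary becomes a quadric, so its form depends on the covariances (and non-uniform priors shift it).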
Exercise 2
It is given that the input space is binary vectors of length d, i.e. X = {0, 1}^d. The output space is Y = {ω_1, ω_2}, with a general prior P(ω_1) = 1 - P(ω_2). We define the per-coordinate class-conditional probabilities p_i = P(x_i = 1 | ω_1) and q_i = P(x_i = 1 | ω_2). In addition, each coordinate is statistically independent of all the other coordinates.
- What is the optimal decision rule?
- What happens if p_i = q_i for some i?
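The independence assumption makes log P(x | ω_j) a sum of per-coordinate terms, so the log-likelihood-ratio rule is linear in x. A sketch with assumed parameter values (d = 4; the p_i, q_i, and prior below are made up for illustration):

```python
import numpy as np

# Assumed parameters: p[i] = P(x_i = 1 | w_1), q[i] = P(x_i = 1 | w_2).
p = np.array([0.9, 0.8, 0.7, 0.6])
q = np.array([0.2, 0.3, 0.4, 0.5])
prior1 = 0.5  # P(w_1); P(w_2) = 1 - prior1

def naive_bayes_decide(x):
    """Decide w_1 iff the posterior log-ratio is positive. By
    independence it is linear in x: sum_i w[i] * x[i] + w0 > 0."""
    w = np.log(p / q) - np.log((1 - p) / (1 - q))  # per-coordinate weights
    w0 = np.sum(np.log((1 - p) / (1 - q))) + np.log(prior1 / (1 - prior1))
    return 1 if x @ w + w0 > 0 else 2

print(naive_bayes_decide(np.array([1, 1, 1, 1])))  # → 1
print(naive_bayes_decide(np.array([0, 0, 0, 0])))  # → 2
```

This also answers the second question: if p_i = q_i for some i, the weight w[i] is zero, so that coordinate carries no class information and simply drops out of the rule.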
Exercise 3
Two classes are given, with uniform prior. In addition, the class-conditional distributions p(x | ω_1) and p(x | ω_2) are given. What is the Bayes optimal decision rule?