
1 Digital Image Processing Lecture 24: Object Recognition June 13, 2005 Prof. Charlene Tsai *From Gonzalez Chapter 12

2 Terminology
A pattern: an arrangement of descriptors (those discussed in the previous two lectures), e.g. a vector of descriptor values x = (x_1, x_2, ..., x_n)^T.
A feature: another name for a descriptor in pattern recognition.
A pattern class ω_j: a family of patterns that share some common properties.

3 Example
(Figure: samples plotted by petal length vs. petal width.)
Is the feature selection good enough?

4 Decision-Theoretic Methods
Assuming W classes ω_1, ω_2, ..., ω_W, we want to find W decision functions d_1(x), d_2(x), ..., d_W(x) with the property that if pattern x belongs to class ω_i, then
  d_i(x) > d_j(x)   for j = 1, 2, ..., W; j ≠ i.
The decision boundary separating classes ω_i and ω_j is the set of x for which
  d_ij(x) = d_i(x) - d_j(x) = 0.
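As a minimal illustration of this decision rule (not part of the original slides), the Python sketch below assigns a pattern to the class whose decision function is largest; the two linear functions d1 and d2 are hypothetical placeholders.

```python
import numpy as np

def classify(x, decision_functions):
    """Return the index of the class whose decision function d_j(x) is largest."""
    scores = [d(x) for d in decision_functions]
    return int(np.argmax(scores))

# Two hypothetical linear decision functions for a 2-D pattern x = (x1, x2).
d1 = lambda x: x[0] + x[1] - 2.0
d2 = lambda x: -x[0] + 2.0 * x[1] + 1.0

x = np.array([3.0, 1.0])
print(classify(x, [d1, d2]))  # d1(x) = 2.0 > d2(x) = 0.0, so x goes to class 1 (index 0)
```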

5 Common Approaches
Matching: minimum distance classifier; matching by correlation (skipped).
Optimum statistical classifiers: Bayes classifier for Gaussian pattern classes.
Neural networks.

6 Matching: Minimum Distance Classifier
Techniques based on matching represent each class by a prototype pattern vector. An unknown pattern is assigned to the class to which it is closest in terms of a predefined metric.
For the minimum distance classifier (MDC), the metric is the Euclidean distance.

7 MDC
The prototype of each pattern class is the mean vector of its training patterns:
  m_j = (1/N_j) Σ_{x ∈ ω_j} x,   j = 1, 2, ..., W,
where N_j is the number of patterns in class ω_j.
The distance metric is the Euclidean distance:
  D_j(x) = ||x - m_j||,   where ||a|| = (a^T a)^{1/2} is the Euclidean norm.
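A minimal sketch of these two steps (not from the original slides; the toy samples and variable names are made up for illustration):

```python
import numpy as np

def class_means(samples_by_class):
    """Prototype m_j: the mean vector of each class's training samples."""
    return [np.mean(samples, axis=0) for samples in samples_by_class]

def distances(x, means):
    """Euclidean distances D_j(x) = ||x - m_j|| from pattern x to every prototype."""
    return [float(np.linalg.norm(x - m)) for m in means]

# Toy 2-D patterns (e.g. petal length, petal width) for two classes.
class1 = np.array([[4.0, 1.2], [4.5, 1.4], [4.4, 1.3]])
class2 = np.array([[1.4, 0.2], [1.6, 0.4], [1.5, 0.3]])

means = class_means([class1, class2])
print(distances(np.array([4.2, 1.1]), means))  # the smallest distance identifies the class
```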

8 MDC
Assign x to class ω_j if D_j(x) is the smallest distance. Choosing the smallest D_j(x) is equivalent to choosing the largest value of the decision function
  d_j(x) = x^T m_j - (1/2) m_j^T m_j,   j = 1, 2, ..., W.
The decision boundary between classes ω_i and ω_j becomes
  d_ij(x) = d_i(x) - d_j(x) = x^T (m_i - m_j) - (1/2)(m_i - m_j)^T (m_i + m_j) = 0.
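The sketch below is one way to implement this linear form of the MDC (an added illustration, not the lecture's own code; the sample means are the ones shown on the next slide):

```python
import numpy as np

def d(x, m):
    """MDC decision function d_j(x) = x^T m_j - 0.5 * m_j^T m_j."""
    return x @ m - 0.5 * (m @ m)

def mdc_classify(x, means):
    """Assign x to the class with the largest decision-function value."""
    return int(np.argmax([d(x, m) for m in means]))

m1 = np.array([4.3, 1.3])
m2 = np.array([1.5, 0.3])
x = np.array([3.0, 1.0])

print(d(x, m1) - d(x, m2))        # d_12(x) > 0 puts x on the class-1 side of the boundary
print(mdc_classify(x, [m1, m2]))  # 0, i.e. class 1
```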

9 MDC: Decision Boundary
The boundary d_ij(x) = 0 is the perpendicular bisector of the line segment joining m_i and m_j.
In 2D the bisector is a line; in 3D it is a plane.
(Figure: two classes with means m_1 = (4.3, 1.3)^T and m_2 = (1.5, 0.3)^T, separated by the bisector.)
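Working this out for the two means shown (a worked example added here, not on the original slide): with m_1 = (4.3, 1.3)^T and m_2 = (1.5, 0.3)^T,
  d_1(x) = 4.3 x_1 + 1.3 x_2 - 10.09
  d_2(x) = 1.5 x_1 + 0.3 x_2 - 1.17
so the boundary is
  d_12(x) = d_1(x) - d_2(x) = 2.8 x_1 + 1.0 x_2 - 8.92 = 0,
a straight line in the (x_1, x_2) plane; points with d_12(x) > 0 fall on the ω_1 side and points with d_12(x) < 0 on the ω_2 side.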

10 Comments
The MDC is the simplest matching method: a class is described only by its mean vector.
It works well when there is large separation between the class means and relatively small spread within each class.
Unfortunately, we don't often encounter this scenario in practice.

11 Quiz
Q1: Compute the decision functions of a minimum distance classifier for the pattern classes shown on the next slide.
Q2: Compute and sketch the decision surfaces implemented by the decision functions in Q1.

12 (Figure for the quiz: class means m_1 = (4.3, 1.3)^T, m_2 = (1.5, 0.3)^T, and a third mean at (5.5, 2.1)^T.)

