
Lecture 20 Object recognition I


1 Lecture 20 Object recognition I
Pattern and pattern classes
Classifiers based on Bayes Decision Theory
Recognition based on decision-theoretic methods
Optimum statistical classifiers
Pattern recognition with MATLAB

2 Patterns and Pattern classes
A pattern is an arrangement of descriptors (features). Three commonly used pattern arrangements:
Vectors
Strings
Trees
A pattern class is a family of patterns that share some common properties. Pattern recognition assigns a given pattern to its respective class.

3 Example 1 Represent flower petals by the features width and length
The three types of iris flowers then fall into different pattern classes

4 Example 2 Use a signature as the pattern vector

5 Example 3 Represent a pattern by a string

6 Example 4 Represent a pattern by a tree

7 2. Classifiers based on Bayesian Decision Theory
A fundamental statistical approach: assume the relevant probabilities are known, compute the probability of the observed event, and then make the optimal decision.
Bayes' theorem: P(A|B) = P(B|A) P(A) / P(B)
Example: Suppose that at Laurier 50% of the students are girls and 30% are science students, and that among science students 20% are girls. If one meets a girl student at Laurier, what is the probability that she is a science student?
Let B denote girl students and A science students. Then
P(A|B) = P(B|A) P(A) / P(B) = (0.2)(0.3) / 0.5 = 0.12
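A quick numerical check of this example in MATLAB (the variable names are ours, chosen for readability):
P_B = 0.5;                 % P(B): a student is a girl
P_A = 0.3;                 % P(A): a student is in science
P_B_given_A = 0.2;         % P(B|A): a science student is a girl
P_A_given_B = P_B_given_A * P_A / P_B   % Bayes' rule: 0.2*0.3/0.5 = 0.12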

8 Bayes theory Given x ∈ R^l and a set of classes ωi, i = 1, 2, …, c, Bayes' theorem states that
P(ωi | x) = p(x | ωi) P(ωi) / p(x), where p(x) = Σj p(x | ωj) P(ωj),
in which P(ωi) is the a priori probability of class ωi, i = 1, 2, …, c; P(ωi | x) is the a posteriori probability of class ωi given the value of x; p(x) is the probability density function (pdf) of x; and p(x | ωi), i = 1, 2, …, c, is the class-conditional pdf of x given ωi (sometimes called the likelihood of ωi with respect to x).

9 Bayes classifier Let x ≡ [x(1), x(2), …, x(l)]^T ∈ R^l be the feature vector of a pattern, resulting from some measurements, and let the number of possible classes be c, that is, ω1, …, ωc. Bayes decision theory: x is assigned to the class ωi if
P(ωi | x) > P(ωj | x) for every j ≠ i
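A minimal numeric sketch of this rule in MATLAB (all numbers are illustrative, not from the lecture):
Pw   = [0.5 0.3 0.2];      % a priori probabilities P(w_i)
lik  = [0.05 0.20 0.10];   % class-conditional pdf values p(x|w_i) at the observed x
post = lik .* Pw;          % proportional to P(w_i|x); the common factor p(x) is dropped
[~, i] = max(post)         % assign x to the class w_i with the largest a posteriori probability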

10 Multidimensional Gaussian PDF
p(x) = (2π)^(−l/2) |S|^(−1/2) exp(−(1/2)(x − m)^T S^(−1) (x − m)), with mean vector m and covariance matrix S
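A minimal sketch of this pdf as a MATLAB function (our own helper, written to match the comp_gauss_dens_val calls used on the following slides; save as gauss_pdf.m):
function p = gauss_pdf(m, S, x)
% GAUSS_PDF  Value of the l-dimensional Gaussian pdf N(m, S) at point x.
l = length(m);                   % dimensionality
v = x - m;                       % deviation from the mean
p = exp(-0.5 * (v' / S) * v) / sqrt((2*pi)^l * det(S));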

11 Example Consider a 2-class classification task in the 2-dimensional space, where the data in both classes, ω1, ω2, are distributed according to the Gaussian distributions N(m1, S1) and N(m2, S2), respectively. Let
m1 = [1, 1]^T, m2 = [3, 3]^T, S1 = S2 = I (the 2 × 2 identity matrix).
Assuming that P(ω1) = P(ω2) = 1/2, classify x = [1.8, 1.8]^T into ω1 or ω2.

12 Solution The resulting values p1 = 0.042, p2 = 0.0189
m1=[1 1]'; m2=[3 3]'; S=eye(2); x=[ ]'; p1=P1*comp_gauss_dens_val(m1,S,x); p2=P2*comp_gauss_dens_val(m2,S,x); The resulting values p1 = , p2 = According to the Bayesian classifier, x is assigned to ω1

13 Decision-theoretic methods
Decision (discriminant) functions d1(x), …, dc(x): assign x to class ωi if di(x) > dj(x) for all j ≠ i
Decision boundary between classes ωi and ωj: the surface where di(x) = dj(x)

14 Minimum distance classifier
Each class ωj is represented by its mean vector mj. Compute Dj(x) = ||x − mj|| and assign x to the class with the smallest distance; equivalently, use the decision function dj(x) = x^T mj − (1/2) mj^T mj and pick the largest.
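A minimal sketch of such a classifier in MATLAB (our own version, in the spirit of the euclidean_classifier function used in the later example; save as min_dist_classifier.m):
function z = min_dist_classifier(m, x)
% MIN_DIST_CLASSIFIER  Index of the class whose mean (a column of m) is
% nearest to x in Euclidean distance.
c = size(m, 2);                  % number of classes
d = zeros(c, 1);
for j = 1:c
    d(j) = norm(x - m(:, j));    % Euclidean distance to the j-th mean
end
[~, z] = min(d);                 % nearest class wins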

15 Example

16 Minimum Mahalanobis distance classifiers
The Mahalanobis distance from x to a class mean mj is dM(x, mj) = ((x − mj)^T S^(−1) (x − mj))^(1/2), where S is the covariance matrix; x is assigned to the class with the smallest dM.

17 Example
x=[0.1 0.5 0.1]'; m1=[0 0 0]'; m2=[0.5 0.5 0.5]'; m=[m1 m2];
z1=euclidean_classifier(m,x)
S=[ ; ; ];   % 3x3 covariance matrix (entries not shown)
z2=mahalanobis_classifier(m,S,x);
The results are z1 = 1 and z2 = 2: the Euclidean classifier assigns x to ω1, while the Mahalanobis classifier assigns it to ω2.
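A minimal sketch of the Mahalanobis classifier itself (our own version of what mahalanobis_classifier presumably computes; save as maha_classifier.m):
function z = maha_classifier(m, S, x)
% MAHA_CLASSIFIER  Index of the class mean (column of m) nearest to x in
% Mahalanobis distance, with common covariance matrix S.
c = size(m, 2);
d = zeros(c, 1);
for j = 1:c
    v = x - m(:, j);
    d(j) = sqrt(v' * (S \ v));   % Mahalanobis distance to the j-th mean
end
[~, z] = min(d);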

18 4. Matching by correlation
Given an m × n template w(s, t) (or mask), find the m × n subimage of f(x, y) that best matches w, i.e., the one with the largest correlation.

19 Correlation theorem The correlation of f(x, y) and w(x, y) corresponds to the product F(u, v) W*(u, v) in the frequency domain, so it can be computed via the FFT:
[M, N] = size(f);
F = fft2(f);                  % transform of the image
W = conj(fft2(w, M, N));      % conjugate transform of the zero-padded template
g = real(ifft2(W .* F));      % correlation image
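The position of the best match is then the peak of the correlation image; a short continuation of the sketch above (the names gmax, y0, x0 are ours):
gmax = max(g(:));             % largest correlation value
[y0, x0] = find(g == gmax);   % row/column of the best-matching location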

20

21 Example

22 Case study: Optical character recognition (OCR)
Preprocessing: digitization, binarization; noise elimination, thinning, normalization
Feature extraction (by character, word part, or word): segmentation (explicit or implicit); detection of major features (top-down approach)
Matching: recognition of characters; context verification from a knowledge base
Understanding and action
See the reference

23 Example

24 3. Optimum statistical classifiers

25 Bayes classifier for Gaussian pattern classes
Consider two pattern classes with Gaussian distributions

26 N-dimensional case For n-dimensional Gaussian pattern classes with mean vectors mj and covariance matrices Cj, taking logarithms and dropping constant terms gives the Bayes decision function
dj(x) = ln P(ωj) − (1/2) ln |Cj| − (1/2)(x − mj)^T Cj^(−1) (x − mj)
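A minimal sketch of this decision function in MATLAB (our own helper; evaluate it for every class and assign x to the class with the largest value):
function d = gauss_decision(x, m, C, Pw)
% GAUSS_DECISION  Bayes decision function for a Gaussian class with mean m,
% covariance C, and a priori probability Pw, evaluated at x.
v = x - m;
d = log(Pw) - 0.5 * log(det(C)) - 0.5 * (v' / C) * v;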

27 Example

28 A real example

29

30 Linear classifier Two classes; f(x) = w^T x + w0, and f(x) = 0 is the separating hyperplane
How to obtain the coefficients, or weights, wi? By the perceptron algorithm

31 How to obtain the coefficients, or weights, wi?

32 The Online Form of the Perceptron Algorithm
Present the training samples xi with labels yi ∈ {−1, +1} one at a time, cycling through the set; whenever the current w misclassifies a sample, i.e., yi (w^T xi) ≤ 0, update w ← w + ρ yi xi, where ρ is the learning-rate parameter, and repeat until every sample is classified correctly.
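A minimal sketch of the online perceptron in MATLAB (our own version; X holds one training vector per column with a 1 appended for the bias term, y holds labels in {-1, +1}, rho is the learning rate; assumes the classes are linearly separable):
function w = online_perceptron(X, y, rho, max_epochs)
% ONLINE_PERCEPTRON  Learn a separating weight vector with the online rule.
[l, N] = size(X);
w = zeros(l, 1);
for epoch = 1:max_epochs
    mistakes = 0;
    for i = 1:N
        if y(i) * (w' * X(:, i)) <= 0       % sample i is misclassified
            w = w + rho * y(i) * X(:, i);   % online update
            mistakes = mistakes + 1;
        end
    end
    if mistakes == 0, break; end            % all samples correct: done
end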

33 The Multiclass LS Classifier
Each class ωi gets its own weight vector wi, estimated by least squares. The classification rule is now as follows: Given x, classify it to class ωi if wi^T x > wj^T x for all j ≠ i.
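A minimal sketch of training and applying such a classifier in MATLAB (our own version; X holds one training vector per column, labels is a 1-by-N row vector with values 1, …, c; each function goes in its own file):
function W = multiclass_ls_train(X, labels, c)
% MULTICLASS_LS_TRAIN  One weight vector per class (the rows of W), fitted
% by least squares against one-hot targets.
N = size(X, 2);
Y = zeros(c, N);
Y(sub2ind([c N], labels, 1:N)) = 1;   % one-hot target matrix
W = Y * pinv(X);                      % least-squares solution of W*X = Y

function i = multiclass_ls_classify(W, x)
% MULTICLASS_LS_CLASSIFY  Class index with the largest w_i' x.
[~, i] = max(W * x);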

