
1 Chapter 4 Pattern Recognition Concepts continued

2 Using features
Ex. Recognizing characters:
Area
Height
Width
# of holes
# of strokes
Centroid
Best axis (of least inertia)
Second moments (about axis of least and most inertia)
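
A minimal sketch of how a few of these features might be computed from a binary character image. The function name and the use of scipy.ndimage here are illustrative assumptions, not code from the textbook; stroke counting is omitted because it needs a thinning step.

```python
import numpy as np
from scipy import ndimage

def basic_features(img):
    """Simple region features from a binary character image
    (1 = object pixel, 0 = background)."""
    rows, cols = np.nonzero(img)
    area = len(rows)                          # number of object pixels
    height = rows.max() - rows.min() + 1      # bounding-box height
    width = cols.max() - cols.min() + 1       # bounding-box width
    r_bar, c_bar = rows.mean(), cols.mean()   # centroid

    # Second moments about the centroid (row/column directions).
    mu_rr = ((rows - r_bar) ** 2).mean()
    mu_cc = ((cols - c_bar) ** 2).mean()
    mu_rc = ((rows - r_bar) * (cols - c_bar)).mean()

    # Axis of least inertia; the exact angle convention varies by source.
    theta = 0.5 * np.arctan2(2 * mu_rc, mu_cc - mu_rr)

    # Holes: connected components of the background, minus the single
    # component that touches the (padded) border. Connectivity choice
    # for the background is a simplification here.
    padded = np.pad(1 - img, 1, constant_values=1)
    _, n_bg = ndimage.label(padded)
    n_holes = n_bg - 1

    return dict(area=area, height=height, width=width,
                centroid=(r_bar, c_bar), axis_angle=theta,
                holes=n_holes, mu_rr=mu_rr, mu_cc=mu_cc)
```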

3 Decision tree

4 [Figure: decision tree example]

5 [Figure: decision tree example, continued]
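
In the spirit of the tree pictured on slides 4-5, here is a hand-built decision tree over the features above. The split order, thresholds, and leaf classes are invented for illustration only.

```python
def classify_char(f):
    """Toy decision tree over the feature dict from basic_features().
    Thresholds and leaf classes are illustrative assumptions."""
    if f["holes"] == 2:
        return "8"                              # two holes: e.g. 8 or B
    if f["holes"] == 1:
        return "A" if f["width"] <= f["height"] else "D"
    # No holes: separate a thin slash from a fuller glyph by fill ratio.
    if f["area"] < 0.2 * f["height"] * f["width"]:
        return "/"
    return "X"
```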

6 Classifying using nearest class mean
Some problems are more "fuzzy" and can't be solved using simple decision trees. To classify a candidate, c, compute its distance from all known class means and assign it to the class of the nearest mean.
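
A minimal sketch of the nearest-class-mean rule with NumPy; the variable names are assumptions.

```python
import numpy as np

def nearest_mean_classify(x, class_means, class_labels):
    """Assign feature vector x to the class whose mean is closest
    in Euclidean distance."""
    dists = [np.linalg.norm(x - m) for m in class_means]
    return class_labels[int(np.argmin(dists))]

# Means come from labeled training vectors, e.g.:
# class_means = [X_train[y_train == c].mean(axis=0) for c in class_labels]
```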

7 Classifying using nearest class mean (with good results)

8 Euclidean distance (and scaled Euclidean distance)
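
The formulas this slide presumably showed, in standard notation: plain Euclidean distance from a feature vector x to a class mean, and a scaled variant that divides each feature difference by that feature's standard deviation so no single feature dominates. Treating sigma_i as the per-feature standard deviation is an assumption consistent with the usual textbook definition.

```latex
d(\mathbf{x}, \boldsymbol{\mu})
  = \lVert \mathbf{x} - \boldsymbol{\mu} \rVert
  = \sqrt{\sum_{i=1}^{d} (x_i - \mu_i)^2}
\qquad
d_s(\mathbf{x}, \boldsymbol{\mu})
  = \sqrt{\sum_{i=1}^{d} \left( \frac{x_i - \mu_i}{\sigma_i} \right)^2}
```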

9 Other distance measures
from http://www.molmine.com/magma/analysis/distance.htm

10 What is a distance metric? A function g with 3 properties:
1. g(x,y) = g(y,x) (symmetric)
2. g(x,x) = 0, and g(x,y) = 0 implies x = y
3. g(x,y) + g(y,z) >= g(x,z) (triangle inequality)
from http://mathworld.wolfram.com/Metric.html

11 Classifying using nearest class mean (with poor results)

12 Classifying using nearest neighbor
To classify a candidate, c, compute its distance to all members of all known classes and assign it to the class of the nearest element.
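
A sketch of the nearest-neighbor rule; unlike the class-mean rule above, every training example is kept, and the label of the single closest one wins. Names are illustrative.

```python
import numpy as np

def nearest_neighbor_classify(x, X_train, y_train):
    """Assign x the label of its nearest training example (1-NN)."""
    dists = np.linalg.norm(X_train - x, axis=1)   # distance to every example
    return y_train[int(np.argmin(dists))]
```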

13 [Figure: nearest-neighbor classification example]

14 Precision vs. recall
Say we have an image db we wish to query: "Show me all images of tanks."
Precision = # of relevant documents retrieved divided by the total number of documents retrieved.
Precision = TP / (TP+FP)
PPV = positive predictive value: the probability that the target is actually present when the observer says that it is present.

15 Precision vs. recall
Say we have an image db we wish to query: "Show me all images of tanks."
Recall = # of relevant documents retrieved divided by the total number of relevant documents in the db.
Recall = TP / (TP+FN) = TPF
NPV = negative predictive value = TN / (TN+FN): the probability that the target is actually absent when the observer says that it is absent.
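
The quantities from these two slides, computed from raw counts; a minimal sketch, with the example numbers in the comment invented.

```python
def retrieval_metrics(tp, fp, tn, fn):
    """Precision, recall, and NPV from confusion counts."""
    precision = tp / (tp + fp)   # PPV: P(target present | called present)
    recall    = tp / (tp + fn)   # TPF: fraction of relevant docs retrieved
    npv       = tn / (tn + fn)   # P(target absent | called absent)
    return precision, recall, npv

# Invented example: a 1000-image db with 60 tanks; the query returns
# 50 images, 40 of them tanks:
print(retrieval_metrics(tp=40, fp=10, tn=930, fn=20))
# (0.8, 0.666..., 0.978...)
```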

16 Structural (pattern recognition) techniques
Simple features may not be enough for recognition, so structural techniques use relationships between these primitive features.

17 Same bounding box, holes, strokes, centroid, 2nd moments in row and column directions, and similar major axis direction.

18 bay = intrusion of background

19 lid = virtual line segment that closes the bay

20 Structural graphs
a graph G = (V, E) where V is a vertex set and E is an edge set
from http://mathworld.wolfram.com/Graph.html

21 Structural graphs
a graph G = (V, E) where V is a vertex set and E is an edge set
V:
S = side
L = lake
B = bay
E:
CON = connection of 2 strokes
ADJ = stroke region is immediately adjacent to a lake or bay region
ABOVE = 1 hole (lake or bay) lies above another

22 [Figure: structural graph example]

23 Structural graph conclusions
Graph-matching techniques can be used for recognition. Or we can count occurrences of relationships and use these counts as a feature vector for statistical PR, as sketched below.
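
A sketch of the second option: encode a character's structural graph as labeled edges and count occurrences of each relationship type. The graph for 'A' below is an illustrative guess, not taken from the slides.

```python
from collections import Counter

# A structural graph as a list of labeled edges, using the labels
# from slide 21. This graph for 'A' is an illustrative guess.
graph_A = [
    ("CON",   "S1", "S2"),   # the two slanted strokes meet at the apex
    ("CON",   "S1", "S3"),   # crossbar connects to the left stroke
    ("CON",   "S2", "S3"),   # crossbar connects to the right stroke
    ("ADJ",   "S3", "L1"),   # crossbar borders the enclosed lake
    ("ABOVE", "L1", "B1"),   # the lake lies above the bay underneath
]

def relation_counts(graph, relations=("CON", "ADJ", "ABOVE")):
    """Count each relationship type; the counts serve as a feature
    vector for statistical pattern recognition."""
    c = Counter(rel for rel, _, _ in graph)
    return [c[r] for r in relations]

print(relation_counts(graph_A))   # [3, 1, 1]
```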

24 Structural graph homework
Create structural graphs for the following characters: X 8 C 6
Email your answers to me with the subject, "structural."

25 Confusion matrix

26 Empirical error rate = % misclassified
Empirical reject rate = % rejected

27 Empirical error rate = % misclassified = 25/1000 overall (does not include rejects); 5/100 for 9s
Empirical reject rate = % rejected = 7/1000
Can ROC analysis be applied?
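
A sketch of how these rates come out of a confusion matrix. The small 3-class matrix below is invented, not the 10-digit matrix from slide 25, and treating rejects as part of the total but not as errors is an assumption consistent with the 25/1000 and 7/1000 figures above.

```python
import numpy as np

def error_and_reject_rates(conf, rejects):
    """conf[i, j] = # of class-i inputs classified as class j;
    rejects[i] = # of class-i inputs the classifier refused to label."""
    n_total = conf.sum() + rejects.sum()
    n_wrong = conf.sum() - np.trace(conf)   # off-diagonal = misclassified
    # Rejects count toward the total but are not counted as errors.
    return n_wrong / n_total, rejects.sum() / n_total

# Invented 3-class example:
conf = np.array([[97,  2,  1],
                 [ 3, 94,  3],
                 [ 0,  4, 96]])
rejects = np.array([0, 2, 1])
print(error_and_reject_rates(conf, rejects))   # (~0.043, ~0.0099)
```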

28 Recall ROC analysis
TP = true positive = present and detected
TN = true negative = not present and not detected
FP = false positive = not present but detected
FN = false negative = present but not detected
True positive fraction: TPF = TP / (TP+FN) (true abnormals called abnormal by the observer)
False positive fraction: FPF = FP / (FP+TN)

29-32 Ex. ROC analysis for the classification of '3'
TP = '3' present and '3' detected
TN = not present and not detected
FP = not present but detected
FN = present but not detected
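
A sketch of reducing the multi-class digit problem to the binary TP/TN/FP/FN counts above, treating '3' as the positive class; the labels and predictions are invented.

```python
def binary_counts(y_true, y_pred, positive="3"):
    """Reduce multi-class labels to TP/TN/FP/FN for one target class."""
    pairs = list(zip(y_true, y_pred))
    tp = sum(t == positive and p == positive for t, p in pairs)
    tn = sum(t != positive and p != positive for t, p in pairs)
    fp = sum(t != positive and p == positive for t, p in pairs)
    fn = sum(t == positive and p != positive for t, p in pairs)
    return tp, tn, fp, fn

# Invented example: digits with a few mistakes between '3' and '8'.
y_true = ["3", "8", "3", "5", "3", "8"]
y_pred = ["3", "3", "3", "5", "8", "8"]
print(binary_counts(y_true, y_pred))   # (2, 2, 1, 1)
```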

33 Skip remainder of Chapter 4


