Digital Image Processing & Pattern Analysis (CSCE 563)
Introduction to Pattern Analysis
Prof. Amr Goneid
Department of Computer Science & Engineering
The American University in Cairo
Introduction to Pattern Analysis
Patterns & Pattern Analysis
Pattern Classes
Minimum Distance (Nearest Neighbor) Classifiers
Structural Matching Classifiers
Information Measure Classifiers
Learning-Based Classifiers
1. Patterns & Pattern Analysis
Patterns are object descriptors.
The objective of pattern analysis is to classify objects into distinct and separable classes.
A pattern consists of a finite set of "features" or "attributes"; such a set is usually called a "Feature Vector".
The "Dimensionality" of a pattern is the number (n) of features in the feature vector.
Patterns & Pattern Analysis
Features can be: quantitative or qualitative, numerical or binary, ordinal or non-ordinal.
The process of mapping object descriptors to a Feature Vector is called "Feature Extraction".
The process of reducing the number of attributes while maximizing the separability between classes is called "Dimensionality Reduction".
Feature Vector
(Diagram) object → Mapper (Feature Extraction) → k-dimensional Feature Vector → Dimensionality Reduction → n-dimensional Feature Vector X = (x_1 x_2 … x_n)^T, with n < k
Feature Vector (Example)
(Diagram) Boundary Pixels → Boundary Fourier Descriptors → k-dimensional Feature Vector → Sort & Truncate → n-dimensional Feature Vector X = (x_1 x_2 … x_n)^T, with n < k
Dimensionality Reduction
Removal of features with insignificant weights (ordering or sorting, then truncation)
Removal of non-orthogonal features (discovering association)
Selecting features that maximize the ratio of inter-class to intra-class variance: Discriminant Analysis, PCA
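To make the last point concrete, here is a minimal sketch of PCA-based dimensionality reduction using NumPy. The function name pca_reduce, the sample data, and the chosen dimensions are illustrative assumptions, not part of the course material.

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project k-dimensional feature vectors onto the top n_components
    principal directions (directions of maximum variance).

    X : (num_samples, k) array of feature vectors, one per row.
    Returns the (num_samples, n_components) reduced feature vectors.
    """
    # Center the data so the principal directions pass through the mean
    X_centered = X - X.mean(axis=0)
    # SVD of the centered data; the rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    # Keep only the leading n_components directions (n < k)
    return X_centered @ Vt[:n_components].T

# Example: reduce hypothetical 10-dimensional feature vectors to 3 dimensions
features = np.random.rand(50, 10)
reduced = pca_reduce(features, 3)      # shape (50, 3)
```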
2. Pattern Classes
Classifiers
(Diagram) A classifier maps an input feature vector x to one of the pattern classes w_1, w_2, …, w_M
Feature Space & Separability
(Diagrams of classes w_1, w_2 in feature space)
Feature Space Clusters (Classes)
Linearly Separable Classes
Non-Separable Classes
3. Minimum Distance (Nearest Neighbor) Classifiers
Distance Measures: search for the minimum distance to a class "center" ("Nearest Neighbor")
(Diagram) An unknown pattern x is assigned to the class (w_1 or w_2) whose center (*) is nearer
Class Means
n features: x_1, x_2, …, x_n
Feature vector of unknown pattern: X
M pattern classes: w_1, w_2, …, w_M
Sample patterns for class (j): L = 1, 2, …, N_j, with sample feature vectors S_jL
Class mean for class (j): m_j = (1/N_j) Σ_L S_jL = (m_1 m_2 … m_n)_j^T
Euclidean Distance
For the unknown pattern X, the squared Euclidean distance from the "center" of class w_j is:
D_j^2(X) = || X – m_j ||^2 = (X – m_j)^T (X – m_j) = X^T X – 2 { X^T m_j – (1/2) m_j^T m_j }
The "Affinity Function" a_j(X) = X^T m_j – (1/2) m_j^T m_j changes with class (j); minimizing D_j(X) is equivalent to choosing the class for which a_j(X) is maximum.
Decision Boundary
The decision boundary between class w_i and class w_j is:
a_ij(X) = a_i(X) – a_j(X) = 0
If a_ij(X) > 0, pattern X belongs to class w_i
If a_ij(X) < 0, pattern X belongs to class w_j
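Putting the class means, the affinity function, and the decision rule together, here is a minimal sketch of a minimum-distance classifier in NumPy. The sample data, class labels, and function names are hypothetical.

```python
import numpy as np

def class_means(samples):
    """samples: dict mapping class label -> (N_j, n) array of sample feature vectors.
    Returns a dict of class means m_j."""
    return {label: S.mean(axis=0) for label, S in samples.items()}

def affinity(x, m):
    """Affinity function a_j(X) = X^T m_j - (1/2) m_j^T m_j."""
    return x @ m - 0.5 * (m @ m)

def classify(x, means):
    """Assign x to the class whose affinity is maximum
    (equivalently, whose mean is nearest in Euclidean distance)."""
    return max(means, key=lambda label: affinity(x, means[label]))

# Hypothetical two-class example (features: weight in kg, height in cm)
samples = {
    "w1": np.array([[55.0, 170.0], [60.0, 175.0], [58.0, 172.0]]),
    "w2": np.array([[115.0, 168.0], [120.0, 165.0], [125.0, 170.0]]),
}
means = class_means(samples)
print(classify(np.array([100.0, 165.0]), means))   # -> "w2"
```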
Example:
w_1: ballet dancers, w_2: weight lifters
Features: x_1 = weight in kg, x_2 = height in cm
The samples have means m_1 = ( … )^T and m_2 = ( … )^T
Unknown pattern: X = (x_1 x_2)^T
The affinity functions are:
a_1(X) = 60 x_1 + … x_2 – …
a_2(X) = 120 x_1 + … x_2 – …
Example (continued)
The decision boundary is:
a_12(X) = a_1(X) – a_2(X) = – 60 x_1 + … = 0
For the unknown pattern X:
if a_12(X) > 0, it belongs to the ballet dancers
if a_12(X) < 0, it belongs to the weight lifters
e.g. if x_1 = 100 kg and x_2 = 165 cm, then it is a weight lifter
(Plot: the boundary a_12(X) = 0 separating w_1 from w_2 in the weight-height plane)
4. Structural Matching Classifiers
String Matching
Two boundaries are coded as strings A, B. A match occurs at position k if A(k) = B(k).
M = number of matches
Number of mismatches: Q = max(length(A), length(B)) – M
Q = 0 if A and B are identical
Degree of matching: R = M / Q
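A minimal sketch of this matching measure in Python; the boundary code strings and the function name match_degree are illustrative.

```python
def match_degree(A, B):
    """Degree of matching R = M / Q between two boundary code strings.

    M = number of positions k where A[k] == B[k]
    Q = max(len(A), len(B)) - M   (number of mismatches)
    R grows with similarity; it is unbounded (Q = 0) when the strings are identical.
    """
    M = sum(1 for a, b in zip(A, B) if a == b)
    Q = max(len(A), len(B)) - M
    return float("inf") if Q == 0 else M / Q

# Example with two hypothetical boundary codes
print(match_degree("aabcbd", "aabdbd"))   # M = 5, Q = 1, R = 5.0
```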
Example: Matching Polygons
Boundaries can be approximated by polygons.
Traversing anti-clockwise, we compute the interior angles between successive segments.
The angles are quantized into regions, and the quantized angles are coded by symbols.
The successive angles of a polygon are thus represented by a string of symbols that can be matched against the string of another polygon.
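A sketch of the coding step, assuming the interior angles are already available in degrees; the 45-degree bin width and the letter alphabet are arbitrary illustrative choices. The resulting strings can then be compared with the match_degree sketch above.

```python
def angles_to_string(angles_deg, bin_width=45.0):
    """Code a sequence of interior angles as a symbol string.

    Each angle (0..360 degrees) is quantized into a region of width
    bin_width and mapped to a letter ('A' for the first region,
    'B' for the second, ...).
    """
    num_regions = int(360 // bin_width)
    symbols = []
    for angle in angles_deg:
        region = int(angle // bin_width) % num_regions
        symbols.append(chr(ord('A') + region))
    return "".join(symbols)

# Two hypothetical polygons traversed anti-clockwise give the same code string
print(angles_to_string([60, 95, 60, 145]))   # -> "BCBD"
print(angles_to_string([60, 90, 60, 150]))   # -> "BCBD"
```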
5. Information Measure Classifiers (Information Content)
For binary attributes
Information content of a set of n objects with a single attribute:
a = number of objects having the attribute
(n – a) = number of objects missing the attribute
Information content (taking 0 log 0 = 0):
I = n log n – { a log a + (n – a) log (n – a) }   for n > 1
I = 0   for n = 1
Diversity & Similarity
A set of n objects, i = 1, 2, …, n, and m attributes, j = 1, 2, …, m
B_ij = 1 if object (i) has attribute (j), and zero otherwise
a_j = Σ_i B_ij = number of objects having attribute (j)
(n – a_j) = number of objects missing attribute (j)
Information content:
I(m, n, a) = m n log n – Σ_j { a_j log a_j + (n – a_j) log (n – a_j) }   for n > 1
I(m, n, a) = 0   for n = 1
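A minimal sketch of this computation with NumPy, taking 0 log 0 = 0 and using log base 2 (consistent with the n = m = 2 example on the next slide); the binary matrix is made up for illustration.

```python
import numpy as np

def xlogx(x):
    """Elementwise x * log2(x), with the convention 0 * log 0 = 0."""
    x = np.asarray(x, dtype=float)
    safe = np.where(x > 0, x, 1.0)          # avoid log(0); those terms are 0 anyway
    return x * np.log2(safe)

def info_content(B):
    """Information content I(m, n, a) of a binary object/attribute matrix.

    B : (n, m) array with B[i, j] = 1 if object i has attribute j.
    I = m*n*log n - sum_j { a_j log a_j + (n - a_j) log (n - a_j) },
    and I = 0 for n = 1 (or an empty set).
    """
    n, m = B.shape
    if n <= 1:
        return 0.0
    a = B.sum(axis=0)                       # a_j = number of objects having attribute j
    return m * n * np.log2(n) - float(np.sum(xlogx(a) + xlogx(n - a)))

# Complete diversity with n = m = 2 gives I = 4
print(info_content(np.array([[1, 0],
                             [0, 1]])))    # -> 4.0
```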
Diversity & Similarity
Let m = n
For complete similarity, a_j = n for all j:
I = n^2 log n – Σ_j { n log n + 0 } = 0   (least information content)
For complete diversity, a_j = 1 for all j:
I = n^2 log n – n (n – 1) log (n – 1) > 0   (maximum information content)
e.g. for n = m = 2, I = 4 (using log base 2)
Fusion & Division
Two object sets A, B
Objects within A are similar to each other; the same holds for B, so I_A = I_B = 0
Fusing A and B into C = A ∪ B, we gain information: ΔI = I_C – [ I_A + I_B ]
Dividing C into A and B with A ∩ B = ∅, we lose information: ΔI = I_C – [ I_A + I_B ]
Selecting Division Attributes
Dividing set C on attribute (j) produces two sets: A with n_j+ = a_j objects and B with n_j– = (n – a_j) objects.
Let I_j+ = I(m, n_j+, a) and I_j– = I(m, n_j–, a); then the gain from fusing A and B into C is:
ΔI_j = I_C – [ I_A + I_B ] = I(m, n, a) – { I_j+ + I_j– }
The division attribute (j) is the one with maximum ΔI_j.
The process is repeated for the next attribute after removing the first one.
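A minimal sketch of choosing the division attribute by maximum information gain; it reuses the info_content helper from the earlier sketch, and the object/attribute matrix is hypothetical.

```python
import numpy as np

def best_division_attribute(B):
    """Index j of the attribute that maximizes the gain
    dI_j = I(C) - [ I(A_j) + I(B_j) ], where A_j / B_j hold the objects
    having / missing attribute j. Uses info_content() from the sketch above.
    """
    n, m = B.shape
    I_C = info_content(B)
    gains = []
    for j in range(m):
        having = B[B[:, j] == 1]            # set A: objects having attribute j
        missing = B[B[:, j] == 0]           # set B: objects missing attribute j
        gains.append(I_C - (info_content(having) + info_content(missing)))
    return int(np.argmax(gains)), gains

# Hypothetical set of 4 objects with 3 binary attributes
B = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1],
              [0, 1, 0]])
print(best_division_attribute(B))
```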
Classification Tree
Recursive computation of the information gain produces a classification tree.
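As a rough illustration of the recursion, the sketch below keeps splitting the object set on the best division attribute until the remaining objects are completely similar (I = 0) or no attributes are left. It reuses info_content and best_division_attribute from the sketches above; the nested-dict tree representation is an arbitrary choice.

```python
import numpy as np

def build_tree(B, objects=None, attrs=None):
    """Recursively split the object set on the attribute with maximum
    information gain, returning a nested-dict classification tree whose
    leaves are lists of object indices."""
    if objects is None:
        objects = list(range(B.shape[0]))
    if attrs is None:
        attrs = list(range(B.shape[1]))
    sub = B[np.ix_(objects, attrs)]
    # Stop when no attributes remain or the set is completely similar (I = 0)
    if not attrs or info_content(sub) < 1e-9:
        return objects
    j_local, _ = best_division_attribute(sub)
    j = attrs[j_local]                      # map back to the original attribute index
    having = [i for i in objects if B[i, j] == 1]
    missing = [i for i in objects if B[i, j] == 0]
    remaining = [a for a in attrs if a != j]
    return {f"attribute {j}": {1: build_tree(B, having, remaining),
                               0: build_tree(B, missing, remaining)}}

print(build_tree(np.array([[1, 1, 0],
                           [1, 0, 1],
                           [0, 1, 1],
                           [0, 1, 0]])))
```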
6. Learning-Based Classifiers (Neural Networks)
Neurons
Learning
Feed-Forward NN
Backpropagation NN
Other NN