
Slide 1: Pattern Recognition: Statistical and Neural. Lonnie C. Ludeman. Lecture 24, Nov 2, 2005. Nanjing University of Science & Technology.

Slide 2: Lecture 24 Topics
1. Review and motivation for link structure
2. Present the Functional Link Artificial Neural Network
3. Simple example: design using an ANN and a FLANN
4. Performance for neural network designs
5. Radial Basis Function neural networks
6. Problems, advantages, disadvantages, and promise of artificial neural network design

Slide 3 (Review 1): Generalized Linear Discriminant Functions. [Figure: the input x feeds nonlinear functions g_1(x), g_2(x), ..., g_j(x), ..., g_M(x); their outputs are weighted by w_1, w_2, ..., w_j, ..., w_M and summed to give g(x) = ∑_{j=1}^{M} w_j g_j(x).]

Slide 4 (Review 2): Patterns are linearly separated in the 3-dimensional space. [Figure: separating plane.]

Slide 5 (Review 3): Example: decision rule using one nonlinear discriminant function g(x). Given the following g(x) and decision rule, illustrate the decision regions R_1 and R_2 where we respectively classify as C_1 and C_2.

Slide 6 (Review 4): Solution. [Figure: in region R_1, decide C_1; in region R_2, everywhere else, decide C_2.]

Slide 7 (Review 5): Find a generalized linear discriminant function that separates the classes.
Solution: d(x) = w_1 f_1(x) + w_2 f_2(x) + w_3 f_3(x) + w_4 f_4(x) + w_5 f_5(x) + w_6 f_6(x) = w^T f(x), which is linear in the f space.
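A minimal sketch of evaluating d(x) = w^T f(x) in code. The slide's actual basis functions f_k(x) and weights were given graphically; here a quadratic expansion of a 2-dimensional pattern and arbitrary weight values are assumed purely for illustration.

```python
import numpy as np

# Sketch of d(x) = w^T f(x). The six basis functions are assumed to be the
# quadratic expansion of a 2-dimensional pattern; the actual f_k(x) and the
# weights on the slide may differ.
def f(x):
    x1, x2 = x
    return np.array([1.0, x1, x2, x1**2, x1 * x2, x2**2])

def d(x, w):
    return w @ f(x)

w = np.array([-1.0, 0.0, 0.0, 1.0, 0.0, 1.0])   # hypothetical weights
x = np.array([0.5, -0.25])
print(d(x, w), "C1" if d(x, w) > 0 else "C2")
```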

Slide 8 (Review 6): where, in the original pattern space, d(x) is nonlinear.

Slide 9 (Review 7): Decision boundary in the original pattern space. [Figure: the x_1-x_2 plane with samples from C_1 and C_2 and the boundary d(x) = 0.]

Slide 10 (Review 8): Potential Function Approach, motivated by electromagnetic theory. [Figure: sample space with + samples from C_1 and - samples from C_2.]

Slide 11 (Review 9): Plot of samples from the two classes.

Slide 12 (Review 10): Potential function. Given samples x_k from the two classes C_1 and C_2 (training subsets S_1 and S_2), define the total potential function K(x) = ∑_{x_k ∈ S_1} K(x, x_k) − ∑_{x_k ∈ S_2} K(x, x_k). The decision boundary is K(x) = 0.
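A sketch of the total potential function classifier follows. The lecture's kernel K(x, x_k) was given on an earlier slide as an image; a Gaussian kernel is assumed here, and the sample points are made up.

```python
import numpy as np

# Total potential function classifier, assuming a Gaussian kernel
# K(x, x_k) = exp(-||x - x_k||^2 / (2 * sigma^2)); the kernel used in the
# lecture may differ.
def kernel(x, xk, sigma=1.0):
    return np.exp(-np.sum((x - xk) ** 2) / (2.0 * sigma ** 2))

def total_potential(x, S1, S2):
    return (sum(kernel(x, xk) for xk in S1)
            - sum(kernel(x, xk) for xk in S2))

S1 = [np.array([0.0, 0.0]), np.array([1.0, 0.0])]   # samples labelled C1
S2 = [np.array([3.0, 3.0]), np.array([4.0, 3.0])]   # samples labelled C2
x = np.array([0.5, 0.2])
print("C1" if total_potential(x, S1, S2) > 0 else "C2")
```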

Slide 13 (Review 11): The algorithm converged in 1.75 passes through the data, giving the final discriminant function shown on the slide.

Slide 14: Functional Link Neural Network

Slide 15: Quadratic Functional Link

Slide 16: Fourier Series Functional Link
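The exact expansion terms on slides 15 and 16 were shown as figures. A minimal sketch of both links for a 2-dimensional pattern follows; the particular quadratic terms and the harmonic scaling are conventional choices, assumed here for illustration.

```python
import numpy as np

# Sketches of the quadratic and Fourier-series functional links for a
# 2-dimensional pattern x. The terms shown on the slides may differ.
def quadratic_link(x):
    # [1, x1, x2, x1^2, x1*x2, x2^2]
    x1, x2 = x
    return np.array([1.0, x1, x2, x1**2, x1 * x2, x2**2])

def fourier_link(x, harmonics=2):
    # Original features plus sin/cos of integer multiples of each component.
    terms = [1.0, *x]
    for k in range(1, harmonics + 1):
        for xi in x:
            terms.extend([np.sin(k * np.pi * xi), np.cos(k * np.pi * xi)])
    return np.array(terms)

x = np.array([0.5, -0.25])
print(quadratic_link(x))
print(fourier_link(x))
```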

Slide 17: Principal Component Functional Link. The f_k(x), k = 1 to N, are chosen as the eigenvectors of the sample covariance matrix.
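A short sketch of one way to realize this link: compute the eigenvectors of the sample covariance matrix and use projections of the pattern onto them as the link outputs. Treating the f_k(x) as these projections is an interpretation of the slide, not the lecture's stated construction.

```python
import numpy as np

# Principal-component functional link sketch: link outputs are projections of
# a pattern onto the leading eigenvectors of the sample covariance matrix.
def pca_link(X_train, x, n_components=2):
    mean = X_train.mean(axis=0)
    cov = np.cov(X_train, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)                 # ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return (x - mean) @ top                                # f_k(x), k = 1..N

X_train = np.array([[0.0, 0.1], [1.0, 0.9], [2.1, 2.0], [3.0, 3.2]])
print(pca_link(X_train, np.array([1.5, 1.4])))
```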

Slide 18: Example: comparison of a neural net and a functional link neural net. Given two pattern classes C_1 and C_2 with the following four patterns and their desired outputs.

Slide 19:
(a) Design an artificial neural network to classify the two patterns given.
(b) Design a functional link artificial neural network to classify the patterns given.
(c) Compare the neural net and functional link neural net designs.

Slide 20: (a) Solution: artificial neural net design. Select the following structure.

Slide 21: After training using the training set and the backpropagation algorithm, the design becomes the structure shown, with the values determined by the neural net training.
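A minimal backpropagation sketch for part (a). The actual four patterns, targets, and network structure were given on the slides as figures; the XOR-like patterns, the 2-3-1 sigmoid network, and the plain gradient-descent training below are assumptions made only to show the mechanics.

```python
import numpy as np

# Tiny backpropagation sketch: 2-input, 3-hidden, 1-output sigmoid network,
# squared-error criterion, batch gradient descent. Patterns and targets are
# assumed, not the lecture's.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # assumed patterns
T = np.array([[0.], [1.], [1.], [0.]])                   # assumed targets

W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

eta = 1.0
for epoch in range(5000):
    h = sigmoid(X @ W1 + b1)                  # forward pass
    y = sigmoid(h @ W2 + b2)
    dy = (y - T) * y * (1 - y)                # backward pass
    dh = (dy @ W2.T) * h * (1 - h)
    W2 -= eta * h.T @ dy;  b2 -= eta * dy.sum(axis=0)
    W1 -= eta * X.T @ dh;  b1 -= eta * dh.sum(axis=0)

# Outputs should approach the targets; convergence depends on the random init.
print(np.round(y, 3))
```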

Slide 22: (b) Solution: functional link artificial neural net design.

Slide 23: A neural net was trained using the functional link output patterns as new pattern samples; the resulting weights and structure are shown on the slide.
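A sketch of part (b) in the same spirit: the functional link expands each pattern, and a single neural element is trained on the expanded patterns, which reflects the comparison on the next slide. The quadratic link and the assumed patterns carry over from the earlier sketches; the lecture's actual link and resulting weights were shown as a figure.

```python
import numpy as np

# FLANN sketch: one sigmoid unit trained on functional-link outputs.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # assumed patterns
T = np.array([0., 1., 1., 0.])                           # assumed targets

def quadratic_link(x):
    x1, x2 = x
    return np.array([1.0, x1, x2, x1**2, x1 * x2, x2**2])

F = np.array([quadratic_link(x) for x in X])             # expanded patterns
w = np.zeros(F.shape[1])
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(5000):
    y = sigmoid(F @ w)
    w -= 1.0 * F.T @ ((y - T) * y * (1 - y))             # gradient step

print(np.round(sigmoid(F @ w), 3))
```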

Slide 24: (c) Comparison of the Artificial Neural Net (ANN) and Functional Link Artificial Neural Net (FLANN) designs:
- The FLANN has a simpler structure than the ANN, with only one neural element plus the link.
- Fewer iterations and computations are needed in the training algorithm for the FLANN.
- The FLANN design may be more sensitive to errors in the patterns.

Slide 25: Determining Performance of the Neural Net Design on the Training Set

Slide 26:
Test the design on the testing set: classify each member of the testing set using the neural network design.
Determine performance for the design using the training set: classify each member of the training set using the neural network design.

Slide 27: Could use:
(a) the performance measure E_TOT
(b) the confusion matrix
(c) the probability of error
(d) the Bayes risk

Slide 28: (a) Local and global errors, used in the neural net design procedure: a local measure and a global measure.
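The slide's exact formulas were given as images. A common convention in this setting, assumed here, is a per-pattern squared error as the local measure and E_TOT, the sum over all training patterns, as the global measure.

```python
import numpy as np

# Sketch of local and global error measures; the lecture's exact definitions
# were shown on the slide and may differ in normalization.
def local_error(y, t):
    """Local measure: squared error between output y and target t for one pattern."""
    return float(np.sum((np.asarray(y) - np.asarray(t)) ** 2))

def global_error(Y, T):
    """Global measure E_TOT: sum of the local errors over the training set."""
    return sum(local_error(y, t) for y, t in zip(Y, T))

Y = [[0.1], [0.9], [0.8], [0.2]]       # network outputs
T = [[0.0], [1.0], [1.0], [0.0]]       # desired outputs
print(global_error(Y, T))
```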

Slide 29: (b) Confusion matrix example. The diagonal entries count correct classifications; the off-diagonal entries count incorrect classifications.

Slide 30: (c) Probability of error example: estimates of the probabilities of being correct, and an estimate of the total probability of error.
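A short sketch tying (b) and (c) together: build a confusion matrix from true and assigned class labels, then estimate the probabilities of being correct per class and the total probability of error from its entries. The counts on the slides were given in a figure; the labels below are made up.

```python
import numpy as np

# Confusion matrix and probability-of-error estimates from classified samples.
def confusion_matrix(true_labels, assigned_labels, n_classes=2):
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, a in zip(true_labels, assigned_labels):
        cm[t, a] += 1                           # rows: true class, cols: assigned
    return cm

true_labels     = [0, 0, 0, 1, 1, 1, 1, 0]
assigned_labels = [0, 0, 1, 1, 1, 0, 1, 0]
cm = confusion_matrix(true_labels, assigned_labels)
p_correct = np.diag(cm) / cm.sum(axis=1)        # per-class estimates of P(correct)
p_error = 1.0 - np.trace(cm) / cm.sum()         # estimate of total probability of error
print(cm, p_correct, p_error)
```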

Slide 31: (d) Bayes risk estimate.

Slide 32: Radial Basis Function (RBF) Artificial Neural Network Functional Link

Slide 33: Functional form of the RBF ANN, with examples of nonlinearities (given on the slide as equations).
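The functional form itself was an image on the slide. A common form, assumed here, is F(x) = w_0 + ∑_j w_j φ(||x − z_j||), where the z_j are prototype vectors and φ is a radial nonlinearity such as a Gaussian; the sketch below evaluates it.

```python
import numpy as np

# Assumed RBF-ANN functional form: F(x) = w0 + sum_j w_j * phi(||x - z_j||),
# with a Gaussian nonlinearity phi(r) = exp(-r^2 / (2 * sigma^2)). The slide's
# exact form and nonlinearity examples may differ.
def rbf_output(x, centers, weights, bias, sigma=1.0):
    r = np.linalg.norm(centers - x, axis=1)       # distances to prototypes z_j
    phi = np.exp(-r**2 / (2.0 * sigma**2))
    return bias + phi @ weights

centers = np.array([[0.0, 0.0], [2.0, 2.0]])      # prototypes z_1, z_2
weights = np.array([1.0, -1.0])
print(rbf_output(np.array([0.5, 0.3]), centers, weights, bias=0.0))
```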

Slide 34: Design using an RBF ANN. Let F(x_1, x_2, ..., x_n) represent the function we wish to approximate; for pattern classification, F(x) represents the class assignment or desired output (target value) for each pattern vector x in the training set. Define the performance measure E (given on the slide), and minimize E by selecting M, the network parameters listed on the slide, and the prototypes z_1, z_2, ..., z_M.

Slide 35: Finding the best approximation using an RBF ANN. The design is usually broken into two parts:
(1st) Find the number M of prototypes and the prototypes {z_j : j = 1, 2, ..., M} by using a clustering algorithm (presented in Chapter 6) on the training samples.
(2nd) With M and {z_j : j = 1, 2, ..., M} fixed, find the remaining parameters that minimize E.
Notes: You can use any minimization procedure you wish; training does not use the backpropagation algorithm. A sketch of this two-stage procedure follows.
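This sketch follows the two-stage outline above under stated assumptions: a tiny k-means routine stands in for the Chapter 6 clustering algorithm, the second stage fits the output weights by least squares rather than backpropagation, and M, sigma, and the Gaussian nonlinearity are illustrative choices.

```python
import numpy as np

# Two-stage RBF design sketch: (1st) choose prototypes z_j by clustering,
# (2nd) fix them and fit the output weights by minimizing the squared error.
def kmeans(X, M, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), M, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(M)])
    return centers

def design_matrix(X, centers, sigma=1.0):
    d2 = ((X[:, None, :] - centers) ** 2).sum(-1)
    return np.hstack([np.ones((len(X), 1)), np.exp(-d2 / (2 * sigma**2))])

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # training patterns
t = np.array([0., 1., 1., 0.])                           # desired outputs F(x)
Z = kmeans(X, M=4)                                       # 1st stage: prototypes
Phi = design_matrix(X, Z)
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)              # 2nd stage: minimize E
print(np.round(Phi @ w, 3))
```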

Slide 36: Problems Using Neural Network Designs
- Failure to converge
  - Max iterations too small
  - Lockup occurs
  - Limit cycles
- Good performance on the training set but poor performance on the testing set
  - Training set not representative of the variation
  - Too strict a tolerance ("grandmothering")
  - Selection of insufficient structure

Slide 37: Advantages of Neural Network Designs
- Can obtain a design for very complicated problems.
- The parallel structure using identical elements allows hardware or software implementation.
- The structure of a neural network design is similar for all problems.

Slide 38: Other problems that can be solved using Neural Network Designs
- System identification
- Functional approximation
- Control systems
- Any problem that can be placed in the format of a clearly defined desired output for different given input vectors.

Slide 39: Famous quotation: "Neural network designs are the second best way to solve all problems."

Slide 41: "Neural network designs are the second best way to solve all problems." The promise is that a neural network can be used to solve all problems; however, there is always a better way to solve a specific problem.

Slide 42: So what is the best way to solve a given problem?

Slide 44: A design that uses and understands the structure of the data!

Slide 45: Summary, Lecture 24
1. Reviewed and motivated the link structure
2. Presented the Functional Link Artificial Neural Network
3. Presented a simple example with designs using an ANN and a FLANN
4. Described performance measures for neural network designs
5. Presented Radial Basis Function neural networks

Slide 46: 6. Discussed problems, advantages, disadvantages, and the promise of artificial neural network design

Slide 47: End of Lecture 24

