1 Pattern Recognition: Statistical and Neural Lonnie C. Ludeman Lecture 24 Nov 2, 2005 Nanjing University of Science & Technology

2 Lecture 24 Topics
1. Review and motivation for the link structure
2. Present the Functional Link Artificial Neural Network
3. Simple example: designs using an ANN and a FLANN
4. Performance measures for neural network designs
5. Radial Basis Function Neural Networks
6. Problems, advantages, disadvantages, and the promise of artificial neural network design

3 Generalized Linear Discriminant Functions (Review 1) [diagram: the input x feeds M nonlinear functions g1(x), g2(x), …, gM(x), whose outputs are weighted by w1, w2, …, wM and summed] g(x) = w1 g1(x) + w2 g2(x) + … + wM gM(x)
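As an illustration (not from the slides), a minimal Python sketch of evaluating such a discriminant, with hypothetical basis functions and illustrative weights:

```python
import numpy as np

def g(x, weights, basis_fns):
    """Generalized linear discriminant: g(x) = sum_j w_j * g_j(x)."""
    return sum(w * gj(x) for w, gj in zip(weights, basis_fns))

# Hypothetical quadratic basis in two dimensions: 1, x1, x2, x1*x2, x1^2, x2^2.
basis = [lambda x: 1.0,
         lambda x: x[0],
         lambda x: x[1],
         lambda x: x[0] * x[1],
         lambda x: x[0] ** 2,
         lambda x: x[1] ** 2]
w = [-1.0, 0.0, 0.0, 0.0, 1.0, 1.0]   # illustrative weights: a circular boundary

print(g(np.array([2.0, 1.0]), w, basis))   # -> 4.0 > 0
```

The discriminant is linear in the weights even though the g_j(x) are nonlinear in x, which is what lets the linear training machinery apply.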

4 Review 2: In the transformed 3-dimensional space the patterns are linearly separable by a separating plane. [figure omitted in transcript]

5 Review 3: Example: a decision rule using one nonlinear discriminant function g(x). Given g(x) and the decision rule [omitted in transcript], illustrate the decision regions R1 and R2 where we classify as C1 and C2, respectively.

6 Review 4: Solution. [figure: region R1, where we decide C1, and region R2, everywhere else, where we decide C2]

7 Review 5: Find a generalized linear discriminant function that separates the classes. Solution: d(x) = w1 f1(x) + w2 f2(x) + w3 f3(x) + w4 f4(x) + w5 f5(x) + w6 f6(x) = w^T f(x), which is linear in the f space.

8 Review 6: where, in the original pattern space, d(x) is nonlinear. [defining equations for f1(x), …, f6(x) omitted in transcript]

9 Review 7: Decision boundary in the original pattern space. [figure: samples from C1 and C2 in the x1-x2 plane, with the boundary d(x) = 0]

10 Review 8: Potential Function Approach, motivated by electromagnetic theory. [figure: sample space with + samples from C1 and - samples from C2]

11 Review 9: Plot of samples from the two classes. [figure omitted in transcript]

12 Review 10: Potential Function. Given samples x_k from two classes C1 and C2, with S1 and S2 the sets of samples from each class, define the total potential function

K(x) = ∑_{x_k ∈ S1} K(x, x_k) − ∑_{x_k ∈ S2} K(x, x_k)

The decision boundary is K(x) = 0: decide C1 where K(x) > 0 and C2 where K(x) < 0.
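A minimal sketch of the potential-function classifier, assuming a Gaussian potential for K(x, x_k) (the lecture's specific kernel is not shown) and hypothetical samples:

```python
import numpy as np

def gaussian_potential(x, xk, sigma=1.0):
    """One common choice of potential function K(x, xk)."""
    return np.exp(-np.sum((x - xk) ** 2) / (2.0 * sigma ** 2))

def total_potential(x, S1, S2, K=gaussian_potential):
    """K(x) = sum over S1 of K(x, xk) minus sum over S2 of K(x, xk)."""
    return sum(K(x, xk) for xk in S1) - sum(K(x, xk) for xk in S2)

# Hypothetical samples; decide C1 where K(x) > 0, C2 where K(x) < 0.
S1 = [np.array([0.0, 0.0]), np.array([1.0, 0.0])]
S2 = [np.array([3.0, 3.0]), np.array([4.0, 3.0])]
x = np.array([0.5, 0.5])
print("C1" if total_potential(x, S1, S2) > 0 else "C2")   # -> C1
```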

13 Review 11: The algorithm converged in 1.75 passes through the data, giving the final discriminant function [omitted in transcript].

14 Functional Link Neural Network

15 Quadratic Functional Link [expansion equations omitted in transcript]

16 Fourier Series Functional Link [expansion equations omitted in transcript]

17 Principal Component Functional Link: the f_k(x), k = 1 to N, are built from the eigenvectors of the sample covariance matrix.
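Slides 15-17 name three functional links without the accompanying equations; the sketches below show common versions of each (the specific expansions used in the lecture are assumptions here):

```python
import numpy as np

def quadratic_link(x):
    """Augment x with its squares and pairwise products."""
    x = np.asarray(x, dtype=float)
    quad = [x[i] * x[j] for i in range(len(x)) for j in range(i, len(x))]
    return np.concatenate([x, quad])

def fourier_link(x, harmonics=2):
    """Augment x with sine and cosine terms of increasing frequency."""
    x = np.asarray(x, dtype=float)
    trig = [f(k * np.pi * x) for k in range(1, harmonics + 1) for f in (np.sin, np.cos)]
    return np.concatenate([x] + trig)

def pca_link(X):
    """Features f_k(x): projections onto eigenvectors of the sample covariance."""
    X = np.asarray(X, dtype=float)
    _, vecs = np.linalg.eigh(np.cov(X, rowvar=False))   # columns = eigenvectors
    return (X - X.mean(axis=0)) @ vecs

print(quadratic_link([1.0, 2.0]))   # -> [1. 2. 1. 2. 4.]
```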

18 Example: Comparison of a Neural Net and a Functional Link Neural Net. Given two pattern classes C1 and C2 with the following four patterns and their desired outputs [table omitted in transcript]:

19 (a) Design an Artificial Neural Network to classify the two pattern classes. (b) Design a Functional Link Artificial Neural Network to classify the same patterns. (c) Compare the Neural Net and Functional Link Neural Net designs.

20 (a) Solution: Artificial Neural Net Design. Select the following structure: [network diagram omitted in transcript]

21 After training on the training set with the backpropagation algorithm, the design becomes: [diagram omitted in transcript; the weight values were determined by the neural net training]

22 (b) Solution: Functional Link Artificial Neural Net Design

23 A neural net was trained using the functional link output patterns as new pattern samples. The resulting weights and structure are: [diagram omitted in transcript]

24 (c) Comparison of the Artificial Neural Net (ANN) and Functional Link Artificial Neural Net (FLANN) designs:
- The FLANN has a simpler structure than the ANN, with only one neural element and one functional link.
- The FLANN requires fewer iterations and computations in the training algorithm.
- The FLANN design may be more sensitive to errors in the patterns.
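The slides' four training patterns did not survive the transcript. The sketch below assumes XOR-style patterns, the standard case where a single quadratic link term x1*x2 lets one linear element do what an ANN needs a hidden layer for:

```python
import numpy as np

# Hypothetical XOR-like training set (the lecture's actual patterns are not shown).
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
d = np.array([1, -1, -1, 1], dtype=float)        # desired outputs

# Functional link: append the product x1*x2 to each pattern.
F = np.column_stack([X, X[:, 0] * X[:, 1]])

# In the linked space the classes are linearly separable: one element
# with weights (0, 0, 1) and zero bias already reproduces d exactly.
w = np.array([0.0, 0.0, 1.0])
print(np.sign(F @ w))                            # -> [ 1. -1. -1.  1.]
```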

25 Determining Performance of Neural Net Design on Training Set

26 Determine performance of the design on the training set: classify each member of the training set using the neural network design. Test the design on the testing set: classify each member of the testing set using the neural network design.

27 Could use: (a) the performance measure E_TOT, (b) the confusion matrix, (c) the probability of error, or (d) the Bayes risk.

28 (a) Local and global errors, as used in the neural net design procedure. [equations for the local and global measures omitted in transcript]
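The equations themselves were shown as images; a standard reconstruction consistent with backpropagation training: the local measure is the per-pattern squared error E_p = ∑_k (d_k − y_k)² over the output nodes, and the global measure is its total over the training set, E_TOT = ∑_p E_p.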

29 (b) Confusion Matrix example. [matrix omitted in transcript: diagonal entries count correct classifications, off-diagonal entries count incorrect classifications]

30 (c) Probability of Error example. [omitted in transcript: estimates of the probabilities of correct classification for each class, and the estimate of the total probability of error]

31 (d) Bayes Risk Estimate
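A minimal sketch of measures (b)-(d), with hypothetical labels and a hypothetical cost matrix (the slides' numerical examples are not shown):

```python
import numpy as np

def confusion_matrix(true, pred, n_classes):
    """cm[i, j] counts samples of true class i classified as class j."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(true, pred):
        cm[t, p] += 1
    return cm

true = np.array([0, 0, 0, 1, 1, 1, 1, 0])   # hypothetical true classes
pred = np.array([0, 0, 1, 1, 1, 0, 1, 0])   # hypothetical decisions
cm = confusion_matrix(true, pred, 2)

# (c) Estimated total probability of error: the off-diagonal fraction.
p_error = 1.0 - np.trace(cm) / cm.sum()

# (d) Bayes risk estimate: average cost, where cost[i, j] is the cost of
# deciding class j when class i is true (zero cost on the diagonal).
cost = np.array([[0.0, 1.0],
                 [5.0, 0.0]])               # hypothetical costs
risk = (cm * cost).sum() / cm.sum()

print(cm, p_error, risk)                    # -> [[3 1] [1 3]] 0.25 0.75
```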

32 Radial Basis Function (RBF) Artificial Neural Network: a functional link structure. [diagram omitted in transcript]

33 Functional Form of RBF ANN [the defining equation, the definitions of its terms, and the examples of nonlinearities omitted in transcript]
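A standard form consistent with the design steps on the following slides (the slide's own equation is not shown): F(x) = w0 + ∑_{j=1}^{M} w_j φ(‖x − z_j‖), where the z_j are prototype vectors and φ is a radial nonlinearity; common examples are the Gaussian φ(r) = exp(−r²/(2σ²)) and the multiquadric φ(r) = (r² + σ²)^½.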

34 Design Using RBF ANN. Let F(x1, x2, …, xn) represent the function we wish to approximate. For pattern classification, F(x) represents the class assignment or desired output (target value) for each pattern vector x in the training set. Define the performance measure E as the total squared error between the desired and actual outputs over the training set [equation omitted in transcript]. We wish to minimize E by selecting M, the weights, the nonlinearity parameters, and the prototypes z1, z2, …, zM.

35 Finding the Best Approximation using RBF ANN. The design is usually broken into two parts:
(1st) Find the number M of prototypes and the prototypes {z_j : j = 1, 2, …, M} by applying a clustering algorithm (presented in Chapter 6) to the training samples.
(2nd) With M and {z_j : j = 1, 2, …, M} fixed, find the weights and nonlinearity parameters that minimize E.
Notes: You can use any minimization procedure you wish. Training does not use the backpropagation algorithm.
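A minimal sketch of the two-part procedure, assuming Gaussian nonlinearities, k-means for the clustering step, and linear least squares as the minimization procedure (the lecture leaves these choices open):

```python
import numpy as np

def kmeans(X, M, iters=50, seed=0):
    """Part 1: pick M prototypes z_j by clustering the training samples."""
    rng = np.random.default_rng(seed)
    Z = X[rng.choice(len(X), M, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - Z[None]) ** 2).sum(-1), axis=1)
        for j in range(M):
            if np.any(labels == j):
                Z[j] = X[labels == j].mean(axis=0)
    return Z

def design_matrix(X, Z, sigma):
    """Gaussian phi(||x - z_j||) for each sample and prototype, plus a bias column."""
    d2 = ((X[:, None, :] - Z[None]) ** 2).sum(-1)
    return np.column_stack([np.exp(-d2 / (2 * sigma ** 2)), np.ones(len(X))])

# Part 2: with M and the z_j fixed, least squares minimizes E in the weights.
X = np.random.default_rng(1).normal(size=(40, 2))   # hypothetical training samples
F_target = np.sign(X[:, 0] * X[:, 1])               # hypothetical desired outputs
Z = kmeans(X, M=4)
Phi = design_matrix(X, Z, sigma=1.0)
w, *_ = np.linalg.lstsq(Phi, F_target, rcond=None)
print("training squared error E:", ((Phi @ w - F_target) ** 2).sum())
```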

36 Problems Using Neural Network Designs
- Failure to converge: maximum number of iterations too small; lockup occurs; limit cycles.
- Good performance on the training set but poor performance on the testing set: training set not representative of the variation; too strict a tolerance ("grandmothering").
- Selection of insufficient structure.

37 Advantages of Neural Network Designs
- Can obtain a design for very complicated problems.
- The parallel structure of identical elements allows hardware or software implementation.
- The structure of a neural network design is similar for all problems.

38 Other problems that can be solved using Neural Network Designs
- System identification
- Functional approximation
- Control systems
- Any problem that can be placed in the format of a clearly defined desired output for different given input vectors.

39 Famous Quotation “Neural network designs are the second best way to solve all problems”

41 Famous Quotation: “Neural network designs are the second best way to solve all problems.” The promise is that a neural network can be used to solve all problems; the caveat is that there is always a better way to solve a specific problem.

44 So what is the best way to solve a given problem? A design that uses and understands the structure of the data!

45 Summary of Lecture 24
1. Reviewed and motivated the link structure
2. Presented the Functional Link Artificial Neural Network
3. Presented a simple example with designs using an ANN and a FLANN
4. Described performance measures for neural network designs
5. Presented Radial Basis Function Neural Networks

46 6. Discussed Problems, Advantages, Disadvantages, and the Promise of Artificial Neural Network Design

47 End of Lecture 24