Pattern Recognition: Statistical and Neural
Lonnie C. Ludeman
Lecture 20, Oct 26, 2005
Nanjing University of Science & Technology


Slide 2: Lecture 20 Topics
1. Perceptron Algorithm Revisited
2. Local Delta Training Algorithm for an ANE
3. General Definition of Neural Networks
4. Basic Neural Network Structures: Examples
5. Analysis and Synthesis of Neural Networks

Slide 3: Review: Perceptron training algorithm (signum function activation)
Weight update: w(p+1) = w(p) + η { d[x(p)] - sgn(w^T(p) x(p)) } x(p)
Desired output: y = +1 if input vector x is from C1, y = -1 if x is from C2
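The fixed-increment rule above can be sketched in code. A minimal illustration (the data, function names, and learning rate are assumptions, not from the slides):

```python
import numpy as np

def train_perceptron(X, d, eta=1.0, epochs=100):
    """Perceptron training with signum activation.

    X: (K, n) patterns, each augmented with a constant bias component.
    d: (K,) desired outputs, +1 for class C1 and -1 for class C2.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for x, target in zip(X, d):
            y = 1.0 if w @ x >= 0 else -1.0   # signum activation
            if y != target:                   # update only on a misclassification
                w = w + eta * target * x
                errors += 1
        if errors == 0:                       # every pattern classified correctly
            break
    return w

# A small linearly separable example (logical AND, bias input appended).
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
d = np.array([-1, -1, -1, 1], dtype=float)
w = train_perceptron(X, d)
print(all((1 if w @ x >= 0 else -1) == t for x, t in zip(X, d)))  # True
```

For linearly separable data the loop is guaranteed to stop, by the perceptron convergence theorem.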

Slide 4: Question: How do we train an Artificial Neural Element (ANE) to do classification?
Answer: Use the Delta Training Algorithm!

Slide 5: Given an Artificial Neural Element (a single unit with output f(net), net = w^T x), we wish to find a weight vector w such that the training patterns are correctly classified.

Slide 6: Given:
x(p) ∈ { x_1, x_2, …, x_K }, with desired outputs d[x(p)] ∈ { d(x_1), d(x_2), …, d(x_K) }
Define a performance measure E_p for sample x(p) and decision d[x(p)] as
E_p = ½ ( d[x(p)] - f(net) )², where net = w^T(p) x(p)

Slide 7: Derivation of the delta weight update equation
Use the gradient method to minimize E_p. The new weight w(p+1) in terms of the previous weight w(p) is
w(p+1) = w(p) - η ∇_w E_p
where the gradient is
∇_w E_p = - ( d[x(p)] - f(net) ) f′(net) x(p)

Slide 8: Substituting the gradient vector into the weight update gives the General Local Delta Algorithm weight update equation:
w(p+1) = w(p) + η { d[x(p)] - f(net) } f′(net) x(p), where net = w^T(p) x(p)
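The general update can be sketched as one function that takes the activation and its derivative as arguments (a minimal sketch; the names, sample values, and η = 0.1 are assumptions):

```python
import numpy as np

def delta_update(w, x, d, f, f_prime, eta=0.1):
    """One local delta-rule step:
    w(p+1) = w(p) + eta * (d - f(net)) * f'(net) * x, with net = w^T x."""
    net = w @ x
    return w + eta * (d - f(net)) * f_prime(net) * x

# One step with a tanh activation shrinks the output error on this sample.
f = np.tanh
f_prime = lambda net: 1.0 - np.tanh(net) ** 2
w = np.zeros(3)
x = np.array([1.0, 0.5, 1.0])
w_new = delta_update(w, x, 1.0, f, f_prime)
print(abs(1.0 - f(w_new @ x)) < abs(1.0 - f(w @ x)))  # True
```

Plugging in a specific f and f′ yields each of the three cases on the following slides.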

Slide 9: This is sometimes called the Continuous Perceptron Training Algorithm.

Slide 10: Case 1: Local Delta Algorithm for training an ANE with the logistic activation function
Given: f(net) = 1 / (1 + e^(-net))
Solution: the derivative is f′(net) = f(net) (1 - f(net))

Slide 11: Substituting the derivative gives the local algorithm for the logistic activation function:
w(p+1) = w(p) + η ( d[x(p)] - f(net) ) f(net) (1 - f(net)) x(p)
(Local weight update equation for the logistic activation function)
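As a sketch of Case 1 (the sample, desired value d = 1, and η = 0.5 are assumptions), repeated local steps drive the logistic output toward the desired value:

```python
import numpy as np

def logistic(net):
    return 1.0 / (1.0 + np.exp(-net))

def logistic_delta_step(w, x, d, eta=0.5):
    """Local delta step for the logistic activation:
    f'(net) = f(net) * (1 - f(net)), so
    w(p+1) = w(p) + eta * (d - f) * f * (1 - f) * x."""
    f = logistic(w @ x)
    return w + eta * (d - f) * f * (1.0 - f) * x

# Repeated steps on one sample push the output toward d = 1.
w = np.zeros(2)
x = np.array([1.0, 1.0])
for _ in range(500):
    w = logistic_delta_step(w, x, 1.0)
print(logistic(w @ x) > 0.9)  # True
```

Note the factor f(1 - f) vanishes as the output saturates near 0 or 1, so progress slows; this is one motivation for the scale-factor discussion on slides 13-14.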

Slide 12: Case 2: Local Delta Algorithm for training an ANE with the hyperbolic tangent activation function
Given: f(net) = tanh(net)
Solution: taking the derivative of the nonlinearity, f′(net) = 1 - f²(net), and substituting into the general update equation yields
w(p+1) = w(p) + η ( d[x(p)] - f(net) ) ( 1 - f²(net) ) x(p)
(Local weight update equation for the hyperbolic tangent activation function)

Slide 13: Scale factors for Case 2 (tanh activation function)
SF = ( d[x(p)] - f(net) ) ( 1 - f²(net) )
For d[x(p)] = +1: SF_+1 = ( 1 - f(net) ) ( 1 - f²(net) )
For d[x(p)] = -1: SF_-1 = ( -1 - f(net) ) ( 1 - f²(net) )

Slide 14: Scale factors for Case 2: tanh activation function (desired values = +0.9 and -0.9)

Slide 15: Case 3: Local Delta Algorithm for training an ANE with a linear activation function
Given: f(net) = net
Solution: taking the derivative, f′(net) = 1, and substituting into the general update equation gives
w(p+1) = w(p) + η ( d[x(p)] - w^T(p) x(p) ) x(p)
(Local weight update equation for the linear activation function, the Widrow-Hoff training rule)
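A minimal sketch of the Widrow-Hoff rule (the data, target weights, η, and epoch count are assumptions); on noiseless linear data it recovers the underlying weight vector:

```python
import numpy as np

def widrow_hoff(X, d, eta=0.05, epochs=200):
    """Widrow-Hoff (LMS) training for a linear activation f(net) = net.
    Since f'(net) = 1, the update is w(p+1) = w(p) + eta * (d - w^T x) * x."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            w = w + eta * (target - w @ x) * x
    return w

# Recover a known linear map d = 2*x1 - 1*x2 from noiseless samples.
X = np.array([[1, 0], [0, 1], [1, 1], [2, 1]], dtype=float)
d = X @ np.array([2.0, -1.0])
w = widrow_hoff(X, d)
print(np.allclose(w, [2.0, -1.0], atol=1e-2))  # True
```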

Slide 16: General Global Delta Algorithm
Define a performance measure E_TOT over all samples x_k and decisions d[x_k] as
E_TOT = ½ Σ_k ( d[x_k] - f(net_k) )², where net_k = w^T x_k
Using the gradient technique gives the Global Delta Algorithm (global weight update equation):
w(new) = w(old) + η Σ_k ( d[x_k] - f(net_k) ) f′(net_k) x_k
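The global (batch) variant accumulates the gradient over all samples before updating. A sketch under assumed data and a tanh activation (all names and values are illustrative):

```python
import numpy as np

def global_delta_epoch(w, X, d, f, f_prime, eta=0.1):
    """One global (batch) delta step: sum the gradient of
    E_TOT = 1/2 * sum_k (d_k - f(w^T x_k))^2 over ALL samples,
    then apply a single weight update."""
    nets = X @ w
    grad = -((d - f(nets)) * f_prime(nets)) @ X   # gradient of E_TOT w.r.t. w
    return w - eta * grad

# One batch step with tanh lowers the total error E_TOT.
f = np.tanh
f_prime = lambda net: 1.0 - np.tanh(net) ** 2
X = np.array([[1.0, 1.0], [-1.0, 1.0]])
d = np.array([0.9, -0.9])
E = lambda w: 0.5 * np.sum((d - f(X @ w)) ** 2)
w0 = np.zeros(2)
w1 = global_delta_epoch(w0, X, d, f, f_prime)
print(E(w1) < E(w0))  # True
```

The only difference from the local algorithm is when the update is applied: per sample (local) versus once per pass over the whole training set (global).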

Slide 17: Definitions
A Neural Network is defined as any connection of neural elements.
An Artificial Neural Network is defined as any connection of Artificial Neural Elements.

Slide 18: Examples of Artificial Neural Networks
Feedforward Artificial Neural Networks:
(a) Two-layer neural network
(b) Special three-layer form: Hyperplane-AND-OR structure
(c) General three-layer feedforward structure and nomenclature
Feedback Artificial Neural Networks:
(d) One-layer Hopfield net
(e) Two-layer feedback

Slide 19: (a) Example: Two-layer neural network using the signum nonlinearity

Slide 20: (b) Special Hyperplane-AND-OR structure
input x → Layer 1 (hyperplanes) → Layer 2 (logical AND) → Layer 3 (logical OR) → output y

Slide 21: Building block: Hyperplane (a single threshold unit that tests which side of a hyperplane the input lies on)

Slide 22: Building block: AND (a unit-step element with bias -(n - ½), which outputs 1 only when all n of its binary inputs equal 1)

Slide 23: Building block: OR (a unit-step element with bias -½, which outputs 1 when any of its binary inputs equals 1)

Slide 24: (b) Example: Hyperplane-AND-OR structure
Hyperplanes layer → AND layer → OR layer, with all activations f(·) = u(·), the unit step
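The three building blocks compose as follows. A hypothetical sketch (the unit-square decision region, weights, and group layout are assumptions) using the bias conventions from slides 22-23: -(n - ½) for an n-input AND and -½ for OR:

```python
import numpy as np

u = lambda v: (v >= 0).astype(float)   # unit step u(.)

def hyperplane_and_or(x, W1, b1, and_groups):
    """Hyperplane-AND-OR classifier with unit-step activations throughout.
    Layer 1: h = u(W1 x + b1) tests each hyperplane.
    Layer 2: AND of each group of n hyperplane outputs via bias -(n - 1/2).
    Layer 3: OR of the AND outputs via bias -1/2."""
    h = u(W1 @ x + b1)
    ands = np.array([u(h[g].sum() - (len(g) - 0.5)) for g in and_groups])
    return u(ands.sum() - 0.5)

# Hypothetical decision region: the unit square 0 < x1 < 1, 0 < x2 < 1,
# built from four hyperplanes ANDed into one region, then ORed.
W1 = np.array([[1, 0], [-1, 0], [0, 1], [0, -1]], dtype=float)
b1 = np.array([0.0, 1.0, 0.0, 1.0])    # x1 > 0, x1 < 1, x2 > 0, x2 < 1
groups = [[0, 1, 2, 3]]
print(hyperplane_and_or(np.array([0.5, 0.5]), W1, b1, groups))  # 1.0 (inside)
print(hyperplane_and_or(np.array([2.0, 0.5]), W1, b1, groups))  # 0.0 (outside)
```

Unions of several convex regions are handled by adding one AND group per region; the OR layer fires if any group is satisfied.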

Slide 25: (c) General feedforward structure

Slide 26: (d) Example: Feedback structure, one layer

Slide 27: (e) Example: Feedback structure, two layer

Slide 28: Definitions
Analysis of Neural Networks: given a neural network, describe the output for all inputs (mathematically or computer generated).
Synthesis of Neural Networks: given a list of properties and requirements, build a neural network that satisfies the requirements (mathematically or computer generated).

Slide 29: Example: Analyze the following neural network (weights shown in the slide figure). Determine the output y1(2) for all (x1, x2). Solution: (next lecture)

Slide 30: Example: Synthesize a neural network. Given the following decision regions, build a neural network to perform the classification process. Solution: use the Hyperplane-AND-OR structure (next lecture)

Slide 31: Summary, Lecture 20
1. Perceptron Algorithm Revisited
2. Local Delta Training Algorithms for an ANE
3. General Definition of Neural Networks
4. Basic Neural Network Structures: Examples
5. Analysis and Synthesis of Neural Networks

Slide 32: Question: How do we train an Artificial Neural Network to perform classification?
Answer: There is no simple answer, but we will look at one way that uses the backpropagation algorithm to do the training. Not today; we have to wait until Friday.

Slide 33: End of Lecture 20

