Chapter 5 Classification: Alternative Techniques, Part 4: Artificial Neural Network Based Classifiers


Slide 1: Chapter 5 Classification: Alternative Techniques, Part 4: Artificial Neural Network Based Classifiers

Slide 2: Artificial Neural Networks (ANN) / 1
Output Y is 1 if at least two of the three inputs are equal to 1.
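A minimal sketch of this example as a single threshold unit. The weights (0.3 per input) and threshold (0.4) are an assumption, taken from the common textbook parameterization of this example rather than read off the slide's figure:

```python
# Threshold unit for "Y = 1 if at least two of X1, X2, X3 are 1".
# The weights (0.3 each) and threshold (0.4) are assumed values; they match
# the usual textbook version of this example, not necessarily the figure.

def majority(x1, x2, x3):
    s = 0.3 * x1 + 0.3 * x2 + 0.3 * x3
    return 1 if s - 0.4 > 0 else 0

# Check the unit against the full truth table.
for x1 in (0, 1):
    for x2 in (0, 1):
        for x3 in (0, 1):
            assert majority(x1, x2, x3) == (1 if x1 + x2 + x3 >= 2 else 0)
```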

Slide 3: Artificial Neural Networks (ANN) / 2

Slide 4: Artificial Neural Networks (ANN) / 3
- The model is an assembly of inter-connected nodes and weighted links.
- The output node sums up its input values according to the weights of its links.
- The weighted sum is compared against some threshold t.
Perceptron model: Y = I(Σ_i w_i X_i − t > 0), or equivalently Y = sign(Σ_i w_i X_i − t).
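A short sketch of the perceptron model in the sign form above; the function name `perceptron_predict` is illustrative, not from the slides:

```python
import numpy as np

def perceptron_predict(w, t, x):
    """Perceptron output: sign(sum_i w_i * x_i - t), i.e., +1 or -1."""
    return 1 if np.dot(w, x) - t > 0 else -1

# Example: the majority function from slide 2, now with +1/-1 outputs.
w = np.array([0.3, 0.3, 0.3])
print(perceptron_predict(w, 0.4, np.array([1, 1, 0])))   # +1
print(perceptron_predict(w, 0.4, np.array([1, 0, 0])))   # -1
```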

Slide 5: General Structure of ANN
Training an ANN means learning the weights of the neurons.

Slide 6: Algorithm for Learning ANN
- Initialize the weights (w_0, w_1, …, w_k).
- Adjust the weights so that the output of the ANN is consistent with the class labels of the training examples.
  - Objective function: E(w) = Σ_i [y_i − f(w, x_i)]²
  - Find the weights w_i that minimize this objective function, e.g., with the backpropagation algorithm.
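A sketch of the objective function in code, assuming the perceptron output f(w, x) = sign(w·x − t) from slide 4; `sse_objective` and the tiny data set are illustrative:

```python
import numpy as np

def sse_objective(w, t, X, y):
    """Sum of squared errors E(w) = sum_i (y_i - f(w, x_i))^2
    for a thresholded unit f(w, x) = sign(w.x - t)."""
    preds = np.where(X @ w - t > 0, 1, -1)
    return np.sum((y - preds) ** 2)

# Tiny usage example with +1/-1 labels (made-up data).
X = np.array([[1.0, 1.0], [0.0, 0.0]])
y = np.array([1, -1])
print(sse_objective(np.array([0.5, 0.5]), 0.6, X, y))  # 0: both classified correctly
```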

Slide 7: Artificial Neural Networks (ANN) / 2

Slide 8: Perceptron

Slide 9: Perceptron Learning Rule / 1
- Let D = {(x_i, y_i) | i = 1, 2, …, N} be the set of training examples.
- Initialize the weights.
- Repeat:
  - For each training example (x_i, y_i):
    - Compute f(w, x_i).
    - For each weight w_j: update the weight.
- Until the stopping condition is met.
(A code sketch combining slides 9 to 11 follows slide 11 below.)

Slide 10: Perceptron Learning Rule / 2
- Weight update formula: w_j^(k+1) = w_j^(k) + λ (y_i − f(w^(k), x_i)) x_ij, where λ is the learning rate.
- Intuition: update the weight based on the error e = y − f(w, x). With labels y and outputs f(w, x) in {−1, +1}:
  - If y = f(w, x), then e = 0 and no update is needed.
  - If y > f(w, x), then e = 2 and the weight must be increased so that f(w, x) will increase.
  - If y < f(w, x), then e = −2 and the weight must be decreased so that f(w, x) will decrease.

Slide 11: Perceptron Learning Rule / 3
Terminating condition: training stops when either
1. all weight changes Δw_ij in the previous epoch (i.e., iteration) were so small as to be below some specified threshold, or
2. the percentage of samples misclassified in the previous epoch is below some threshold, or
3. a pre-specified number of epochs has expired.
In practice, several hundred thousand epochs may be required before the weights converge.
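A minimal sketch combining slides 9 to 11: the perceptron update rule with all three terminating conditions. It assumes labels in {−1, +1} and folds the threshold into a bias weight; the names `perceptron_train`, `weight_tol`, and `error_tol` are illustrative:

```python
import numpy as np

def perceptron_train(X, y, lam=0.1, max_epochs=1000,
                     weight_tol=1e-6, error_tol=0.0):
    """Perceptron learning rule: w_j <- w_j + lam * (y_i - f(w, x_i)) * x_ij.
    X: (N, d) inputs, y: (N,) labels in {-1, +1}."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # fold threshold t into a bias weight
    w = np.zeros(Xb.shape[1])                  # initialize the weights
    for epoch in range(max_epochs):            # condition 3: epoch limit expired
        max_change, errors = 0.0, 0
        for xi, yi in zip(Xb, y):
            pred = 1 if np.dot(w, xi) > 0 else -1  # f(w, x_i)
            e = yi - pred                          # error: 0, +2, or -2
            if e != 0:
                errors += 1
                delta = lam * e * xi
                w += delta
                max_change = max(max_change, np.max(np.abs(delta)))
        # condition 1: all weight changes in the epoch below the threshold
        # condition 2: misclassification rate in the epoch below the threshold
        if max_change < weight_tol or errors / len(y) <= error_tol:
            break
    return w

# Usage: learn the linearly separable AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w = perceptron_train(X, y)
print(np.where(np.hstack([X, np.ones((4, 1))]) @ w > 0, 1, -1))  # [-1 -1 -1  1]
```

On linearly separable data such as the AND function above, the rule converges; slide 14 turns to the nonlinearly separable case, where it cannot.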

Slide 12: Example of Perceptron Learning

Slide 13: Perceptron Learning

Slide 14: Nonlinearly Separable Data

Slide 15: Multilayer Neural Network / 1

Slide 16: Multilayer Neural Network / 2
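Since slides 15 and 16 introduce the multilayer architecture, here is a minimal sketch of a feedforward pass through one hidden layer, assuming sigmoid activations; the layer sizes and weight names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, W1, b1, W2, b2):
    """Forward pass: input -> hidden layer -> output node."""
    h = sigmoid(W1 @ x + b1)   # hidden-layer activations
    o = sigmoid(W2 @ h + b2)   # output activation
    return h, o

# Example: 3 inputs, 2 hidden units, 1 output (random weights).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(2)
W2, b2 = rng.normal(size=(1, 2)), np.zeros(1)
h, o = forward(np.array([1.0, 0.0, 1.0]), W1, b1, W2, b2)
print(o)   # output in (0, 1)
```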

Slide 17: Learning Multilayer Neural Network

Slide 18: Gradient Descent for Multilayer NN / 1

Slide 19: Gradient Descent for Multilayer NN / 2
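A sketch of one gradient-descent step for the two-layer network above, assuming a squared-error loss E = ½(y − o)² and sigmoid activations (for which σ'(z) = σ(z)(1 − σ(z))); all names are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_step(x, y, W1, b1, W2, b2, lam=0.5):
    """One gradient-descent step on E = 0.5 * (y - o)^2 via backpropagation."""
    # Forward pass.
    h = sigmoid(W1 @ x + b1)
    o = sigmoid(W2 @ h + b2)
    # Backward pass: error terms (deltas), using sigma'(z) = sigma * (1 - sigma).
    delta_o = (o - y) * o * (1 - o)            # output-layer delta
    delta_h = (W2.T @ delta_o) * h * (1 - h)   # hidden-layer delta
    # Gradient-descent updates: w <- w - lam * dE/dw.
    W2 -= lam * np.outer(delta_o, h)
    b2 -= lam * delta_o
    W1 -= lam * np.outer(delta_h, x)
    b1 -= lam * delta_h
    return W1, b1, W2, b2

# Usage: repeated steps drive the output toward the target y = 1.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 3)), np.zeros(2)
W2, b2 = rng.normal(size=(1, 2)), np.zeros(1)
x, y = np.array([1.0, 0.0, 1.0]), 1.0
for _ in range(200):
    W1, b1, W2, b2 = gradient_step(x, y, W1, b1, W2, b2)
print(sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2))  # approaches 1
```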

Slide 20: Design Issues in ANN

Slide 21: Characteristics of ANN

