
1 Computation in neural networks (M. Meeter)

2 Perceptron learning problem: calculating a function

Input pattern                          Desired output
[+1, +1, -1, -1]                       [+1, -1, +1]
[-1, -1, +1, +1]                       [+1, +1, -1]
[-1, -1, -1, -1], [-1, -1, +1, -1]     [-1, -1, -1]
[-1, +1, +1, -1], [+1, -1, +1, -1]     [-1, +1, +1]
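Mapping input patterns to desired outputs like this is what the perceptron learning rule does. A minimal sketch in plain Python, using the first two pattern pairs above as illustrative training data (learning rate and epoch count are arbitrary choices):

```python
# Perceptron learning rule sketch: 3 threshold output units over 4 inputs.

def step(v):
    # Threshold activation: +1 if the net input is positive, else -1.
    return 1 if v > 0 else -1

def train(patterns, lr=0.1, epochs=20):
    n_in = len(patterns[0][0])
    n_out = len(patterns[0][1])
    w = [[0.0] * n_in for _ in range(n_out)]   # weights w[j][i]
    b = [0.0] * n_out                          # biases shift each unit's threshold
    for _ in range(epochs):
        for x, t in patterns:
            for j in range(n_out):
                y = step(sum(w[j][i] * x[i] for i in range(n_in)) + b[j])
                if y != t[j]:                  # error-correcting update
                    for i in range(n_in):
                        w[j][i] += lr * (t[j] - y) * x[i]
                    b[j] += lr * (t[j] - y)
    return w, b

def predict(w, b, x):
    return [step(sum(wj[i] * x[i] for i in range(len(x))) + bj)
            for wj, bj in zip(w, b)]

patterns = [([+1, +1, -1, -1], [+1, -1, +1]),
            ([-1, -1, +1, +1], [+1, +1, -1])]
w, b = train(patterns)
```

Weights are only changed when a unit's output is wrong, which is what distinguishes error-correcting learning from pure Hebbian learning.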

3 Types of networks & functions

Network                                     Function
Attractor, Hebbian (associative)            completion, autoassociative memory
Feedforward, Hebbian (associative)          association, associative memory
Feedforward, competitive                    clustering
Feedforward, error-correcting: perceptron   categorization, generalization
Feedforward, error-correcting: backprop     same, but nonlinear

4 Types of networks (table repeated from slide 3)

5 Classification [figure: example pattern "A"]

6 Generalization [figure: unlabeled item to classify]

7 Univariate Linear Regression: prediction of values. Regression = generalization.
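Univariate linear regression has a closed-form least-squares solution: slope = cov(x, y) / var(x). A minimal sketch with made-up, noise-free data:

```python
# Least-squares fit of y = slope*x + intercept (example data made up).

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]          # exactly y = 2x + 1, for clarity
slope, intercept = fit_line(xs, ys)
```

Generalization here means evaluating the fitted line at an x that was never in the training data.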

8 Clustering

9 Types of networks (table repeated from slide 3)

10 Perceptron learning problem: classification (discrete)

Prototypical input pattern             Desired output
[+1, +1, -1, -1]                       [+1, -1, +1]
[-1, -1, +1, +1]                       [+1, +1, -1]
[-1, -1, -1, -1], [-1, -1, +1, -1]     [-1, -1, -1]
[-1, +1, +1, -1], [+1, -1, +1, -1]     [-1, +1, +1]

11 Perceptron learning problem (repeated from slide 10)

12 Classification in the perceptron [figure: inputs x_1 ... x_i ... x_n feed a unit through weights w_ji; the unit fires when the weighted sum exceeds a threshold]

13 A quick aside: in the perceptron and similar models, a node becomes active when its net input exceeds 0. That is not always what you want: the continuous versions of the perceptron and of backprop therefore give each node a 'bias', a value that is always added to its input. Effect: the threshold is shifted.
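The threshold shift can be seen numerically: without a bias a one-input unit flips at net input 0; with bias b, the flip point moves to x = -b/w. The weight and bias values below are hypothetical:

```python
def step(v):
    return 1 if v > 0 else 0

w, b = 1.0, 0.5          # hypothetical weight and bias

# Scan x from -2 to 2 and find the smallest x at which the unit turns on.
xs = [i / 10 for i in range(-20, 21)]
flip_no_bias = min(x for x in xs if step(w * x) == 1)
flip_with_bias = min(x for x in xs if step(w * x + b) == 1)
# The bias shifts the flip point from x = 0 to x = -b/w = -0.5.
```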

14 Classification in 2 dimensions [figure: a + region and a - region separated by the line where input = threshold; points near the line are a mixture]

15 Discriminant Analysis produces exactly the same result: find the centers of the two categories, draw the line between them, and take the perpendicular through its midpoint as the discrimination line.
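That construction (centers, connecting line, perpendicular through the midpoint) is the nearest-mean classifier. A sketch with made-up 2-D points:

```python
# Nearest-mean classification: the decision boundary is the perpendicular
# bisector of the segment joining the two class centers (example data made up).

def center(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def classify(x, m0, m1):
    mid = [(a + b) / 2 for a, b in zip(m0, m1)]      # midpoint of the centers
    d = [b - a for a, b in zip(m0, m1)]              # direction from m0 to m1
    # The sign of (x - mid) . d says which side of the bisector x lies on.
    s = sum((xi - mi) * di for xi, mi, di in zip(x, mid, d))
    return 1 if s > 0 else 0

class0 = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]
class1 = [[4.0, 4.0], [5.0, 4.0], [4.0, 5.0]]
m0, m1 = center(class0), center(class1)
```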

16 Univariate Linear Regression: prediction of values. Generalization = regression.

17 Perceptron with linear activation rule [figure: inputs x_1 ... x_i ... x_n, weights w_ji, bias, output y]
v = Σ_i x_i · w_ji
φ(v) = a·v + b
Change the weights with a learning rule that minimizes the squared error e_j².
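Minimizing e_j² with a small gradient step after each pattern is the delta (LMS) rule. A sketch for one linear output unit; the target function the data are generated from is made up:

```python
# Delta rule (LMS) for one linear output unit (target function hypothetical).

def predict(w, b, x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def delta_rule(data, lr=0.05, epochs=200):
    w, b = [0.0] * len(data[0][0]), 0.0
    for _ in range(epochs):
        for x, t in data:
            e = t - predict(w, b, x)       # error on this one pattern
            for i in range(len(w)):
                w[i] += lr * e * x[i]      # gradient step on e^2
            b += lr * e
    return w, b

# Targets generated from y = 2*x1 - x2 + 1 (hypothetical function to recover).
data = [([p / 4, q / 4], 2 * p / 4 - q / 4 + 1)
        for p in range(-4, 5) for q in range(-4, 5)]
w, b = delta_rule(data)
```

Unlike regression on the whole data set at once, the rule sees one pattern at a time, which is the point made in the conclusions slide.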

18 Multivariate Multiple Linear Regression [figure: inputs x_1 ... x_n, two output units y_1 and y_2]
Multivariate = multiple independent variables: multiple inputs X.
Multiple = multiple dependent variables: multiple outputs Y.

19 Linear vs. nonlinear regression [figure: linear fit vs. nonlinear fit of (x, y) data]
Here: quadratic. In general: "wrinkle-fitting".
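A quadratic fit is still linear regression once x² is added as an input feature, and the same per-pattern gradient descent applies. Sketch with made-up, noise-free data:

```python
# Quadratic regression by gradient descent on the features x^2, x, 1
# (made-up data from y = 3x^2 - x + 0.5).

def fit_quadratic(xs, ys, lr=0.01, epochs=20000):
    a = b = c = 0.0                     # model: y ≈ a*x^2 + b*x + c
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            e = y - (a * x * x + b * x + c)
            a += lr * e * x * x
            b += lr * e * x
            c += lr * e
    return a, b, c

xs = [i / 5 for i in range(-5, 6)]
ys = [3 * x * x - x + 0.5 for x in xs]
a, b, c = fit_quadratic(xs, ys)
```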

20 Multi-Layer Perceptron [figure: input vector x = [x_1, x_2, ..., x_i, ..., x_n], hidden units v_j = φ(Σ_i w_ji x_i), outputs y_1, y_2]
Can fit any function: "universal approximators".
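The hidden layer is what buys the nonlinearity. A hand-wired (not learned) two-layer perceptron computes XOR, which no single-layer perceptron can:

```python
# Hand-wired multi-layer perceptron for XOR; weights chosen by hand, not learned.

def step(v):
    return 1 if v > 0 else 0

def xor_mlp(x1, x2):
    h1 = step(x1 + x2 - 0.5)     # hidden unit 1 computes OR
    h2 = step(x1 + x2 - 1.5)     # hidden unit 2 computes AND
    return step(h1 - h2 - 0.5)   # output: OR and not AND = XOR

results = [xor_mlp(p, q) for p in (0, 1) for q in (0, 1)]
```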

21 Overfitting [figures: fits to (x, y) data]
A model that is too simple: bad. A model that is too complex: extremely bad.
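The too-complex case can be made concrete: interpolate slightly noisy linear data with a degree-8 polynomial. The training error is exactly zero, but predictions between the training points go far off. Data and noise below are made up:

```python
# Overfitting demo: a degree-8 polynomial through 9 noisy points from y = x
# fits the training data exactly but predicts badly between the points.

def lagrange(xs, ys, x):
    # Evaluate the unique interpolating polynomial through (xs, ys) at x.
    total = 0.0
    for j, (xj, yj) in enumerate(zip(xs, ys)):
        term = yj
        for m, xm in enumerate(xs):
            if m != j:
                term *= (x - xm) / (xj - xm)
        total += term
    return total

xs = [float(i) for i in range(9)]
ys = [x + 0.5 * (-1) ** int(x) for x in xs]   # y = x plus alternating ±0.5 noise
pred = lagrange(xs, ys, 7.5)                  # true underlying value is 7.5
err = abs(pred - 7.5)                         # several times the noise amplitude
```

A straight-line fit to the same data would stay within the ±0.5 noise band; the interpolating polynomial amplifies the noise instead of averaging it away.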

22 Clustering: competitive learning (next week: ART)

23 Conclusions
Neural networks are similar to statistical analyses:
Perceptron -> categorization / generalization
Backprop -> the same, but nonlinear
Competitive learning -> clustering
But... statistics works on the whole data set at once, networks on one pattern at a time.

24 Feature reduction with PCA

25 Feature extraction with PCA: unsupervised learning, Hebbian learning
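The link between Hebbian learning and PCA can be sketched with Oja's rule: a Hebbian update plus weight decay whose weight vector converges to the first principal component of the inputs. The 2-D data below are made up, with the main axis along (1, 1):

```python
# Oja's rule: Hebbian learning with decay converges to the first principal
# component. Example data vary mostly along the (1, 1) direction.

def oja(data, lr=0.05, epochs=500):
    w = [1.0, 0.0]                      # deterministic starting weights
    for _ in range(epochs):
        for x in data:
            y = sum(wi * xi for wi, xi in zip(w, x))     # the unit's output
            for i in range(len(w)):
                # Hebbian term y*x[i]; the -y^2*w[i] decay keeps |w| bounded.
                w[i] += lr * y * (x[i] - y * w[i])
    return w

# Main variation along (1, 1), small alternating noise along (1, -1).
data = [(t / 5 + 0.1 * (-1) ** t, t / 5 - 0.1 * (-1) ** t)
        for t in range(-5, 6)]
w = oja(data)
# w ends up (up to sign) as the unit vector along the main axis, near (0.71, 0.71)
```

Projecting inputs onto w then gives the reduced one-dimensional feature from the previous slide.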
