1
Introduction to Neural Networks John Paxton Montana State University Summer 2003
2
Textbook: Fundamentals of Neural Networks: Architectures, Algorithms, and Applications, Laurene Fausett, Prentice-Hall, 1994
3
Chapter 1: Introduction Why Neural Networks? Training techniques exist. High-speed digital computers and specialized hardware are available. Neural networks better capture how biological neural systems operate.
4
Who is interested? Electrical Engineers – signal processing, control theory Computer Engineers – robotics Computer Scientists – artificial intelligence, pattern recognition Mathematicians – modelling tool when explicit relationships are unknown
5
Characterizations Architecture – a pattern of connections between neurons Learning Algorithm – a method of determining the connection weights Activation Function – a function that maps a neuron's net input to its output signal
6
Problem Domains Storing and recalling patterns Classifying patterns Mapping inputs onto outputs Grouping similar patterns Finding solutions to constrained optimization problems
7
A Simple Neural Network [Figure: inputs x1 and x2 connect to output unit y through weights w1 and w2.] The net input is y_in = x1*w1 + x2*w2. The activation is f(y_in).
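As a concrete companion to the slide above, here is a minimal Python sketch of the two-input neuron: it forms the net input y_in = x1*w1 + x2*w2 and passes it through an activation f. The particular input values, weight values, and the choice of a binary step for f are illustrative assumptions, not values from the slides.

```python
# Minimal sketch of the simple neuron on the slide:
# y_in = x1*w1 + x2*w2, output is f(y_in).
# The inputs, weights, and step activation below are illustrative assumptions.

def f(y_in, theta=0.5):
    """Binary step activation: output 1 if the net input reaches theta."""
    return 1 if y_in >= theta else 0

x1, x2 = 1.0, 0.0   # example inputs
w1, w2 = 0.7, 0.4   # example weights

y_in = x1 * w1 + x2 * w2   # net input, as on the slide
y = f(y_in)                # activation
print(y_in, y)             # -> 0.7 1
```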
8
Biological Neuron [Figure: soma, dendrite, axon.] Dendrites receive electrical signals, affected by chemical processes. The soma fires at differing frequencies.
9
Observations A neuron can receive many inputs Inputs may be modified by weights at the receiving dendrites A neuron sums its weighted inputs A neuron can transmit an output signal The output can go to many other neurons
10
Features Information processing is local Memory is distributed (short term = signals, long term = dendrite weights) The dendrite weights learn through experience The weights may be inhibitory or excitatory
11
Features Neurons can generalize to novel input stimuli Neurons are fault tolerant and can sustain damage
12
Applications Signal processing, e.g. suppressing noise on a phone line. Control, e.g. backing up a truck with a trailer. Pattern recognition, e.g. handwritten characters or identifying the sex of a face. Diagnosis, e.g. arrhythmia classification or mapping symptoms to a medical case.
13
Applications Speech production, e.g. NETtalk (Sejnowski and Rosenberg, 1986). Speech recognition. Business, e.g. mortgage underwriting (Collins et al., 1988). Game playing, e.g. TD-Gammon.
14
Single Layer Feedforward NN [Figure: inputs x1 … xn fully connected to outputs y1 … ym through weights w11 … wnm.]
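A short sketch of what this architecture computes, assuming NumPy, a 3-by-2 weight matrix, and a binary sigmoid activation (all illustrative choices): each output unit receives the weighted sum of all inputs.

```python
import numpy as np

# Sketch of one forward pass through a single-layer feedforward net:
# inputs x1..xn connect to outputs y1..ym through weight matrix W,
# where W[i, j] plays the role of w_ij on the slide.
# The sizes, random weights, and sigmoid activation are illustrative assumptions.

n, m = 3, 2
rng = np.random.default_rng(0)
W = rng.normal(size=(n, m))        # weight matrix of entries w_ij
x = np.array([1.0, 0.5, -1.0])     # example input vector

y_in = x @ W                       # net input to each output unit
y = 1.0 / (1.0 + np.exp(-y_in))    # binary sigmoid activation
print(y)
```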
15
Multilayer Neural Network More powerful. Harder to train. [Figure: inputs x1 … xn, hidden units z1 … zp, outputs y1 … ym.]
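The same idea extended by one hidden layer, again as a hedged sketch: the layer sizes, random weights, and sigmoid activation below are assumptions made for illustration, not taken from the slides.

```python
import numpy as np

# Sketch of a two-layer (one hidden layer) forward pass:
# x (n inputs) -> z (p hidden units) -> y (m outputs).
# Layer sizes, random weights, and the sigmoid activation are illustrative assumptions.

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

n, p, m = 3, 4, 2
rng = np.random.default_rng(1)
V = rng.normal(size=(n, p))     # input-to-hidden weights
W = rng.normal(size=(p, m))     # hidden-to-output weights

x = np.array([0.2, -0.7, 1.0])  # example input
z = sigmoid(x @ V)              # hidden activations z1..zp
y = sigmoid(z @ W)              # output activations y1..ym
print(y)
```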
16
Setting the Weights Supervised Unsupervised Fixed-weight nets
17
Activation Functions Identity: f(x) = x Binary step: f(x) = 1 if x >= θ, f(x) = 0 otherwise Binary sigmoid: f(x) = 1 / (1 + e^(-x))
18
Activation Functions Bipolar sigmoid: f(x) = -1 + 2 / (1 + e^(-x)) Hyperbolic tangent: f(x) = (e^x - e^(-x)) / (e^x + e^(-x))
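For reference, the five activation functions from the two slides above can be written directly in Python. The threshold θ (here `theta`) of the binary step is left as a free parameter on the slide; the default of 0 below is an assumption.

```python
import math

# The activation functions listed on the two slides above, written out in Python.
# The default threshold theta = 0 for the binary step is an assumption;
# the slide leaves it as a free parameter.

def identity(x):
    return x

def binary_step(x, theta=0.0):
    return 1 if x >= theta else 0

def binary_sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def bipolar_sigmoid(x):
    return -1.0 + 2.0 / (1.0 + math.exp(-x))

def tanh_act(x):
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

print(binary_sigmoid(0.0), bipolar_sigmoid(0.0), tanh_act(1.0))
# -> 0.5 0.0 0.7615941559557649
```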
19
History 1943 McCulloch-Pitts neurons 1949 Hebb's law 1958 Perceptron (Rosenblatt) 1960 Adaline, better learning rule (Widrow, Hoff) 1969 Limitations (Minsky, Papert) 1972 Kohonen nets, associative memory
20
History 1977 Brain State in a Box (Anderson) 1982 Hopfield net, constraint satisfaction 1985 ART (Carpenter, Grossberg) 1986 Backpropagation (Rumelhart, Hinton, Williams) 1988 Neocognitron, character recognition (Fukushima)
21
McCulloch-Pitts Neuron [Figure: inputs x1, x2, x3 connect to output unit y.] f(y_in) = 1 if y_in >= θ, and 0 otherwise.
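A minimal sketch of this unit in Python, assuming three binary inputs and a hard threshold θ; the particular weights and threshold used in the example calls are illustrative, not from the slide.

```python
# Sketch of a McCulloch-Pitts unit with three binary inputs, matching the slide:
# the unit fires (outputs 1) when the net input reaches the threshold theta.
# The weights and theta below are illustrative assumptions.

def mcculloch_pitts(inputs, weights, theta):
    y_in = sum(x * w for x, w in zip(inputs, weights))
    return 1 if y_in >= theta else 0

# Example: all weights 1, threshold 2 -> fires when at least two inputs are on.
print(mcculloch_pitts([1, 1, 0], [1, 1, 1], theta=2))   # -> 1
print(mcculloch_pitts([1, 0, 0], [1, 1, 1], theta=2))   # -> 0
```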
22
Exercises 2-input AND 2-input OR 3-input OR 2-input XOR
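One way to work these exercises is to pick weights and a threshold for a single unit and test it over all binary inputs. The candidate weights and thresholds in the sketch below are assumptions offered for checking, not the textbook's answers; XOR is mentioned because no single unit can reproduce it.

```python
from itertools import product

# Checks candidate solutions to the exercises: a single threshold unit with the
# chosen weights and threshold, tested over all binary inputs. The weights and
# thresholds below are one possible answer, offered as an assumption.

def unit(inputs, weights, theta):
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= theta else 0

for name, weights, theta, target in [
    ("2-input AND", (1, 1), 2, lambda a, b: a and b),
    ("2-input OR",  (1, 1), 1, lambda a, b: a or b),
]:
    ok = all(unit(bits, weights, theta) == target(*bits)
             for bits in product((0, 1), repeat=2))
    print(name, "matches" if ok else "fails")

# 3-input OR: weights (1, 1, 1) with threshold 1 works the same way.
# 2-input XOR: no single unit suffices; it needs a small network, e.g.
# z1 = AND(x1, NOT x2), z2 = AND(x2, NOT x1), y = OR(z1, z2).
```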