
1 Introduction

2 Course Objectives This course gives an introduction to basic neural network architectures and learning rules. Emphasis is placed on the mathematical analysis of these networks, on methods of training them, and on their application to practical engineering problems in such areas as pattern recognition, signal processing, and control systems.

3 What Will Not Be Covered
Review of all architectures and learning rules
Implementation (VLSI, optical, parallel computers)
Biology
Psychology

4 Historical Sketch
Pre-1940: von Helmholtz, Mach, Pavlov, etc.
General theories of learning, vision and conditioning; no specific mathematical models of neuron operation
1940s: Hebb, McCulloch and Pitts
Mechanism for learning in biological neurons; neural-like networks can compute any arithmetic function
1950s: Rosenblatt, Widrow and Hoff
First practical networks and learning rules
1960s: Minsky and Papert
Demonstrated limitations of existing neural networks; new learning algorithms were not forthcoming and some research was suspended
1970s: Amari, Anderson, Fukushima, Grossberg, Kohonen
Progress continues, although at a slower pace
1980s: Grossberg, Hopfield, Kohonen, Rumelhart, etc.
Important new developments cause a resurgence in the field

5 Applications
Aerospace: high-performance aircraft autopilots, flight path simulations, aircraft control systems, autopilot enhancements, aircraft component simulations, aircraft component fault detectors
Automotive: automobile automatic guidance systems, warranty activity analyzers
Banking: check and other document readers, credit application evaluators
Defense: weapon steering, target tracking, object discrimination, facial recognition, new kinds of sensors, sonar, radar and image signal processing including data compression, feature extraction and noise suppression, signal/image identification
Electronics: code sequence prediction, integrated circuit chip layout, process control, chip failure analysis, machine vision, voice synthesis, nonlinear modeling

6 Applications
Financial: real estate appraisal, loan advising, mortgage screening, corporate bond rating, credit line use analysis, portfolio trading programs, corporate financial analysis, currency price prediction
Manufacturing: manufacturing process control, product design and analysis, process and machine diagnosis, real-time particle identification, visual quality inspection systems, beer testing, welding quality analysis, paper quality prediction, computer chip quality analysis, analysis of grinding operations, chemical product design analysis, machine maintenance analysis, project bidding, planning and management, dynamic modeling of chemical process systems
Medical: breast cancer cell analysis, EEG and ECG analysis, prosthesis design, optimization of transplant times, hospital expense reduction, hospital quality improvement, emergency room test advisement

7 Applications
Robotics: trajectory control, forklift robots, manipulator controllers, vision systems
Speech: speech recognition, speech compression, vowel classification, text-to-speech synthesis
Securities: market analysis, automatic bond rating, stock trading advisory systems
Telecommunications: image and data compression, automated information services, real-time translation of spoken language, customer payment processing systems
Transportation: truck brake diagnosis systems, vehicle scheduling, routing systems

8 Biology
• Neurons respond slowly
• The brain uses massively parallel computation
– ≈10^11 neurons in the brain
– ≈10^4 connections per neuron

9 Biology The dendrites are tree-like receptive networks of nerve fibers that carry electrical signals into the cell body. The cell body effectively sums and thresholds these incoming signals. The axon is a single long fiber that carries the signal from the cell body out to other neurons. The point of contact between an axon of one cell and a dendrite of another cell is called a synapse.

10 Neuron Model

11 Neuron Model
The weight “w” corresponds to the strength of a synapse.
The cell body is represented by the summation and the transfer function.
The neuron output “a” represents the signal on the axon.

12 Single-Input Neuron Model
The scalar input “p” is multiplied by the scalar weight “w” to form “wp”, one of the terms that is sent to the summer. The other input, 1, is multiplied by a bias “b” and then passed to the summer. The summer output “n”, often referred to as the net input, goes into a transfer function, which produces the scalar neuron output “a”.
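As a minimal sketch in plain MATLAB (the numeric values of w, b, and p here are assumptions chosen only for illustration, not values from the slides):
w = 3;  b = -1.5;  p = 2;    % scalar weight, bias and input (assumed values)
n = w*p + b;                 % net input sent to the transfer function: n = 4.5
f = @(n) double(n >= 0);     % one possible transfer function (a hard limit)
a = f(n)                     % scalar neuron output, here a = 1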

13 Neuron Model Transfer Functions
The hard limit transfer function sets the output of the neuron to 0 if the function argument is less than 0, and to 1 if the argument is greater than or equal to 0.
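A one-line sketch of this rule in plain MATLAB, applied to a few assumed sample net inputs:
hardlim_fn = @(n) double(n >= 0);   % 0 where n < 0, 1 where n >= 0
hardlim_fn([-2 -0.5 0 0.7 3])       % returns 0  0  1  1  1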

14 Neuron Model Transfer Functions
The output of a linear transfer function is equal to its input: a = n.

15 Neuron Model Transfer Functions
This transfer function (the log-sigmoid) takes the input and squashes the output into the range 0 to 1, according to the expression a = 1 / (1 + e^(-n)).
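A sketch of this transfer function in plain MATLAB (the sample net inputs are assumed for illustration):
logsig_fn = @(n) 1 ./ (1 + exp(-n));   % squashes any real n into the range (0, 1)
logsig_fn([-10 -1 0 1 10])             % approximately 0, 0.27, 0.5, 0.73, 1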

16 Multiple-Input Neuron Model
The neuron has a bias b, which is summed with the weighted inputs to form the net input n:
n = Wp + b
The neuron output can be written as:
a = f(Wp + b)

17 Multiple-Input Neuron Model
The neuron has a bias b, which is summed with the weighted inputs to form the net input n:
n = Wp + b
The neuron output can be written as:
a = f(Wp + b)
The first index of each weight indicates the particular neuron destination for that weight; the second index indicates the source of the signal fed to the neuron.
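A minimal sketch of a multiple-input neuron in plain MATLAB; the weight, bias, and input values are illustrative assumptions:
W = [1 2 -1];            % 1 x R weight row: W(1,j) connects input source j to this neuron
b = 0.5;                 % bias
p = [2; -1; 3];          % R x 1 input vector (assumed values)
n = W*p + b;             % net input: 1*2 + 2*(-1) + (-1)*3 + 0.5 = -2.5
a = 1/(1 + exp(-n))      % neuron output with a log-sigmoid transfer function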

18 Multiple-Input Neuron Model Abbreviated Notation

19 Network Architectures A Layer of Neurons

20 Network Architectures A Layer of Neurons
Each of the R inputs is connected to each of the neurons. The layer includes the weight matrix W, the summers, the bias vector b, the transfer function boxes, and the output vector a.
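A sketch of one layer in plain MATLAB; the layer sizes and the random weights, biases, and inputs are assumptions for illustration:
R = 3;  S = 4;                     % assumed number of inputs and neurons
W = randn(S, R);                   % row i of W holds the weights of neuron i
b = randn(S, 1);                   % bias vector
p = randn(R, 1);                   % input vector
a = 1 ./ (1 + exp(-(W*p + b)))     % S x 1 output vector (log-sigmoid layer)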

21 Network Architectures A Layer of Neurons
Abbreviated Notation

22 Network Architectures Multiple Layers of Neurons
Each layer has its own weight matrix, its own bias vector, a net input vector, and an output vector. The number of the layer is written as a superscript on the name of each of these variables. A layer whose output is the network output is called an output layer. The other layers are called hidden layers.
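A sketch of a three-layer feedforward pass in plain MATLAB, writing the layer superscripts as W1, b1, a1, and so on; the layer sizes and random values are assumptions for illustration:
logsig = @(n) 1 ./ (1 + exp(-n));      % transfer function used in the hidden layers
p  = randn(4, 1);                      % R = 4 network inputs
W1 = randn(5, 4);  b1 = randn(5, 1);   % hidden layer 1: 5 neurons
W2 = randn(3, 5);  b2 = randn(3, 1);   % hidden layer 2: 3 neurons
W3 = randn(2, 3);  b3 = randn(2, 1);   % output layer: 2 neurons
a1 = logsig(W1*p  + b1);
a2 = logsig(W2*a1 + b2);
a3 = W3*a2 + b3                        % network output (linear output layer)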

23 Network Architectures Multiple Layers of Neurons
Abbreviated Notation

24 Simulation using MATLAB
Neuron output=?

25 Simulation using MATLAB
To set up this feedforward network:
net = newlin([1 3;1 3],1);

26 Simulation using MATLAB
To set up this feedforward network:
net = newlin([1 3;1 3],1);
Assignments:
net.IW{1,1} = [1 2];
net.b{1} = 0;
P = [ ; ];

27 Simulation using MATLAB
To set up this feedforward network:
net = newlin([1 3;1 3],1);
Assignments:
net.IW{1,1} = [1 2];
net.b{1} = 0;
P = [ ; ];
Simulate the network:
A = sim(net,P)
A =
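Because newlin builds a purely linear network, the same output can be reproduced with plain matrix arithmetic as a quick check. The columns of P below are assumed example inputs taken from the [1, 3] input range given to newlin; the actual values used on the slide are not shown:
P = [1 2 2 3; 2 1 3 1];      % each column is one two-element input vector (assumed values)
W = [1 2];  b = 0;           % the same weight and bias assigned to the network above
A_check = W*P + b            % matches sim(net,P) for these inputs: 5  4  8  5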

