Slides from: Doug Gray, David Poole

Neural Networks
● Introduction – What is a neural network and what can it do for me?
● Terminology, Design and Topology
● Data Sets – When too much is not a good thing
● The Back Propagation and other training methods
● Network Pruning
● Network Development – Hints and kinks, practical application notes
● Examples of Neural Networks
● Closed loop control techniques
● The Neural Network Laboratory – Introduction to DeltaV, HYSYS and the OLE connection; develop a soft sensor application

Neural Network Course Objectives ● Neural network fundamentals: structure, training, testing ● How to use data to train and test the network ● Develop DeltaV neural networks ● Laboratory in DeltaV ● Design neural networks in MATLAB and Excel

Neural Networks, Heuristic ● Heuristic vs. deterministic model development; neural networks are heuristic models, similar to statistical models ● HYSYS, ASPEN, etc. are deterministic models built from first principles

For DeltaV Neural Nets ● Understand the basic principles of the DeltaV Neural Network ● Construct DeltaV NNet and Lab Entry function blocks ● Configure the operator interface for these function blocks

DeltaV Neural ● DeltaV Neural is a complete package ● Network Development ● Training, Testing ● Operator Interface, both Neural Network and Lab Entry

DeltaV Neural Applications ● Critical process measurements available only from grab samples: paper properties, food properties, etc. ● Backup or cross-check for a measurement from a sampled or continuous analyzer (mass spec, stack gas analyzer) – Hint: consider the “cost” of a sample analysis; can the neural network “fill in” for skipped samples?

DeltaV Neural Applications ● Virtual sensors: a neural network can be “trained” to calculate the laboratory results, a frequent application, a.k.a. “intelligent sensors” or “soft sensors” (ISTK) ● This information is used by operators to anticipate changes in plant operation ● A network can be developed in much less time than first-principles methods

Example, paper mill physical property

Predict the lab results

Just what is a neural network anyway? ● Neural networks are computer programs that model the functionality of the brain ● Multi-layered feed-forward model, with either single or multiple outputs ● Trained (in statistics this is called regression) by back-propagation or other algorithms ● Non-linear regression

The Neuron ● There are estimates of as many as 100 billion neurons in the human brain. They are often less than 100 microns in diameter and have as many as 10,000 connections to other neurons. ● Each neuron has an axon that acts as a wire carrying its signal to other neurons. The neuron's input paths, called dendrites, gather information from these axons. The connection between a dendrite and an axon is called a synapse. The transmission of signals across the synapse is chemical in nature, and the magnitude of the signal depends on the amount of chemicals (called neurotransmitters) released by the axon. Many drugs work by changing those natural chemicals. The synapse, combined with the processing of information in the neuron, is how the brain's memory process functions.

The neuron is a nerve cell (figure: dendrites, axon, synapses)

Brain – Neural Network Analogy
● Neuron → Cell (node)
● Neural firing rate → Activation
● Synaptic strength → Connection weight
● Synapse → Connection
The cell body contains the nucleus. Dendrites are inputs that carry impulses to the cell body. The axon conducts impulses away from the cell body. Synapses are gaps between neurons that act as outputs and connect to the dendrites of adjoining neurons. The connection is chemical in nature, and the chemical concentration is how the weight is determined. Within the cell the inputs are weighted, that is, given weight according to their strength. If the combined weight is strong enough, the neuron “fires” and an output is triggered.

Brain – Neural Network Analogy

Neural Network, Under the hood ● Layer approach – Input layer – Hidden layer (or layers) – Output layer ● N1/N2/N3 notation gives the number of neurons in each layer

DeltaV Neural Three-Layered Network ● Only one “hidden” layer ● Only one output ● With enough hidden neurons, it can represent any continuous non-linear function ● Can track either a single continuous variable or a sampled variable (lab analysis)

Neural Networks – Hidden Layer ● 1 Hidden layer sufficient to model a continuous function of several variables ● Exception to the rule: Inverse action requires 2 layers

DeltaV Neural - Output ● DeltaV Neural Network is designed for one output ● Why? ● The sum (Σ) of errors will not properly distribute across more than one output

Network Structure ● Inputs and Scaling ● Synaptic Weights ● Neuron, Summation and Transfer Function ● Layer Concept, input, hidden and output ● Output scaling and the Outputs

Network Structure – Input/Output Scaling ● PV ranges must be normalized so each variable is presented to the network on a comparable scale ● Scaling around zero: Scaled PV = (PV – mean)/σ, where σ is the standard deviation
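The scaling formula above can be sketched in Python; `scale_pv` is a hypothetical helper for illustration, not part of DeltaV:

```python
import statistics

def scale_pv(values):
    # Scaled PV = (PV - mean) / standard deviation
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [(v - mean) / sigma for v in values]

# After scaling, every variable has mean 0 and standard deviation 1,
# so no input dominates the network purely because of its units
scaled = scale_pv([10.0, 12.0, 14.0, 16.0, 18.0])
```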

The Network is a Collection of Neurons and Weights ● Multi-layered feed-forward network (MFN) ● Bias neurons – Connected to each neuron except in the input layer – Provide a constant value or “bias” to the network

The Neuron
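The neuron combines a weighted sum of its inputs with the bias, then applies a transfer function. A minimal sketch, assuming a sigmoid transfer function (the slide does not specify which one DeltaV uses):

```python
import math

def neuron_output(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias neuron's contribution
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid transfer function squashes the sum into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))
```

With a zero net input the sigmoid sits at its midpoint, 0.5, which is why the bias effectively shifts each neuron's operating point.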

DeltaV - Building the Network ● Data Collection – The process uses data to design the network; good data is essential ● Data Preprocessing – Remove outliers and missing points (3-sigma rule) ● Variable and time delay selection – Determines which variables to use as well as the timing
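The 3-sigma preprocessing rule mentioned above might look like the following sketch (illustrative only; DeltaV's actual screening is not shown in the slides):

```python
import statistics

def remove_outliers(values, n_sigma=3.0):
    # Keep only points within n_sigma standard deviations of the mean
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)
    return [v for v in values if abs(v - mean) <= n_sigma * sigma]

# A run of normal process data with one bad lab entry
data = [0.0, 1.0, -1.0] * 10 + [50.0]
cleaned = remove_outliers(data)
```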

Building the Network, continued ● Network Training – Selects the number of hidden neurons and adjusts the weights on the conditioned training set, learning the data ● Network Verification – Checks how well the network behaves against actual data

DeltaV - Training the Network
● Divides the data into three sets: Training, Testing and Verification
● Presents training data to the network
● For each data set, presents the inputs and forward-propagates the training set through all layers to the output
● Compares the output to the target value and adjusts the weights based on the error (back-propagation)
● One training pass through all the data is called an epoch
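The three-way split can be sketched as follows; the 60/20/20 fractions are illustrative assumptions, not DeltaV's documented defaults:

```python
def split_data(samples, train_frac=0.6, test_frac=0.2):
    # Partition samples into training, testing and verification sets;
    # whatever remains after train and test becomes the verification set
    n_train = int(len(samples) * train_frac)
    n_test = int(len(samples) * test_frac)
    return (samples[:n_train],
            samples[n_train:n_train + n_test],
            samples[n_train + n_test:])

train, test, verify = split_data(list(range(10)))
```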

Training the Network
● Present the testing-set data to the network after the weights are adjusted in one epoch
● Propagate the test signals through the network to the output
● Compare the results; if the error is small, training is complete, otherwise repeat the training process

Training the Network ● Use a “balanced” design: many points over the total range of network inputs ● Consider using techniques employed in design of experiments (DOE) ● If most of the data is at one process point, the network will learn that point very well, and little else

Gradient Descent Learning ● Back-propagation adjusts the weights to reduce the error between the network output and the training set ● Error: E = ½ Σ_p Σ_i (d_p,i – y_p,i)², where p is the pattern index, i indexes the output nodes, d is the desired output and y is the actual output

Gradient Descent Learning The process is:
0. Set the weights to random values
1. For each sample point in the training set, calculate the output value and the error
2. Calculate the derivative ∂E/∂w
3. Adjust the weights to minimize the error
4. Go to step 1 until the error decreases below a predetermined value or the number of epochs exceeds a predetermined limit
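The steps above, applied to a single-weight linear neuron; a deliberately minimal sketch of the loop, not the multilayer back-propagation case:

```python
def gradient_descent_fit(samples, lr=0.1, max_epochs=1000, tol=1e-9):
    # Fit y = w * x to (x, y) pairs by gradient descent on squared error
    w = 0.5  # step 0: start from an arbitrary initial weight
    for _ in range(max_epochs):
        # step 1: total squared error over the training set
        error = sum((y - w * x) ** 2 for x, y in samples)
        if error < tol:
            break  # error is below the predetermined value
        # step 2: derivative dE/dw of the squared error
        grad = sum(-2.0 * x * (y - w * x) for x, y in samples)
        # step 3: adjust the weight to reduce the error
        w -= lr * grad
    return w

# Points on the line y = 2x, so the fitted weight should approach 2
w = gradient_descent_fit([(1.0, 2.0), (2.0, 4.0)])
```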

Gradient Descent Learning ● Two methods for weight update: batch mode and on-line mode ● Batch mode: the pattern partial errors are summed to obtain the total derivative, ∂E/∂w = Σ_p ∂E_p/∂w, and the weights are updated once per pass

Gradient Descent Learning ● On-line mode: weights are updated based on the partial derivative of the error with respect to each weight for one pattern (entry of test values) at a time ● This is implemented using a momentum term: momentum adds a portion of the previous weight change to the new change, Δw(t) = –η ∂E/∂w + α Δw(t–1), with 0 < α < 0.9
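A minimal sketch of one on-line update with momentum; the learning rate and momentum factor values here are illustrative assumptions:

```python
def momentum_step(grad, prev_delta, lr=0.1, alpha=0.8):
    # New weight change: a gradient step plus a fraction (alpha)
    # of the previous weight change
    return -lr * grad + alpha * prev_delta

delta = momentum_step(1.0, 0.5)
```

Note that even when the current gradient is zero, the momentum term carries part of the previous change forward, which is what smooths the weight trajectory.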

Conjugate Gradient Method ● DeltaV Neural uses Conjugate Gradient Method ● Uses previous gradients ● Adapts learning rate ● No need to specify momentum or learning rate factor

Training Criteria ● Predict, not memorize, the data presented ● DeltaV neural training software cross-validates the network against the test set to locate the least test error; it will not over- or under-train

Verification of NN Accuracy ● Compare predicted and actual values ● Verification should be done on a set not used for training and testing ● It is very important that the data points in the verification set fall within the range of values used to train and test the network; the training set must contain the minimum and maximum points!
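The range requirement above can be checked mechanically; a sketch using a hypothetical helper:

```python
def verification_in_range(train_values, verify_values):
    # Every verification point must lie within the training min/max,
    # since the network interpolates but cannot be trusted to extrapolate
    lo, hi = min(train_values), max(train_values)
    return all(lo <= v <= hi for v in verify_values)

ok = verification_in_range([0.0, 5.0, 10.0], [1.0, 9.0])
bad = verification_in_range([0.0, 5.0, 10.0], [12.0])
```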

DeltaV Neural ‘single-button’ Method ● All the tools needed to develop a network are built into the DeltaV engineer's workstation software

DeltaV Neural Network ● Function blocks, similar to AI, PID, etc ● Lab Entry block for entry of analytical data

DeltaV - Input Data for NN Training ● Process data and lab analyses are collected once the NN and Lab Entry blocks are downloaded ● The NN application uses the data collected by the DeltaV historian ● Legacy data, or data collected by another system, can be imported as a flat text file

DeltaV NN Block ● Can Access Data anywhere within the control system ● Maximum of 20 references (30 in final release?) ● 3 Modes – Auto: Prediction of output based on the input – Manual: OUT can be set manually – Out of Service: OUT is set to a Bad status, no calculations are executed

Neural Networks for Control ● Using neural networks for feedforward and decoupling of control interactions; an improvement to conventional PID control ● Using inverted networks for direct control, a non-PID method