
1 Neural Networks ● Introduction – What is a neural network and what can it do for me? ● Terminology, Design and Topology ● Data Sets – When too much is not a good thing ● The Back Propagation and other training methods ● Network Pruning ● Network Development – Hints and kinks, practical application notes ● Examples of Neural Networks ● Closed loop control techniques ● The Neural Network Laboratory – Introduction to DeltaV, HYSYS and the OLE connection; develop a soft sensor application

2 Neural Network Course Objectives ● Neural Network Fundamentals, Structure, Training, Testing ● How to use data to Train and Test the network ● Develop DeltaV Neural Networks ● Laboratory in DeltaV ● Design Neural Networks in MATLAB and EXCEL

3 Neural Networks, Heuristic ● Heuristic vs. deterministic model development: neural networks are heuristic models, similar to statistical models ● HYSYS, ASPEN, etc. are deterministic models, built from first principles

4 For DeltaV Neural Nets ● Understand the basic principles of the DeltaV Neural Network ● Construct DeltaV NNet and Lab Entry function blocks ● Configure the operator interface for these function blocks

5 DeltaV Neural ● DeltaV Neural is a complete package ● Network Development ● Training, Testing ● Operator Interface, both Neural Network and Lab Entry

6 DeltaV Neural Applications ● Critical process measurements available from grab samples, paper properties, food properties, etc. ● Backup or cross-check on a measurement made by a sampled or continuous analyzer, mass spec, stack gas analyzer – Hint: consider the "cost" of a sample analysis; can the neural network "fill in" for skipped samples?

7 DeltaV Neural Applications ● Virtual sensors: a neural network can be "trained" to calculate the laboratory results, a frequent application, a.k.a. "intelligent sensors" or "soft sensors", ISTK ● This information is used by operators to anticipate changes in plant operation ● The network can be developed in much less time than first-principles methods

8 Example, paper mill physical property

9 Predict the lab results

10 Just what is a neural network anyway? ● Neural networks are computer programs that model the functionality of the brain ● Multi-layered feed-forward model, with either single or multiple outputs ● Trained (called regression in statistics) by back propagation or other algorithms ● Non-linear regression

11 The Neuron ● There are estimates of as many as 100 billion neurons in the human brain. They are often less than 100 microns in diameter and have as many as 10,000 connections to other neurons. ● Each neuron has an axon that acts as a wire carrying its signal to other neurons. A neuron's input paths, called dendrites, gather information from these axons. The connection between a dendrite and an axon is called a synapse. Transmission of signals across the synapse is chemical in nature, and the magnitude of the signal depends on the amount of chemical (called a neurotransmitter) released by the axon. Many drugs work by changing those natural chemicals. The synapse, combined with the processing of information in the neuron, is how the brain's memory process functions.

12 The neuron is a nerve cell (diagram labels: dendrites, axon, synapses)

13 Brain – Neural Network Analogy ● Brain → neural network: neuron → cell; neural firing rate → activation; synaptic strength → connection weight; synapse → connection ● The cell body contains the nucleus. Dendrites are inputs that conduct impulses to the cell body. The axon conducts impulses away from the cell body. Synapses are gaps between neurons that act as outputs and connect closely to the dendrites of adjoining neurons. This connection is chemical in nature, and the chemical concentration is how the weight is determined. Within the cell the inputs are weighed, that is, each is given a weight according to its strength. If the combined weighted input is strong enough, the neuron "fires" and an output is triggered.

14 Brain – Neural Network Analogy

15 Neural Network, Under the Hood ● Layer approach – Input layer – Hidden layer (or layers) – Output layer ● N1/N2/N3 notation: the number of neurons in each layer (e.g. a 3/5/1 network has 3 inputs, 5 hidden neurons and 1 output)

16 DeltaV Neural Three-Layered Network ● Only one "hidden" layer ● Only one output ● With enough hidden neurons, a single hidden layer can represent any continuous non-linear function ● Can track either a single continuous variable or a sampled variable (lab analysis)

17 Neural Networks – Hidden Layer ● 1 Hidden layer sufficient to model a continuous function of several variables ● Exception to the rule: Inverse action requires 2 layers

18 DeltaV Neural - Output ● DeltaV Neural Network is designed for one output ● Why? ● The Σ (sum) of errors will not properly distribute with more than one output

19

20 Network Structure ● Inputs and Scaling ● Synaptic Weights ● Neuron, Summation and Transfer Function ● Layer Concept, input, hidden and output ● Output scaling and the Outputs

21 Network Structure – Input/Output Scaling ● PV ranges must be normalized so each variable has the same input factor for presentation to the network ● Scaling around zero: Scaled PV = (PV – mean)/σ, where σ is the standard deviation
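
As a hedged illustration of the scaling rule above (not the DeltaV code, and with made-up numbers), the same calculation in Python/NumPy:

```python
import numpy as np

def scale_pv(pv, mean, sigma):
    """Scale a process variable around zero: (PV - mean) / sigma."""
    return (pv - mean) / sigma

# Statistics are taken from historical/training data, then reused for every
# new sample presented to the network.
training_pv = np.array([101.2, 99.8, 100.5, 102.1, 98.9])
mean, sigma = training_pv.mean(), training_pv.std()
print(scale_pv(100.9, mean, sigma))
```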

22 The Network is a Collection of Neurons and Weights ● Multi-layered feed-forward network (MFN) ● Bias neurons – Connected to each neuron except the input layer – Provide a constant value or "bias" to the network
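
A minimal sketch, assuming a tanh transfer function and random illustrative weights, of how an MFN with bias terms computes a single output; the DeltaV internals may differ:

```python
import numpy as np

def forward(x, w_hidden, b_hidden, w_out, b_out):
    """Input layer -> hidden layer (weighted sum + bias, tanh) -> single linear output."""
    hidden = np.tanh(w_hidden @ x + b_hidden)
    return float(w_out @ hidden + b_out)

rng = np.random.default_rng(0)
n_inputs, n_hidden = 3, 5                      # a 3/5/1 network in N1/N2/N3 notation
w_hidden = rng.normal(size=(n_hidden, n_inputs))
b_hidden = rng.normal(size=n_hidden)           # bias neurons feeding the hidden layer
w_out = rng.normal(size=n_hidden)
b_out = rng.normal()                           # bias neuron feeding the output layer

x = np.array([0.2, -1.0, 0.7])                 # already-scaled inputs
print(forward(x, w_hidden, b_hidden, w_out, b_out))
```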

23 The Neuron

24

25 DeltaV - Building the Network ● Data collection – The process uses data to design the network; good data is essential ● Data preprocessing – Remove outliers and missing points (3-sigma rule) ● Variable and time delay selection – Determines which variables to use as well as their timing
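
One common reading of the 3-sigma rule, sketched in Python; the column layout and thresholds are illustrative assumptions, not the DeltaV preprocessing implementation:

```python
import numpy as np

def preprocess(data):
    """Drop rows with missing points, then rows more than 3 sigma from the column mean."""
    data = data[~np.isnan(data).any(axis=1)]           # remove rows with missing values
    mean, sigma = data.mean(axis=0), data.std(axis=0)
    keep = np.all(np.abs(data - mean) <= 3 * sigma, axis=1)
    return data[keep]                                   # remove outliers (3-sigma rule)

raw = np.array([[1.0, 2.0], [1.1, 2.1], [9.9, 2.0], [1.0, np.nan]])
print(preprocess(raw))
```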

26 Building the Network, continued ● Network training – Determines the number of hidden neurons and adjusts the weights on the conditioned training set, learning the data ● Network verification – Checks how well the network behaves against actual data

27 DeltaV - Training the Network ● Divides the data into three sets: training, testing and verification ● Presents the training data to the network ● For each training record, presents the inputs and forward-propagates them through all layers to the output ● Compares the output to the target value and adjusts the weights based on the error (back-propagation) ● One training pass through all the data is called an epoch
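
A sketch of the three-way split described above; the 60/20/20 proportions and the shuffling are illustrative assumptions, not necessarily how DeltaV divides the data:

```python
import numpy as np

def split_data(inputs, targets, train_frac=0.6, test_frac=0.2, seed=0):
    """Shuffle the records and divide them into training, testing and verification sets."""
    idx = np.random.default_rng(seed).permutation(len(inputs))
    n_train = int(train_frac * len(idx))
    n_test = int(test_frac * len(idx))
    tr, te, ve = np.split(idx, [n_train, n_train + n_test])
    return (inputs[tr], targets[tr]), (inputs[te], targets[te]), (inputs[ve], targets[ve])
```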

28 Training the Network ● Present the testing set to the network after the weights have been adjusted in an epoch ● Propagate the test signals through the network to the output ● Compare the results; if the error is small, training is complete, otherwise repeat the training process

29 Training the Network ● Use a "balanced" design, with many points over the total range of network inputs ● Consider using techniques employed in design of experiments (DOE) ● If most of the data is at one process point, the network will learn that point very well, and little else

30 Gradient Descent Learning ● Back propagation adjusts the weights to reduce the error between the network output and the training set ● In the error expression below, p is the pattern index, i is the output node index, d is the desired output and y is the actual output
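
The error expression itself did not survive the transcript; reconstructed here in the usual back-propagation form, using exactly the symbols defined above:

$$E = \frac{1}{2} \sum_{p} \sum_{i} \left( d_{pi} - y_{pi} \right)^{2}$$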

31 Gradient Descent Learning The process is: 0. Set the weights to random values 1. For each sample point in the training set, calculate the output value and the error 2. Calculate the derivative ∂E/∂w 3. Adjust the weights to minimize the error 4. Go to step 1 until the error decreases to the predetermined value or the number of epochs exceeds a predetermined limit
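
The loop above, sketched in Python for the simplest possible case (a single linear neuron trained in batch mode); the learning rate, tolerance and epoch limit are made-up values:

```python
import numpy as np

def gradient_descent(X, d, lr=0.05, tol=1e-4, max_epochs=5000, seed=0):
    """Steps 0-4 above for a single linear neuron: y = X @ w, E = 0.5 * sum((d - y)**2)."""
    w = np.random.default_rng(seed).normal(size=X.shape[1])  # step 0: random weights
    for epoch in range(max_epochs):
        y = X @ w                                            # step 1: outputs and error
        error = 0.5 * np.sum((d - y) ** 2)
        if error < tol:                                      # step 4: stop on small error
            break
        grad = -X.T @ (d - y)                                # step 2: dE/dw
        w -= lr * grad                                       # step 3: adjust the weights
    return w, error

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
d = X @ np.array([2.0, -1.0])              # targets generated from known weights
print(gradient_descent(X, d))              # recovers approximately [2.0, -1.0]
```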

32 Gradient Descent Learning Two methods for weight update: batch mode and on-line mode. Batch mode: the per-pattern partial errors are summed to obtain the total derivative, ∂E/∂w = Σp ∂Ep/∂w

33 Gradient Descent Learning ● On-line mode: weights are updated from the partial derivative of the error with respect to the weights for one pattern (one entry of test values). This is implemented with a momentum term: momentum adds a portion of the previous weight change to the new change, Δw(t) = −η ∂Ep/∂w + α Δw(t−1), with 0 < α < 0.9
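
A few lines showing the momentum update described above; the learning rate eta and the value of alpha are illustrative (the slide only bounds alpha between 0 and 0.9):

```python
def momentum_update(w, dE_dw, prev_delta, eta=0.1, alpha=0.8):
    """On-line update: new change = -eta * dE/dw + alpha * previous change."""
    delta = -eta * dE_dw + alpha * prev_delta
    return w + delta, delta   # keep delta to feed into the next pattern's update
```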

34 Conjugate Gradient Method ● DeltaV Neural uses Conjugate Gradient Method ● Uses previous gradients ● Adapts learning rate ● No need to specify momentum or learning rate factor

35 Training Criteria ● Predict, not memorize, the data presented ● DeltaV neural training software cross-validates the network against the test set to locate the least test error, so it will not over- or under-train
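
A sketch of the "locate least test error" idea: score the network on the held-out test set after every epoch and keep the weights with the lowest test error. Here train_one_epoch and test_error are caller-supplied placeholders, not DeltaV functions:

```python
import copy

def train_with_early_stopping(net, train_one_epoch, test_error, max_epochs=200):
    """Cross-validate against the test set; keep the weights with the least test error."""
    best_net, best_err = copy.deepcopy(net), float("inf")
    for _ in range(max_epochs):
        train_one_epoch(net)                 # adjust weights on the training set
        err = test_error(net)                # evaluate on the held-out test set
        if err < best_err:                   # remember the best (least) test error
            best_net, best_err = copy.deepcopy(net), err
    return best_net, best_err
```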

36 Verification of NN Accuracy ● Compare predicted and actual values ● Verification should be done on a set not used for training and testing ● It is very important that the data points in the verification data set lie within the range of values used to train and test the network. The training set must contain the minimum and maximum points!
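
A small companion check, with illustrative variable names: confirm every verification point lies inside the per-input minimum and maximum seen during training and testing:

```python
import numpy as np

def within_training_range(train_inputs, verify_inputs):
    """True only if each verification point is inside the training min/max for every input."""
    lo, hi = train_inputs.min(axis=0), train_inputs.max(axis=0)
    return bool(np.all((verify_inputs >= lo) & (verify_inputs <= hi)))
```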

37 DeltaV Neural 'single-button' Method ● All the tools to develop a network are built into the DeltaV engineering workstation software

38 DeltaV Neural Network ● Function blocks, similar to AI, PID, etc. ● Lab Entry block for entry of analytical data

39 DeltaV - Input Data for NN Training ● Process data and lab analyses are collected once the NN and Lab Entry blocks are downloaded ● The NN application uses the data collected by the DeltaV historian ● Legacy data or data collected by another system can be imported as a flat text file

40 DeltaV NN Block ● Can access data anywhere within the control system ● Maximum of 20 references (30 in final release?) ● 3 modes – Auto: prediction of the output based on the inputs – Manual: OUT can be set manually – Out of Service: OUT is set to a Bad status and no calculations are executed

41 Neural Networks for Control ● Using neural networks for feedforward and for decoupling control interactions; an improvement on conventional PID control ● Using inverted networks for direct control, a non-PID method

