Cascade Correlation Architecture and Learning Algorithm for Neural Networks.


Outline
- What is Cascade Correlation?
- NN Terminology
- CC Architecture and Learning Algorithm
- Advantages of CC
- References

What is Cascade Correlation?
- Cascade-Correlation (CC) is both an architecture and a generative, feed-forward, supervised learning algorithm for artificial neural networks.
- Cascade-Correlation begins with a minimal network, then automatically trains and adds new hidden units one by one, creating a multi-layer structure.

NN Terminology  An artificial neural network (ANN) is composed of units and connections between the units. Units in ANNs can be seen as analogous to neurons or perhaps groups of neurons.  Connection weights determine an organizational topology for a network and allow units to send activation to each other.  Input units code the problem being presented to the network.  Output units code the network’s response to the input problem.

NN Terminology  Hidden units perform essential intermediate computations.  Input function is a linear component which computes the weighted sum of the units input values.  Activation function is a non-linear component which transforms the weighted sum in to final output value  In cascade-correlation, there are cross-connections that bypass hidden units.

CC Architecture and Learning Algorithm
Cascade-Correlation (CC) combines two ideas. The first is the cascade architecture, in which hidden units are added only one at a time and do not change after they have been added. The second is the learning algorithm, which creates and installs the new hidden units: for each new hidden unit, the algorithm tries to maximize the magnitude of the correlation between the new unit's output and the residual error signal of the network.
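
The score that candidate training maximizes can be stated precisely. In Fahlman and Lebiere's formulation it is the magnitude of the covariance between the candidate's output and the residual errors, summed over all output units:

S = \sum_{o} \left| \sum_{p} \left( V_p - \bar{V} \right) \left( E_{p,o} - \bar{E}_o \right) \right|

where V_p is the candidate's activation on training pattern p, E_{p,o} is the residual error at output unit o for pattern p, and the bars denote averages over all training patterns.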

The Algorithm
1. CC starts with a minimal network consisting only of an input layer and an output layer; the two layers are fully connected.
2. Train all the connections ending at an output unit with an ordinary learning algorithm until the error of the net no longer decreases.
3. Generate the so-called candidate units. Every candidate unit is connected to all input units and to all existing hidden units; there are initially no connections between the pool of candidate units and the output units.

The Algorithm
4. Try to maximize the correlation between the activation of the candidate units and the residual error of the net by training all the links leading into a candidate unit. Learning takes place with an ordinary learning algorithm; training stops when the correlation score no longer improves.
5. Choose the candidate unit with the maximum correlation, freeze its incoming weights, and add it to the net.

The Algorithm
6. To change the candidate unit into a hidden unit, generate links between the selected unit and all the output units. Since the weights leading into the new hidden unit are frozen, a new permanent feature detector is obtained. Loop back to step 2.
7. Repeat until the overall error of the net falls below a given value.
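
Putting steps 1-7 together, the control flow can be sketched in code. The following is a minimal NumPy illustration, not a faithful reimplementation: it trains the output weights by plain gradient descent (the original paper uses Quickprop), trains a single candidate per round against the covariance score S above, and freezes each candidate's incoming weights when it is installed. All names and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def hidden_activations(X, hidden):
    """Activations of the frozen hidden units, one column per unit.
    Each unit sees the raw inputs, every earlier hidden unit, and a bias
    (the cascade architecture)."""
    acts = []
    for w in hidden:
        inp = np.hstack([X] + [a[:, None] for a in acts] + [np.ones((len(X), 1))])
        acts.append(sigmoid(inp @ w))
    return np.column_stack(acts) if acts else np.empty((len(X), 0))

def train_output(F, y, epochs=3000, lr=0.5):
    """Step 2: train a linear output unit on squared error by gradient descent."""
    w = rng.normal(0, 0.1, F.shape[1])
    for _ in range(epochs):
        w -= lr * F.T @ (F @ w - y) / len(y)
    return w

def train_candidate(F, resid, epochs=3000, lr=0.5):
    """Step 4: gradient-ascend |S|, the covariance between the candidate's
    activation and the residual error."""
    w = rng.normal(0, 1.0, F.shape[1])
    for _ in range(epochs):
        v = sigmoid(F @ w)
        ec = resid - resid.mean()          # centered residual error
        s = np.sign((v - v.mean()) @ ec)   # sign of S, for the |.| gradient
        w += lr * F.T @ (s * ec * v * (1 - v)) / len(resid)
    return w

# Toy problem: XOR, which the initial net (no hidden units) cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

hidden = []                                # frozen hidden-unit weight vectors
for _ in range(8):                         # cap on installed units
    F = np.hstack([X, hidden_activations(X, hidden), np.ones((len(X), 1))])
    w_out = train_output(F, y)             # step 2
    resid = y - F @ w_out                  # residual error of the net
    if np.mean(resid ** 2) < 1e-3:         # step 7: stop when error is small
        break
    hidden.append(train_candidate(F, resid))  # steps 4-6: train, freeze, install

print("hidden units:", len(hidden))
print("predictions :", np.round(F @ w_out, 2))
```

Because an installed unit's incoming weights never change, its activations over the whole training set can be computed once and cached, which is one reason CC trains quickly.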

[Figure] A neural network trained with the Cascade-Correlation algorithm.

Advantages of CC
- It learns at least 10 times faster than standard back-propagation.
- The network determines its own size and topology.
- It is useful for incremental learning, in which new information is added to an already-trained network.

References
- Scott Fahlman and Christian Lebiere. The Cascade-Correlation Learning Architecture.
- Thomas R. Shultz. A Tutorial on Cascade-Correlation.