Optimization with Neural Networks
Presented by: Mahmood Khademi, Babak Bashiri
Instructor: Dr. Bagheri
Sharif University of Technology, April 2007

Introduction

An optimization problem consists of two parts: a cost function and constraints.
- Constrained: the constraints are built into the cost function, so minimizing the cost function also satisfies the constraints.
- Unconstrained: there are no constraints on the problem.
- Combinatorial: the constraints and the cost function are kept separate; each is minimized and they are then added together.

Applications

Applications in many fields, such as:
- Routing in computer networks
- VLSI circuit design
- Planning in operational and logistics systems
- Power distribution systems
- Wireless and satellite communication systems

Basic idea

- Let x = (x_1, ..., x_n) be the decision variables, and suppose F(x) is our objective function.
- Constraints can be expressed as nonnegative penalty functions P_k(x) that vanish only when x represents a feasible solution.
- By combining the penalty functions with F, the original constrained problem may be reformulated as an unconstrained problem in which the goal is to minimize the quantity

  E(x) = F(x) + \sum_k c_k P_k(x),

  where the weights c_k > 0 set the relative importance of each penalty.
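
As a concrete illustration (not from the original slides), here is a minimal Python sketch of the penalty reformulation on a made-up toy problem: minimize F(x) = x^2 subject to x >= 1, with an arbitrary penalty weight c = 100.

```python
# Penalty reformulation E(x) = F(x) + c * P(x) on a toy problem:
# minimize F(x) = x^2 subject to x >= 1 (hypothetical example).

def F(x):
    return x ** 2

def P(x):
    # Nonnegative penalty: zero exactly when the constraint x >= 1 holds.
    return max(0.0, 1.0 - x) ** 2

def E(x, c=100.0):
    # Combined unconstrained objective; larger c enforces the constraint harder.
    return F(x) + c * P(x)

# Crude grid search over the unconstrained objective.
best_x = min((i / 1000.0 for i in range(-3000, 3001)), key=E)
print(best_x)  # close to the true constrained optimum x = 1
```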

TSP

- Simple to state but very difficult to solve.
- The problem is to find the shortest possible tour through a set of N vertices, visiting each vertex exactly once.
- The decision version of this problem is known to be NP-complete.
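
For very small instances the problem can be solved by brute force, which makes the difficulty concrete: the number of distinct tours grows factorially. A short sketch (not from the slides) with a made-up distance matrix:

```python
from itertools import permutations

# Brute-force TSP on a tiny instance: fix city 0 as the start and try all
# orderings of the remaining cities. D is a made-up symmetric distance matrix.
D = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]
N = len(D)

def tour_length(tour):
    # Sum of edge lengths, including the edge that closes the cycle.
    return sum(D[tour[i]][tour[(i + 1) % N]] for i in range(N))

best = min(((0,) + p for p in permutations(range(1, N))), key=tour_length)
print(best, tour_length(best))  # -> (0, 1, 3, 2) with length 18
```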

Why neural networks?

Drawbacks of conventional computing systems:
- They perform poorly on complex problems.
- They lack the computational power.
- They do not utilize the inherent parallelism of problems.

Advantages of artificial neural networks:
- They perform well even on complex problems.
- They have very fast computational cycles if implemented in hardware.
- They can take advantage of the inherent parallelism of problems.

Some efforts to solve optimization problems

- Many ANN algorithms with different architectures have been used to solve different optimization problems.
- We've selected three of them:
  - The Hopfield NN
  - The Elastic Net
  - The Self-Organizing Map (SOM) NN

Hopfield-Tank model

- The TSP must be mapped, in some way, onto the neural network structure.
- In the Hopfield-Tank representation, each row of units corresponds to a particular city and each column to a particular position in the tour.

Mapping the TSP to a Hopfield neural net

- There is a connection between each pair of units.
- The signal sent along the connection from unit i to unit j equals the weight T_{ij} if unit i is activated, and 0 otherwise.
- A negative weight defines an inhibitory connection between the two units: it is unlikely that two units joined by a negative weight will be active ("on") at the same time.

Discrete Hopfield Model

- The connection weights are not learned; they are fixed in advance.
- The Hopfield network evolves by updating the activation of each unit in turn; the units are updated at random, one unit at a time.
- In the final state, all units are stable according to the update rule: V_i is set to 1 if \sum_j T_{ij} V_j \ge \theta_i, and to 0 otherwise.

Notation: V_i is the activation level of unit i, for i = 1, ..., L (L is the number of units); T_{ij} is the connection weight between units i and j; \theta_i is the threshold of unit i.

Discrete Hopfield Model (Cont.)

- Energy function:

  E = -\frac{1}{2} \sum_i \sum_j T_{ij} V_i V_j + \sum_i \theta_i V_i

- A unit changes its activation level if and only if the energy of the network decreases by doing so.
- Since the energy can only decrease over time and the number of configurations is finite, the network must converge (but not necessarily to the minimum-energy state).
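
A minimal sketch (not from the slides) of the asynchronous dynamics and the energy bookkeeping; the weights, thresholds, and initial state below are arbitrary placeholders:

```python
import random

# Asynchronous discrete Hopfield network. T is symmetric with zero diagonal,
# which is what the energy-descent argument requires.
T = [[0, -1, 2],
     [-1, 0, 1],
     [2, 1, 0]]           # placeholder weights
theta = [0.0, 0.5, -0.5]  # placeholder thresholds
V = [1, 0, 1]             # arbitrary initial activations

def energy(V):
    L = len(V)
    return (-0.5 * sum(T[i][j] * V[i] * V[j] for i in range(L) for j in range(L))
            + sum(theta[i] * V[i] for i in range(L)))

for step in range(100):
    i = random.randrange(len(V))                  # pick one unit at random
    net = sum(T[i][j] * V[j] for j in range(len(V)))
    V[i] = 1 if net >= theta[i] else 0            # threshold update rule
    # energy(V) never increases across these updates

print(V, energy(V))
```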

Continuous Hopfield-Tank

- The neuron activation function is continuous (a sigmoid).
- The evolution of the units over time is now characterized by the following differential equation:

  \frac{dU_i}{dt} = -\frac{U_i}{\tau} + \sum_j T_{ij} V_j + I_i, \qquad V_i = g(U_i) = \frac{1}{2}\left(1 + \tanh\frac{U_i}{u_0}\right)

  where U_i, I_i, and V_i are the input, input bias, and activation level of unit i, respectively (\tau is a time constant and u_0 sets the gain of the sigmoid).

Continuous Hopfield-Tank (Cont.)

- Energy function (in the high-gain limit):

  E = -\frac{1}{2} \sum_i \sum_j T_{ij} V_i V_j - \sum_i I_i V_i

- A discrete-time approximation is applied to the equations of motion for simulation:

  U_i(t + \Delta t) = U_i(t) + \Delta t \left( -\frac{U_i(t)}{\tau} + \sum_j T_{ij} V_j(t) + I_i \right)
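
A runnable sketch (not from the slides) of this discrete-time simulation for two mutually inhibiting units; T, I, tau, u0, and dt are illustrative placeholders:

```python
import math

# Euler simulation of the continuous Hopfield-Tank dynamics for two units
# with mutual inhibition; all parameter values here are made up.
T = [[0.0, -1.0], [-1.0, 0.0]]
I = [0.5, 0.5]
tau, u0, dt = 1.0, 0.02, 0.01
U = [0.01, -0.01]          # slightly asymmetric initial inputs

def g(u):
    # Sigmoid activation mapping the internal input u into (0, 1).
    return 0.5 * (1.0 + math.tanh(u / u0))

for _ in range(1000):
    V = [g(u) for u in U]
    dU = [-U[i] / tau + sum(T[i][j] * V[j] for j in range(len(U))) + I[i]
          for i in range(len(U))]
    U = [U[i] + dt * dU[i] for i in range(len(U))]

print([round(g(u), 3) for u in U])  # the slightly-favored unit wins (~1 vs ~0)
```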

Application of the Hopfield-Tank Model to the TSP

(1) The TSP is represented as an N x N matrix of units: rows index cities, columns index tour positions.
(2) An energy function is written whose minima correspond to short, feasible tours.
(3) The bias and connection weights are derived from that energy function.
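
For reference, the canonical Hopfield-Tank (1985) energy function for this representation, consistent with the parameter values A = B = 500, C = 200 quoted in the results below, is

E = \frac{A}{2} \sum_X \sum_i \sum_{j \ne i} V_{Xi} V_{Xj}
  + \frac{B}{2} \sum_i \sum_X \sum_{Y \ne X} V_{Xi} V_{Yi}
  + \frac{C}{2} \Big( \sum_X \sum_i V_{Xi} - N \Big)^2
  + \frac{D}{2} \sum_X \sum_{Y \ne X} \sum_i d_{XY} V_{Xi} \left( V_{Y,i+1} + V_{Y,i-1} \right)

where X and Y index cities, i and j index tour positions, and d_{XY} is the distance between cities X and Y. With N equal to the number of cities, the A, B, and C terms vanish exactly on feasible tours, while the D term is proportional to the tour length.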

Application of the Hopfield-Tank model to the TSP (Cont.)
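
Matching the TSP energy above against the generic Hopfield energy, term by term, gives the weights and biases: T_{Xi,Yj} = -A \delta_{XY}(1 - \delta_{ij}) - B \delta_{ij}(1 - \delta_{XY}) - C - D d_{XY}(\delta_{j,i+1} + \delta_{j,i-1}) and I_{Xi} = C N. A Python sketch of this construction (not from the slides; the distance matrix is assumed to have a zero diagonal):

```python
import numpy as np

# Build the Hopfield-Tank connection weights T and biases I for the TSP.
# Unit (X, i) means "city X occupies tour position i", flattened to X * n + i.
def build_tsp_network(d, A=500.0, B=500.0, C=200.0, D=500.0, N=None):
    n = len(d)
    if N is None:
        N = n  # Hopfield and Tank used a slightly larger target (N = 15 for 10 cities)
    T = np.zeros((n * n, n * n))
    for X in range(n):
        for i in range(n):
            for Y in range(n):
                for j in range(n):
                    T[X * n + i, Y * n + j] = (
                        -A * (X == Y and i != j)    # one tour position per city
                        - B * (i == j and X != Y)   # one city per tour position
                        - C                         # keeps total activity near N
                        - D * d[X][Y] * (j == (i + 1) % n or j == (i - 1) % n)
                    )                               # tour-length term
    I = np.full(n * n, C * N)                       # bias derived from the C term
    return T, I
```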

Results of Hopfield-Tank

- Hopfield and Tank were able to solve a randomly generated 10-city problem with the parameter values A = B = 500, C = 200, N = 15.
- They reported that over 20 trials the network converged 16 times to feasible tours.
- Half of those tours were one of the two optimal tours.

[Figure: the size of each black square indicates the value of the output of the corresponding neuron.]

The main weaknesses of the original Hopfield-Tank model

(d) The model is plagued by the limitations of "hill-climbing" approaches: it can become trapped in local minima.
(e) The model does not guarantee feasibility of the resulting tour.

The positive points:
- It can easily be implemented in hardware.
- It can be applied to non-Euclidean TSPs.

Elastic net (Willshaw-Von der Malsburg)

Elastic net

- A ring of points (an "elastic band") in the plane is gradually stretched until it passes close to every city.
- Each point on the ring is pulled both toward nearby cities and toward its neighboring points on the ring.

Energy function for Elastic net
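
The standard elastic net energy, as introduced by Durbin and Willshaw, is

E = -\alpha K \sum_i \ln \sum_j \exp\!\left( -\frac{\lVert x_i - y_j \rVert^2}{2K^2} \right) + \beta K \sum_j \lVert y_{j+1} - y_j \rVert^2

where the x_i are the city positions, the y_j are the points on the ring, K is a scale parameter that is gradually reduced during the run, and \alpha, \beta trade off matching the cities against keeping the ring short.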

The self-organizing map

- SOMs are instances of "competitive" neural networks, used by unsupervised learning systems to classify data.
- Learning proceeds by adjusting the weights.
- The SOM is related to the elastic net, but also differs from it.

Competitive Network

- Groups a set of M I-dimensional input patterns into K clusters (K <= M).
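
A minimal sketch (not from the slides) of winner-take-all competitive learning; the toy data and learning rate are made up:

```python
import random

# Winner-take-all competitive learning: K weight vectors compete for each
# input pattern, and only the closest (winning) vector moves toward it.
random.seed(0)
data = [(0.1, 0.2), (0.15, 0.1), (0.9, 0.8), (0.85, 0.95)]  # toy 2-D patterns
K, eta = 2, 0.2                                             # clusters, learning rate
W = [[random.random(), random.random()] for _ in range(K)]

def dist2(w, x):
    return sum((wi - xi) ** 2 for wi, xi in zip(w, x))

for epoch in range(50):
    for x in data:
        k = min(range(K), key=lambda k: dist2(W[k], x))           # the winner
        W[k] = [wi + eta * (xi - wi) for wi, xi in zip(W[k], x)]  # move it toward x

print([[round(v, 2) for v in w] for w in W])  # ends near the two cluster centers
```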

SOM in the TSP context

- A set of 2-dimensional city coordinates must be mapped onto a set of 1-dimensional positions in the tour.

SOM in the TSP context (Cont.)
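
A sketch (not from the slides) of a SOM applied to the TSP: the units form a ring in the plane; for each city, the nearest unit and, more weakly, its ring neighbors move toward that city, while the neighborhood shrinks over time. All parameter values are illustrative placeholders:

```python
import math, random

# SOM for the TSP: a ring of units is deformed toward the cities; the
# neighborhood width and learning rate decay over time.
random.seed(1)
cities = [(random.random(), random.random()) for _ in range(10)]
M = 30                                      # number of ring units
ring = [(0.5 + 0.3 * math.cos(2 * math.pi * k / M),
         0.5 + 0.3 * math.sin(2 * math.pi * k / M)) for k in range(M)]

def nearest_unit(cx, cy):
    return min(range(M), key=lambda k: (ring[k][0] - cx) ** 2 + (ring[k][1] - cy) ** 2)

for t in range(200):
    sigma = 5.0 * (0.97 ** t)               # shrinking neighborhood width
    eta = 0.8 * (0.99 ** t)                 # decaying learning rate
    for cx, cy in cities:
        w = nearest_unit(cx, cy)
        for k in range(M):
            d = min(abs(k - w), M - abs(k - w))           # distance along the ring
            h = math.exp(-(d * d) / (2 * sigma * sigma))  # neighborhood function
            x, y = ring[k]
            ring[k] = (x + eta * h * (cx - x), y + eta * h * (cy - y))

# Read off the tour: visit the cities in the order of their nearest ring units.
tour = sorted(range(len(cities)), key=lambda c: nearest_unit(*cities[c]))
print(tour)
```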

Different SOMs based on this form

- Fort increased the speed of convergence by shrinking the neighborhood and reducing the modifications to the weights of neighboring units over time.
- The work of Angeniol et al.

Questions?