Neural Network to Solve the Traveling Salesman Problem
Amit Goyal (01005009), Koustubh Vachhani (01005021), Ankur Jain (01D05007)

Roadmap
Hopfield Neural Network
Solving TSP using the Hopfield Network
Modification of the Hopfield Neural Network
Solving TSP using the Concurrent Neural Network
Comparison between Neural Network and SOM for solving TSP

Background: Neural Networks
A computing device composed of processing elements called neurons
Processing power comes from the interconnections between neurons
Various models: Hopfield, Backpropagation, Perceptron, Kohonen Net, etc.

Associative Memory
Produces, for any input pattern, a similar stored pattern
Retrieval is possible from part of the data
Noisy input can also be recognized
(Figure: original, degraded, and reconstructed patterns)

Hopfield Network
Recurrent network: feedback from output to input
Fully connected: every neuron connected to every other neuron

Hopfield Network
Symmetric connections: the weights from unit i to unit j and from unit j to unit i are identical for all i and j
No self-connections, so the weight matrix is zero-diagonal and symmetric
Logic levels are +1 and -1

Computation
For any neuron i, the input at instant t is $\sum_{j \ne i} w_{ij}\,\sigma_j(t)$, where $\sigma_j(t)$ is the activation of the j-th neuron
Threshold function: $\theta = 0$
Activation: $\sigma_i(t+1) = \mathrm{sgn}\!\left(\sum_{j \ne i} w_{ij}\,\sigma_j(t)\right)$, where $\mathrm{sgn}(x) = +1$ for $x > 0$ and $\mathrm{sgn}(x) = -1$ for $x < 0$
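A minimal NumPy sketch of this update rule. The asynchronous driver loop and the choice to keep the current activation when the net input is exactly zero are assumptions, not spelled out on the slide:

```python
import numpy as np

def update_neuron(weights, state, i):
    """sigma_i(t+1) = sgn(sum_{j != i} w_ij * sigma_j(t)), threshold 0."""
    net = weights[i] @ state          # w_ii = 0, so the self term contributes nothing
    if net > 0:
        return 1
    if net < 0:
        return -1
    return state[i]                   # net input exactly 0: keep the current activation

def run_async(weights, state, steps=200, seed=0):
    """Simple asynchronous mode: one randomly chosen unit is updated per step."""
    rng = np.random.default_rng(seed)
    state = np.array(state)
    for _ in range(steps):
        i = rng.integers(len(state))
        state[i] = update_neuron(weights, state, i)
    return state
```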

Modes of Operation
Synchronous: all neurons are updated simultaneously
Asynchronous:
Simple: only one unit, selected at random, is updated at each step
General: neurons update themselves independently and randomly, based on a probability distribution over time

Stability
The issue of stability arises because there is feedback in a Hopfield network
The dynamics may lead to a fixed point, a limit cycle, or chaos
Fixed point: a unique point attractor
Limit cycle: the state space repeats itself in periodic cycles
Chaotic: an aperiodic strange attractor

Procedure
Store and stabilize the vector that has to be part of the memory.
Find values of the weights $w_{ij}$, for all i, j, such that the stored vector $\sigma = (\sigma_1, \ldots, \sigma_N)$ is stable in a Hopfield network of N neurons.

Weight Learning
The weights are given by $w_{ij} = \frac{1}{N-1}\,\sigma_i \sigma_j$
$\frac{1}{N-1}$ is a normalizing factor
The term $\sigma_i \sigma_j$ derives from Hebb's rule: if two connected neurons are both ON, the connection weight is such that mutual excitation is sustained; similarly, if two neurons inhibit each other, the connection sustains the mutual inhibition.

Multiple Vectors
If multiple vectors $\sigma^1, \sigma^2, \ldots, \sigma^p$ need to be stored in memory, the weights are given by:
$w_{ij} = \frac{1}{N-1}\sum_{m=1}^{p} \sigma_i^m \sigma_j^m$
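A hedged NumPy sketch of this storage rule (pattern entries are assumed to be ±1; the zeroed diagonal reflects the no-self-connection constraint from the earlier slide):

```python
import numpy as np

def store_patterns(patterns):
    """Hebbian weights w_ij = 1/(N-1) * sum_m sigma_i^m * sigma_j^m, with w_ii = 0."""
    patterns = np.asarray(patterns, dtype=float)   # shape (p, N), entries are +1 / -1
    n = patterns.shape[1]
    weights = patterns.T @ patterns / (n - 1)      # sum of outer products over the p patterns
    np.fill_diagonal(weights, 0.0)                 # no self-connections
    return weights
```

Feeding a corrupted copy of a stored pattern into the asynchronous update sketched earlier should settle back onto the stored pattern, which is the associative-memory behaviour described above (provided the number of patterns stays well below the network's capacity).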

Energy
An energy value is associated with each state of the system.
Some patterns need to be made stable; these correspond to the minimum-energy states of the system.

Energy Function
Energy at state $\sigma'$: $E(\sigma') = -\tfrac{1}{2}\sum_i \sum_{j \ne i} w_{ij}\,\sigma_i \sigma_j$
Let the p-th neuron change its state from $\sigma_p^{initial}$ to $\sigma_p^{final}$, so that
$E_{initial} = -\tfrac{1}{2}\sum_{j \ne p} w_{pj}\,\sigma_p^{initial}\sigma_j + T$
$E_{final} = -\tfrac{1}{2}\sum_{j \ne p} w_{pj}\,\sigma_p^{final}\sigma_j + T$
$\Delta E = E_{final} - E_{initial}$, where T is the part of the energy independent of $\sigma_p$

Continued…
$\Delta E = -\tfrac{1}{2}\,(\sigma_p^{final} - \sigma_p^{initial})\sum_{j \ne p} w_{pj}\,\sigma_j$, i.e. $\Delta E = -\tfrac{1}{2}\,\Delta\sigma_p \sum_{j \ne p} w_{pj}\,\sigma_j$
Thus $\Delta E = -\tfrac{1}{2}\,\Delta\sigma_p \times \mathrm{netinput}_p$
If p changes from +1 to -1, then $\Delta\sigma_p$ is negative and $\mathrm{netinput}_p$ is negative, and vice versa; in either case the product $\Delta\sigma_p \times \mathrm{netinput}_p$ is positive.
So $\Delta E$ is always negative: the energy always decreases when a neuron changes state.
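An illustrative numerical check of this argument (the random symmetric weights and the tolerance are assumptions; the update follows the sgn rule from the earlier slide):

```python
import numpy as np

def energy(weights, state):
    """E = -1/2 * sum_{i,j} w_ij sigma_i sigma_j (w_ii = 0, so i = j contributes nothing)."""
    return -0.5 * state @ weights @ state

rng = np.random.default_rng(0)
n = 8
w = rng.normal(size=(n, n))
w = (w + w.T) / 2                # symmetric connections
np.fill_diagonal(w, 0.0)         # no self-connections
state = rng.choice([-1, 1], size=n)

for _ in range(50):
    i = rng.integers(n)
    e_before = energy(w, state)
    net = w[i] @ state
    state[i] = 1 if net > 0 else (-1 if net < 0 else state[i])
    assert energy(w, state) <= e_before + 1e-12   # energy never increases
```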

Applications of Hopfield Nets
Hopfield nets can be applied to optimization problems, which maximize or minimize an objective function.
In a Hopfield network, it is the energy function that gets minimized.

Traveling Salesman Problem
Given a set of cities and the distances between them, determine the shortest closed path passing through all the cities exactly once.
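For very small instances this definition can be turned directly into an exhaustive search; the sketch below (illustrative, not part of the original slides) enumerates all (N-1)! tours from a fixed start city:

```python
from itertools import permutations

def tour_length(order, dist):
    """Length of the closed tour visiting cities in the given order."""
    return sum(dist[order[i]][order[(i + 1) % len(order)]] for i in range(len(order)))

def brute_force_tsp(dist):
    """Exact solution by enumerating (N-1)! tours; only feasible for small N."""
    n = len(dist)
    best = min(permutations(range(1, n)),
               key=lambda rest: tour_length((0,) + rest, dist))
    return (0,) + best, tour_length((0,) + best, dist)
```

The factorial blow-up of this search is what motivates heuristic approaches such as the networks discussed next.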

Traveling Salesman Problem
One of the classic and most highly researched problems in computer science.
The decision problem "Is there a tour with length less than k?" is NP-complete.
The optimization problem "What is the shortest tour?" is NP-hard.

Hopfield Net for TSP
N cities are represented by an N x N matrix of neurons
Each row has exactly one 1, each column has exactly one 1, and the matrix has exactly N 1's
$\sigma_{kj} = 1$ if city k is in position j, $\sigma_{kj} = 0$ otherwise

Hopfield Net for TSP
For each element of the matrix, take a neuron and fully connect the assembly with symmetric weights
The remaining task is to find a suitable energy function E

Determination of the Energy Function
The energy function for TSP has four components, satisfying four constraints.
Each city can have no more than one position, i.e. each row can have no more than one activated neuron:
$E_1 = \frac{A}{2}\sum_k \sum_i \sum_{j \ne i} \sigma_{ki}\,\sigma_{kj}$, with A a constant

Energy Function (contd.)
Each position contains no more than one city, i.e. each column contains no more than one activated neuron:
$E_2 = \frac{B}{2}\sum_j \sum_k \sum_{r \ne k} \sigma_{kj}\,\sigma_{rj}$, with B a constant

Energy Function (contd.)
There are exactly N entries in the output matrix, i.e. there are N 1's in the output matrix:
$E_3 = \frac{C}{2}\left(N - \sum_k \sum_i \sigma_{ki}\right)^2$, with C a constant

Energy Function (contd.)
The fourth term incorporates the requirement of the shortest path:
$E_4 = \frac{D}{2}\sum_k \sum_{r \ne k} \sum_j d_{kr}\,\sigma_{kj}\left(\sigma_{r(j+1)} + \sigma_{r(j-1)}\right)$, where $d_{kr}$ is the distance between city k and city r
$E_{total} = E_1 + E_2 + E_3 + E_4$
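A hedged NumPy sketch of these four terms for a given 0/1 matrix sigma (rows index cities, columns index positions). The default constants A, B, C, D are placeholder values, and positions are treated cyclically so that position N+1 wraps to position 1:

```python
import numpy as np

def tsp_energy(sigma, dist, A=500.0, B=500.0, C=200.0, D=500.0):
    """E_total = E1 + E2 + E3 + E4 for an N x N numpy array sigma and distance matrix dist."""
    n = sigma.shape[0]
    # E1: each city (row) should hold at most one 1
    e1 = A / 2 * sum(sigma[k, i] * sigma[k, j]
                     for k in range(n) for i in range(n) for j in range(n) if j != i)
    # E2: each position (column) should hold at most one 1
    e2 = B / 2 * sum(sigma[k, j] * sigma[r, j]
                     for j in range(n) for k in range(n) for r in range(n) if r != k)
    # E3: the matrix should contain exactly N ones
    e3 = C / 2 * (n - sigma.sum()) ** 2
    # E4: tour length, using the cities in the neighbouring positions j+1 and j-1
    e4 = D / 2 * sum(dist[k, r] * sigma[k, j] * (sigma[r, (j + 1) % n] + sigma[r, (j - 1) % n])
                     for k in range(n) for r in range(n) if r != k for j in range(n))
    return e1 + e2 + e3 + e4
```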

Energy Function (cont..)  Energy equation is also given by E= -½Σ ki Σ rj w (ki)(rj) σ ki σ rj σ ki – City k at position i σ rj – City r at position j  Output function σ ki σ ki = ½ ( 1 + tanh(u ki /u 0 )) u 0 is a constant u ki is the net input

Weight Values
Comparing the above with the energy equation obtained previously gives
$W_{(ki)(rj)} = -A\,\delta_{kr}(1 - \delta_{ij}) - B\,\delta_{ij}(1 - \delta_{kr}) - C - D\,d_{kr}\left(\delta_{j(i+1)} + \delta_{j(i-1)}\right)$
Kronecker symbol: $\delta_{kr} = 1$ when k = r, $\delta_{kr} = 0$ when k ≠ r
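The connection weights can be written down directly from this formula; the sketch below is a straightforward (unoptimized) transcription, with the Kronecker deltas expressed as equality tests, cyclic position indices, and the same placeholder constants as before:

```python
import numpy as np

def tsp_weights(dist, A=500.0, B=500.0, C=200.0, D=500.0):
    """W[(k,i),(r,j)] = -A*delta_kr*(1-delta_ij) - B*delta_ij*(1-delta_kr) - C
                        - D*d_kr*(delta_{j,i+1} + delta_{j,i-1})."""
    n = dist.shape[0]
    w = np.zeros((n, n, n, n))
    for k in range(n):
        for i in range(n):
            for r in range(n):
                for j in range(n):
                    d_kr = 1.0 if k == r else 0.0
                    d_ij = 1.0 if i == j else 0.0
                    succ = 1.0 if j == (i + 1) % n else 0.0   # delta_{j, i+1}
                    prev = 1.0 if j == (i - 1) % n else 0.0   # delta_{j, i-1}
                    w[k, i, r, j] = (-A * d_kr * (1 - d_ij)
                                     - B * d_ij * (1 - d_kr)
                                     - C
                                     - D * dist[k, r] * (succ + prev))
    return w
```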

Observations
The choice of constants A, B, C, and D that provides a good solution varies between two extremes:
Always obtaining legitimate tours (D small relative to A, B, and C)
Giving heavier weight to the distances (D large relative to A, B, and C)

Observations (contd.)
Local minima: the energy function is full of dips, valleys, and local minima
Speed: fast, due to the rapid computational capacity of the network

Concurrent Neural Network
Proposed by N. Toomarian in 1988
Requires N log(N) neurons to compute a TSP of N cities
Has a much higher probability of reaching a valid tour

Objective Function
The aim is to minimize the distance between city k at position i and city r at position i+1:
$E_i = \sum_{k \ne r}\sum_r \sum_i \delta_{ki}\,\delta_{r(i+1)}\,d_{kr}$, where $\delta$ is the Kronecker symbol

Continued…
$E_i = \frac{1}{N^2}\sum_{k \ne r}\sum_r \sum_i d_{kr} \prod_{i=1}^{\ln N}\left[1 + (2\nu_i - 1)\,\sigma_{ki}\right]\left[1 + (2\mu_i - 1)\,\sigma_{ri}\right]$
where $(2\mu_i - 1) = (2\nu_i - 1)\left[1 - \prod_{j=1}^{i-1}\nu_i\right]$
Also, to ensure that two cities don't occupy the same position: $E_{error} = \sum_{k \ne r}\sum_r \delta_{kr}$

Solution
$E_{error}$ has the value 0 for any valid tour, so we have a constrained optimization problem to solve:
$E = E_i + \lambda E_{error}$
$\lambda$ is a Lagrange multiplier to be calculated from the solution.

Minimization of the Energy Function
The energy function is minimized with respect to the $\sigma_{ki}$.
The algorithm is an iterative procedure of the kind usually used for minimizing quadratic functions.
The iteration steps are carried out in the direction of steepest descent with respect to the energy function E.

Minimization of the Energy Function
Differentiating the energy:
$\frac{dU_{ki}}{dt} = -\frac{\partial E}{\partial \sigma_{ki}} = -\frac{\partial E_i}{\partial \sigma_{ki}} - \lambda\,\frac{\partial E_{error}}{\partial \sigma_{ki}}$
$\frac{d\lambda}{dt} = \pm\frac{\partial E}{\partial \lambda} = \pm E_{error}$
$\sigma_{ki} = \tanh(\alpha U_{ki})$, with $\alpha$ a constant

Implementation
The initial input matrix and the value of $\lambda$ are randomly selected and specified.
At each iteration, new values of $\sigma_{ki}$ and $\lambda$ are calculated in the direction of steepest descent of the energy function.
Iterations stop either when convergence is achieved or when the number of iterations exceeds a user-specified limit.
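A schematic of this iterative procedure in Python/NumPy. This is a generic steepest-descent loop following the update equations on the previous slide; the gradient and constraint callables, the toy problem at the bottom, and all step sizes are illustrative assumptions rather than Toomarian's actual implementation:

```python
import numpy as np

def steepest_descent(grad_e, e_error, u0, lam0, alpha=1.0, eta=0.01, iters=2000, tol=1e-6):
    """Iterate dU/dt = -dE/dsigma and dlambda/dt = E_error, with sigma = tanh(alpha * U)."""
    u, lam = np.array(u0, dtype=float), float(lam0)
    for _ in range(iters):
        sigma = np.tanh(alpha * u)
        g = grad_e(sigma, lam)      # dE/dsigma = dE_i/dsigma + lambda * dE_error/dsigma
        err = e_error(sigma)        # current constraint violation E_error
        u -= eta * g                # step in the direction of steepest descent
        lam += eta * err            # adjust the Lagrange multiplier
        if np.abs(g).max() < tol and abs(err) < tol:
            break                   # convergence: gradient and constraint both small
    return np.tanh(alpha * u), lam

# Toy stand-in problem: minimize sigma_0 + sigma_1 subject to sigma_0 = sigma_1.
grad = lambda s, lam: np.ones_like(s) + lam * np.array([1.0, -1.0])
err = lambda s: s[0] - s[1]
sigma, lam = steepest_descent(grad, err, u0=[0.3, -0.2], lam0=0.0)
```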

Comparison: Hopfield vs. Concurrent NN
The concurrent network converges faster than the Hopfield network.
Its probability of reaching a valid tour is higher than that of the Hopfield network.
The Hopfield network has no systematic way to determine the constant terms (A, B, C, D).

Comparison: SOM and Concurrent NN
The data set consists of 52 cities in Germany and a subset of 15 of them.
Both algorithms were run 80 times on the 15-city data set.
The 52-city data set could be analyzed only with SOM; the Concurrent Neural Network failed to analyze it.

Results
The Concurrent Neural Network always converged and never missed a city, whereas SOM is capable of missing cities.
The Concurrent Neural Network is very erratic in behavior, whereas SOM detects every link in the shortest path more reliably.
Overall, the Concurrent Neural Network performed poorly compared to SOM.

Shortest paths generated
(Figures: Concurrent Neural Network, 2127 km; Self-Organizing Map, 1311 km)

Behavior in terms of probability
(Figures: Concurrent Neural Network vs. Self-Organizing Map)

Conclusion
The Hopfield network can also be used for optimization problems.
The Concurrent Neural Network performs better than the Hopfield network and uses fewer neurons.
Both the Concurrent and the Hopfield network are less efficient than SOM for solving TSP.

References
N. K. Bose and P. Liang, "Neural Network Fundamentals with Graphs, Algorithms and Applications", Tata McGraw-Hill, 1996.
P. D. Wasserman, "Neural Computing: Theory and Practice", Van Nostrand Reinhold Co., 1989.
N. Toomarian, "A Concurrent Neural Network Algorithm for the Traveling Salesman Problem", ACM Proceedings of the Third Conference on Hypercube Concurrent Computers and Applications, pp. , 1988.

References
R. Reilly, "Neural Network Approach to Solving the Traveling Salesman Problem", Journal of Computing Sciences in Colleges, pp. , October 2003.
Wolfram Research Inc., "Tutorial on Neural Networks", ks/NeuralNetworkTheory/2.7.0.html.
Prof. P. Bhattacharyya, "Introduction to Computing with Neural Nets".

NP-complete and NP-hard
When the decision version of a combinatorial optimization problem is proved to belong to the class of NP-complete problems, which includes well-known problems such as satisfiability, traveling salesman, and the bin packing problem, then the optimization version is NP-hard.

NP-complete and NP-hard
"Is there a tour with length less than k?" is NP-complete: it is easy to check whether a proposed certificate has length less than k.
The optimization problem "What is the shortest tour?" is NP-hard, since there is no easy way to determine whether a certificate is the shortest.

Path lengths
(Figures: Concurrent Neural Network vs. Self-Organizing Map)