CS621: Artificial Intelligence Pushpak Bhattacharyya CSE Dept., IIT Bombay Lecture 7: Traveling Salesman Problem as search; Simulated Annealing; Comparison with GA

4-city TSP
[Figure: a graph on four cities (1, 2, 3, 4) with edge weights such as d12, d23, d31, d14; dij is not necessarily equal to dji.]

TSP: State Representation
A state is an n × n 0/1 matrix xiα, where `i' varies over cities and `α' varies over positions in the tour; xiα = 1 means city i is visited at position α.
[Figure: an example state matrix with rows indexed by city (i) and columns by position (α).]

Objective Functions
Minimize F = F1 + F2
F1 = k1 ∑i ( (∑α xiα) − 1 )²  +  k2 ∑β ( (∑j xjβ) − 1 )²        ... (1a), (1b)
F2 = k3 ∑i ∑j ∑α dij · xiα · (xj,α+1 + xj,α−1)        ... (2)
F1 penalises states that are not valid tours (each city must occupy exactly one position, and each position must hold exactly one city); F2 is proportional to the length of the tour.
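As a concrete illustration, here is a minimal Python sketch (not from the slides) that builds the 0/1 state matrix xiα from a tour and evaluates F1 and F2 as written above. The distance matrix, the weights k1, k2, k3, and the 0-based city numbering are placeholder assumptions for the example.

```python
import numpy as np

def state_matrix(tour, n):
    """Build the n x n 0/1 matrix x[i, alpha] = 1 iff city i is at position alpha."""
    x = np.zeros((n, n))
    for alpha, city in enumerate(tour):
        x[city, alpha] = 1.0
    return x

def objective(x, d, k1=1.0, k2=1.0, k3=1.0):
    """Evaluate F = F1 + F2 for a state matrix x and distance matrix d."""
    n = x.shape[0]
    # F1: penalties for violating the permutation constraints (eqs. 1a, 1b)
    f1 = k1 * ((x.sum(axis=1) - 1.0) ** 2).sum() \
       + k2 * ((x.sum(axis=0) - 1.0) ** 2).sum()
    # F2: tour-length term (eq. 2); positions wrap around, so alpha +/- 1 is taken mod n
    f2 = 0.0
    for i in range(n):
        for j in range(n):
            for a in range(n):
                f2 += d[i, j] * x[i, a] * (x[j, (a + 1) % n] + x[j, (a - 1) % n])
    return f1 + k3 * f2

# Example: the state for the tour 1 -> 3 -> 4 -> 2 (cities numbered from 0 here)
d = np.array([[0, 2, 5, 7],
              [2, 0, 4, 3],
              [5, 4, 0, 6],
              [7, 3, 6, 0]], dtype=float)   # placeholder distances
x = state_matrix([0, 2, 3, 1], n=4)
print(objective(x, d))
```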

Travelling Salesperson Problem through Simulated Annealing: State Representation

Position:  1  2  3  4
City:      1  3  4  2

This state encodes the tour 1 → 3 → 4 → 2.

State/Node Expansion
[Figure: the parent state (tour 1 → 3 → 4 → 2) is expanded into children states by exchanging pairs of cities in the tour (3 ↔ 2, 4 ↔ 2, ...), each exchange giving a new candidate tour.]
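The expansion step can be sketched in Python as follows (an illustrative sketch, assuming the adjacent-swap operator mentioned on the Metropolis slide below; the example tour is the one from the state-representation slide):

```python
def expand(tour):
    """Generate children of a tour by swapping each pair of adjacent cities
    (the 'inversion of adjacent cities' operator)."""
    children = []
    n = len(tour)
    for a in range(n):
        child = list(tour)
        child[a], child[(a + 1) % n] = child[(a + 1) % n], child[a]
        children.append(child)
    return children

print(expand([1, 3, 4, 2]))
# [[3, 1, 4, 2], [1, 4, 3, 2], [1, 3, 2, 4], [2, 3, 4, 1]]
```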

State-Energy diagram

Metropolis Algorithm
1) Initialize: start with a random state matrix S. Compute the objective function value at S; call this the energy of the state, E(S).
2) Transform the state by applying an operator (for TSP, inversion of adjacent cities) to obtain a new state Snew.
3) Compute the change in energy ΔE = Enew − Eold.
4) If ΔE ≤ 0, accept the new state Snew.
5) Else, accept Snew with probability exp(−ΔE / (KB·T)), where T is the "temperature" and KB the Boltzmann constant.

Metropolis Algorithm (contd)
6) Continue steps 2–5 until there is no appreciable change in energy.
7) The current state may be one of the local minima.
8) Increase the temperature and continue steps 2–7 until the global minimum is reached.
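The whole procedure can be condensed into a short Python sketch (an illustrative reading of steps 1–8, not code from the slides). For brevity it works directly on a tour and uses the tour length as the energy instead of the penalty form F = F1 + F2, folds KB into T, and uses a simple geometric cooling schedule as a placeholder; the slides' reheating strategy can be added by raising T again whenever the energy stops improving.

```python
import math
import random

def tour_length(tour, d):
    """Energy of a state: total length of the (cyclic) tour."""
    return sum(d[tour[a]][tour[(a + 1) % len(tour)]] for a in range(len(tour)))

def metropolis_step(tour, d, T):
    """Steps 2-5: propose a neighbour by swapping adjacent cities and
    accept it according to the Metropolis criterion."""
    a = random.randrange(len(tour))
    new = list(tour)
    new[a], new[(a + 1) % len(new)] = new[(a + 1) % len(new)], new[a]
    dE = tour_length(new, d) - tour_length(tour, d)
    if dE <= 0 or random.random() <= math.exp(-dE / T):
        return new
    return tour

def simulated_annealing(d, T=10.0, steps=5000, cooling=0.999):
    """Step 1: random initial state; then repeat steps 2-5 while the
    temperature is slowly changed (here: geometric cooling)."""
    tour = list(range(len(d)))
    random.shuffle(tour)
    for _ in range(steps):
        tour = metropolis_step(tour, d, T)
        T = max(T * cooling, 1e-3)
    return tour, tour_length(tour, d)

d = [[0, 2, 5, 7], [2, 0, 4, 3], [5, 4, 0, 6], [7, 3, 6, 0]]  # placeholder distances
print(simulated_annealing(d))
```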

How to probabilistically accept a state?
Suppose the acceptance probability is p.
Generate a random number r from the uniform distribution on [0, 1].
If r falls in the range [0, p], accept the new state; else continue the search from the old state itself.

Why?
The significance of p (= exp(−ΔE / (KB·T))) is that if the new state were generated an infinite number of times, a proportion p of those generations would result in the concerned new state.
[Figure: the uniform distribution on [0, 1], with the sub-interval [0, p] marked.]

Why? (contd)
If numbers are drawn uniformly at random from [0, 1], a fraction p of them will fall in the range [0, p]. Hence drawing a uniform random number and comparing it with p simulates accepting the new state with probability p.
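A quick numerical check of this claim (an illustrative sketch with a placeholder value of p, not from the slides): draw many uniform samples and measure how often they land in [0, p].

```python
import random

p = 0.3                      # example acceptance probability
n = 100_000                  # number of uniform draws
hits = sum(random.random() <= p for _ in range(n))
print(hits / n)              # close to p (~0.3): the draw-and-compare trick
                             # accepts with probability p
```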

Compare with Roulette Wheel Algorithm for Selection

Chromosome   Fitness   % of total
1             6.82       31
2             1.11        5
3             8.48       38
4             2.57       12
5             3.08       14
Total        22.0       100

Acknowledgement: http://www.edc.ncl.ac.uk/highlight/rhjanuary2007g02.php/

Roulette Wheel Selection
Let i = 1, where i denotes the chromosome index;
Calculate P(xi) using proportional selection;
sum = P(xi);
choose r ~ U(0, 1);
while sum < r do
    i = i + 1;            (i.e. move to the next chromosome)
    sum = sum + P(xi);
end
return xi as one of the selected parents;
repeat until all parents are selected
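A compact Python version of this selection procedure (a sketch assuming proportional selection, i.e. P(xi) = fitness of xi divided by total fitness; the fitness values are taken from the table above, and the function name is illustrative):

```python
import random

def roulette_wheel_select(fitness):
    """Select one chromosome index with probability proportional to its fitness."""
    total = sum(fitness)
    r = random.uniform(0, total)      # equivalent to r ~ U(0,1) scaled by the total
    cumulative = 0.0
    for i, f in enumerate(fitness):
        cumulative += f
        if cumulative >= r:
            return i
    return len(fitness) - 1           # guard against floating-point rounding

fitness = [6.82, 1.11, 8.48, 2.57, 3.08]          # values from the table above
parents = [roulette_wheel_select(fitness) for _ in range(4)]
print(parents)                                     # e.g. [2, 0, 2, 4]
```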

Significance of "temperature"
We have a pseudo temperature T; T is a parameter of the algorithm.
As T increases, so does the probability exp(−ΔE / (KB·T)) of accepting a higher-energy state.
When stuck in a local minimum, we increase the temperature: the probability of going to a higher-energy state increases, and the search is shaken out of the local minimum.
This is similar to the annealing of metal.
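A small numeric illustration of this effect (placeholder numbers, with KB absorbed into T): for a fixed uphill move ΔE, the acceptance probability exp(−ΔE/T) grows with T.

```python
import math

dE = 5.0                                  # fixed uphill move in energy
for T in (1.0, 5.0, 20.0, 100.0):
    print(T, math.exp(-dE / T))
# T = 1   -> ~0.0067  (uphill moves almost never accepted)
# T = 100 -> ~0.95    (uphill moves almost always accepted)
```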

Annealing of Metal
The metal should have a stable crystal structure so that it is not brittle.
To achieve this, it is repeatedly heated and then cooled slowly.