MAE 552 – Heuristic Optimization Lecture 8 February 8, 2002.


[Figure: a ball resting in a landscape of basins labeled A, B, and C.]
Start with a ball at point A. Shake it up and it might jump out of A and into B. Give it another shake (adding energy) and it might go to C. This is the general idea behind SA.

The SA Algorithm

T0: m1,0, m2,0, m3,0, m4,0, ..., mm,0
T1: m1,1, m2,1, m3,1, m4,1, ..., mm,1
T2: m1,2, m2,2, m3,2, m4,2, ..., mm,2
T3: m1,3, m2,3, m3,3, m4,3, ..., mm,3
T4: m1,4, m2,4, m3,4, m4,4, ..., mm,4
T5: m1,5, m2,5, m3,5, m4,5, ..., mm,5
...
Tn: m1,n, m2,n, m3,n, m4,n, ..., mm,n

n = number of levels in the cooling schedule
m = number of transitions in each Markov chain
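Each row above is one inner loop: at temperature Tk the algorithm attempts m transitions (one Markov chain), then the control parameter is reduced. The whole scheme is two nested loops. The Python sketch below is illustrative only; the objective f, the neighbor function, and all parameter values are placeholders rather than anything specified in the lecture:

```python
import math
import random

def simulated_annealing(f, x0, neighbor, T0=100.0, r=0.9,
                        n_levels=50, m_transitions=100):
    """Generic SA skeleton: n_levels cooling steps (outer loop), each a
    Markov chain of m_transitions attempted moves at a fixed T (inner loop)."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    T = T0
    for _ in range(n_levels):              # outer loop: cooling schedule
        for _ in range(m_transitions):     # inner loop: one Markov chain at T
            x_new = neighbor(x, T)         # move set generator
            f_new = f(x_new)
            # Metropolis acceptance criterion
            if f_new <= fx or random.random() < math.exp(-(f_new - fx) / T):
                x, fx = x_new, f_new
                if fx < fbest:
                    best, fbest = x, fx
        T *= r                             # decrement rule: T_{k+1} = r * T_k
    return best, fbest
```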

Simulated Annealing – Parts of the SA

The following must be specified when implementing SA:
1. An unambiguous description of the objective function f (analogous to energy) and any constraints.
2. A clear representation of the design vector (analogous to the configuration of a solid) over which an optimum is sought.
3. A 'cooling schedule' – this includes the starting value of the control parameter, T0, rules to determine when the current value of the control parameter should be reduced and by how much (the 'decrement rule'), and a stopping criterion to determine when the optimization process should be terminated.

Simulated Annealing – Parts of the SA

4. A 'move set generator', which generates candidate points.
5. An 'acceptance criterion', which decides whether or not a new move is accepted.

Together, 4 and 5 are called the 'transition mechanism', which results in the transformation of a current state into a subsequent one.
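In a minimization setting the usual acceptance criterion is the Metropolis rule: an improving move is always accepted, and a worsening move is accepted with probability exp(-delta/T). The helper below isolates that rule from the skeleton above; it is a sketch, not the lecture's own code:

```python
import math
import random

def metropolis_accept(f_current, f_candidate, T):
    """Metropolis acceptance criterion for minimization: always accept an
    improving move; accept a worsening move with probability exp(-delta/T),
    which shrinks as the control parameter T is lowered."""
    delta = f_candidate - f_current
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / T)
```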

Simulated Annealing – Cooling Schedule

As it proceeds, SA generates a sequence of points X0, X1, X2, X3, ... with corresponding function values f(X0), f(X1), f(X2), f(X3), .... Because of the stochastic nature of SA, the sequence of f values is random and not monotonic. However, it does drift towards the optimum because of the gradual reduction in the control parameter.

Cooling Schedules

A cooling schedule is used to achieve convergence to a global optimum in function optimization. It describes how the control parameter T changes during the optimization process. First let us look at the concept of the acceptance ratio, X(Tk):

X(Tk) = (# of accepted moves) / (# of attempted moves)

If T is large, almost all moves are accepted: X(Tk) -> 1.
As T decreases, fewer moves are accepted: X(Tk) -> 0.

For maximum efficiency, it is important to set a proper value of T0.
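The acceptance ratio can be estimated empirically by running a short Markov chain at fixed T and counting accepted moves. A minimal sketch, reusing the metropolis_accept helper above (the chain length of 200 is an arbitrary illustrative choice):

```python
def acceptance_ratio(f, x, neighbor, T, n_attempts=200):
    """Estimate X(T): attempt n_attempts moves at fixed T and return
    (# of accepted moves) / (# of attempted moves)."""
    accepted = 0
    fx = f(x)
    for _ in range(n_attempts):
        x_new = neighbor(x, T)
        f_new = f(x_new)
        if metropolis_accept(fx, f_new, T):
            x, fx = x_new, f_new
            accepted += 1
    return accepted / n_attempts
```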

Simulated Annealing – Cooling Schedule

There are 3 parts to a cooling schedule:
1. The starting value of the control parameter, T0. It should be large enough to 'melt' the objective function, i.e. to leap over all peaks. This is accomplished by ensuring that the initial acceptance ratio X(T0) is close to 1.0 (most random moves are accepted). In practice: start the SA algorithm at some T0 and execute some number of transitions, then check X(T0). If it is not close to 1.0, multiply T0 by a factor greater than 1.0 and execute again. Repeat until X(T0) is close to 1.0.
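This 'melting' procedure is easy to automate. A hedged sketch built on acceptance_ratio above; the heating factor of 1.5 and the target ratio of 0.95 are illustrative choices, not values from the lecture:

```python
def initial_temperature(f, x0, neighbor, T_start=1.0, heat_factor=1.5, target=0.95):
    """Raise T until the measured acceptance ratio X(T0) is close to 1.0,
    i.e. until the objective function has been 'melted'."""
    T = T_start
    while acceptance_ratio(f, x0, neighbor, T) < target:
        T *= heat_factor   # not hot enough: multiply by a factor > 1 and retry
    return T
```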

Simulated Annealing – Cooling Schedule

2. The decrement rule. There are two parts to this: when the control parameter reduction should occur, and by how much it should be reduced. If using Markov chains of fixed length, then once the total number of attempted moves at the current value of the control parameter (i.e. the inner loop) reaches a predetermined value, it is time to reduce the control parameter. A frequently used decrement function is:

Tk+1 = r * Tk, k = 0, 1, 2, ...

where r is the control parameter reduction coefficient, generally a constant between 0.8 and 0.99. For example, with T0 = 100 and r = 0.9 the schedule runs 100, 90, 81, 72.9, ...

Simulated Annealing – Cooling Schedule

The coefficient r can also be set based on the problem size and characteristics. Spears set r = 1/(Num_dvs * k), where Num_dvs is the number of design variables and k is the current step in the cooling schedule. All settings of simulated annealing entail a tradeoff between searching thoroughly at a particular level of T and the number of steps in the cooling schedule.
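As a sketch of how that adaptive coefficient could be computed (the formula is the one quoted above; the function and variable names are illustrative):

```python
def spears_r(num_dvs, k):
    """Adaptive reduction coefficient r = 1 / (Num_dvs * k), where num_dvs
    is the number of design variables and k >= 1 is the current step in
    the cooling schedule."""
    return 1.0 / (num_dvs * k)

# Example: with 5 design variables, steps k = 1..4 give
# r = 0.2, 0.1, 0.0667, 0.05 -- the schedule cools more aggressively
# as it proceeds.
```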

The SA Algorithm

T0: m1,0, m2,0, m3,0, m4,0, ..., mm,0
T1: m1,1, m2,1, m3,1, m4,1, ..., mm,1
T2: m1,2, m2,2, m3,2, m4,2, ..., mm,2
T3: m1,3, m2,3, m3,3, m4,3, ..., mm,3
T4: m1,4, m2,4, m3,4, m4,4, ..., mm,4
T5: m1,5, m2,5, m3,5, m4,5, ..., mm,5

n = number of levels in the cooling schedule
m = number of transitions in each Markov chain

Lengthening the rows increases the number of transitions in each Markov chain; adding rows increases the number of steps in the cooling schedule.

Simulated Annealing – Cooling Schedule

No matter how sophisticated the decrement rule, it is important to reach a balance between rapid decrement of the control parameter and short Markov chains.

3. Stopping criterion. Rule of thumb: if the improvement in the objective function after a period of execution remains fairly constant, stop the algorithm. Equivalently, if the final configurations of several consecutive inner loops have been very close to each other, it is time to stop.
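One way to code the rule of thumb is to track the best objective value found in each inner loop and stop when it stagnates. A minimal sketch; the window of 5 chains and the tolerance are illustrative assumptions:

```python
def should_stop(best_history, window=5, tol=1e-6):
    """Stagnation test: stop when the best objective value from each of the
    last `window` Markov chains has varied by less than `tol`."""
    if len(best_history) < window:
        return False
    recent = best_history[-window:]
    return max(recent) - min(recent) < tol
```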

Simulated Annealing – Transition Mechanism

A transition mechanism transforms a current state into a subsequent one. It consists of two parts:
(a) a move set generator, and
(b) an acceptance criterion.

Simulated Annealing – Move Set Generator

(a) The move set generator:
– Generates a random point X' from the neighborhood of the current point Xc.
– Its move (step) generation depends on the data type and on the current value of the control parameter Tk.
– For high values of Tk, almost all attempted moves are accepted, so it is inefficient to use a small neighborhood: it causes slow progress of the algorithm.
– Conversely, for small values of Tk, more attempted moves are rejected if a large neighborhood is used.
– The size of the move should therefore decrease as the control parameter is reduced. This improves computational efficiency (see the sketch after this list).
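For continuous design variables, one simple way to realize this shrinking neighborhood is to make the step bounds proportional to T. A hedged sketch; the proportionality constant `scale` is an illustrative assumption:

```python
import random

def neighbor_uniform(x, T, scale=0.1):
    """Move set generator for continuous variables: perturb each design
    variable uniformly within a box whose half-width shrinks with T."""
    radius = scale * T
    return [xi + random.uniform(-radius, radius) for xi in x]
```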

Simulated Annealing – Move Set Generator

[Figure: large value of T, large neighborhood around the current point Xc (axes x1, x2).]

Simulated Annealing – Move Set Generator

[Figure: small value of T, small neighborhood around the current point (axes x1, x2).]

Simulated Annealing – Move Set Generator

Gaussian Neighborhoods
[Figure: a Gaussian distribution centered on the current point (axes x1, x2).]
Choose a candidate from the neighborhood based on a Gaussian distribution.
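A Gaussian version of the same idea samples each coordinate from a normal distribution centered on the current point, with a standard deviation tied to T. Again a sketch; the scale constant is an assumption:

```python
import random

def neighbor_gaussian(x, T, scale=0.1):
    """Gaussian move set generator: candidates near the current point are
    most likely, and the spread (sigma) shrinks as T is reduced."""
    sigma = scale * T
    return [random.gauss(xi, sigma) for xi in x]
```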

Simulated Annealing – Move Set Generator

Depending on the type of representation, controlling the size of the neighborhood entails different things. For the SAT problem, the representation is a string of binary values {TRUE, FALSE}. A one-flip neighborhood is defined as all of the points that can be reached by flipping one of the bits:

X = {0 1 0 1 1} -> X = {1 1 0 1 1}

A two-flip neighborhood flips two of the bits:

X = {0 1 0 1 1} -> X = {1 0 0 1 1}
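A k-flip move for a binary representation is straightforward to implement. A minimal sketch (bits encoded as 0/1 integers):

```python
import random

def neighbor_k_flip(x, k=1):
    """k-flip move for a binary (SAT-style) string: copy the assignment
    and flip k randomly chosen bits."""
    x_new = list(x)
    for i in random.sample(range(len(x_new)), k):
        x_new[i] = 1 - x_new[i]   # flip the chosen bit
    return x_new

# Example: neighbor_k_flip([0, 1, 0, 1, 1], k=1) might return [1, 1, 0, 1, 1].
```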

Simulated Annealing – Move Set Generator

For the NLP there is an infinite choice of move directions and magnitudes. One approach is to generate a random move along a single design variable each time, keeping all others constant:

Xc = [x1, x2, x3, x4] -> Xn = [x1new, x2, x3, x4]

Another approach is to change all design variables simultaneously:

Xc = [x1, x2, x3, x4] -> Xn = [x1new, x2new, x3new, x4new]
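Both strategies are easy to express as move set generators. The sketch below is illustrative; the step bound scale * T follows the shrinking-neighborhood idea above and is an assumption, not the lecture's prescription:

```python
import random

def move_single_variable(x, T, scale=0.1):
    """Perturb one randomly chosen design variable, holding the rest fixed."""
    x_new = list(x)
    i = random.randrange(len(x_new))
    x_new[i] += random.uniform(-scale * T, scale * T)
    return x_new

def move_all_variables(x, T, scale=0.1):
    """Perturb every design variable simultaneously."""
    return [xi + random.uniform(-scale * T, scale * T) for xi in x]
```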

Simulated Annealing – Constraint Handling

Exterior penalty function: [equation lost in transcription] where rp generally starts small and is gradually increased to ensure feasibility.

Interior penalty function: [equation lost in transcription] here the rp for the second term is the same as before, but for the first term it starts large and is gradually decreased.
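The penalty formulas themselves did not survive transcription. The standard textbook forms, consistent with the descriptions above, are reconstructed below; this is an assumption about what the original slide showed, with inequality constraints written g_j(X) <= 0 and equality constraints h_k(X) = 0:

```latex
% Exterior penalty: r_p starts small and is gradually increased
\Phi(X) = f(X) + r_p \Big[ \sum_j \big(\max\{0,\, g_j(X)\}\big)^2
                         + \sum_k h_k(X)^2 \Big]

% Interior penalty: the barrier coefficient r_p' starts large and is
% gradually decreased; the equality-constraint coefficient r_p behaves
% as in the exterior method
\Phi(X) = f(X) + r_p' \sum_j \frac{-1}{g_j(X)} + r_p \sum_k h_k(X)^2
```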