
1 MAE 552 – Heuristic Optimization Lecture 10, February 13, 2002

2 Simulated Annealing – Move Set Generator
(a) A move set generator:
– Generates a random point x' from the neighborhood of x_c.
– Its move (step) generation depends on the data type and on the current value of the control parameter T_k.
– For high values of T_k, almost all attempted moves are accepted, so a small neighborhood is inefficient: it slows the algorithm's progress.
– Conversely, for small values of T_k, more attempted moves are rejected if a large neighborhood is used.
– The size of the move should therefore decrease as the control parameter is reduced; this improves computational efficiency.

3 Simulated Annealing – Move Set Generator
Large value of T, large neighborhood. [Figure: a large neighborhood around the current point in the (x_1, x_2) design space.]

4 Simulated Annealing – Move Set Generator
Small value of T, small neighborhood. [Figure: a small neighborhood around the current point in the (x_1, x_2) design space.]

5 Simulated Annealing – Move Set Generator
Decreasing the size of the neighborhood: use a multiplier, in a similar fashion to the control parameter decrement rule:
N_size_new = r_n * N_size_old
r_n depends on the number of steps in the cooling schedule; typical values are 0.95–0.99. A sketch of this rule appears below.
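
A minimal Python sketch of the shrinking rule, run alongside the temperature schedule (all names and the default multipliers are illustrative assumptions, not from the slides):

```python
def anneal_schedules(T0, n_size0, r_t=0.95, r_n=0.97, n_steps=100):
    """Yield (temperature, neighborhood size) pairs, one per cooling step."""
    T, n_size = T0, n_size0
    for _ in range(n_steps):
        yield T, n_size
        T *= r_t        # control parameter decrement rule
        n_size *= r_n   # N_size_new = r_n * N_size_old, r_n ~ 0.95-0.99
```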

6 Simulated Annealing – Move Set Generator
For the NLP there are infinitely many choices of move direction and magnitude. One approach is to generate a random move along a single design variable at a time, keeping all others constant:
X_c = [x_1, x_2, x_3, x_4] -> X_n = [x_1new, x_2, x_3, x_4]
Another approach is to change all design variables simultaneously:
X_c = [x_1, x_2, x_3, x_4] -> X_n = [x_1new, x_2new, x_3new, x_4new]
Both strategies are sketched below.
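
A sketch of the two move strategies (hypothetical names; n_size is the current neighborhood size from the previous slide):

```python
import random

def move_one_variable(x_c, n_size):
    """Perturb a single randomly chosen design variable, keeping the rest fixed."""
    x_n = list(x_c)                     # copy the current point X_c
    i = random.randrange(len(x_n))      # pick one design variable
    x_n[i] += random.uniform(-n_size, n_size)
    return x_n

def move_all_variables(x_c, n_size):
    """Perturb every design variable simultaneously."""
    return [xi + random.uniform(-n_size, n_size) for xi in x_c]
```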

7 Simulated Annealing – Constraint Handling
Penalty Function Approach: assigns a penalty to all candidate solutions that fall outside of the feasible design space, transforming a constrained optimization problem into an unconstrained one.
Minimize: F(x)   (objective function)
Subject to: g_j(x) ≤ 0, j = 1, ..., m   (inequality constraints)
h_k(x) = 0, k = 1, ..., l   (equality constraints)
x_i_lower ≤ x_i ≤ x_i_upper, i = 1, ..., n   (side constraints)
where x = (x_1, x_2, ..., x_n) are the design variables

8 Simulated Annealing – Constraint Handling
Minimize: Φ(x)   (pseudo-objective)
where x = (x_1, x_2, ..., x_n) are the design variables
Φ(x) = F(x) + r_p * P(x)
F(x) – the original objective function
P(x) – the penalty term
r_p – the penalty multiplier
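
As a sketch, the pseudo-objective can be assembled as follows (function names are illustrative; the quadratic penalty term anticipates the exterior form on the next slide):

```python
def pseudo_objective(x, F, g_list, h_list, r_p):
    """Phi(x) = F(x) + r_p * P(x) for constrained SA."""
    P = sum(max(0.0, g(x)) ** 2 for g in g_list)   # only violated g_j(x) <= 0 contribute
    P += sum(h(x) ** 2 for h in h_list)            # any deviation from h_k(x) = 0 contributes
    return F(x) + r_p * P
```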

9 Simulated Annealing – Constraint Handling
Exterior Penalty Function: r_p generally starts small and is gradually increased to ensure feasibility.
Interior Penalty Function: here r_p for the second (equality) term behaves as before, but the multiplier on the first (barrier) term starts large and is gradually decreased.
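
The formulas themselves did not survive the transcript; a standard textbook pair consistent with the slide's description is (an assumption, not the original slide content):

```latex
% Exterior penalty: r_p starts small and increases
\Phi(x) = F(x) + r_p \Big[ \sum_{j=1}^{m} \max\big(0,\, g_j(x)\big)^2
                         + \sum_{k=1}^{l} h_k(x)^2 \Big]

% Interior (barrier) penalty: r_p' on the barrier term starts large
% and decreases, while r_p on the equality term increases as before
\Phi(x) = F(x) + r_p' \sum_{j=1}^{m} \frac{-1}{g_j(x)}
               + r_p \sum_{k=1}^{l} h_k(x)^2
```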

10 Review for Test 1
Friday, February 15th, 3:00–3:50.
Homework #2 is due Wed., Feb. 20th.
Enginet students – the test will be held Monday, Feb. 25th.
Short answer questions and simple problems on the behavior of the algorithms.
Special office hours: Thursday 9–11, 805 Furnas.

11 Review for Test 1
Material to be covered in Test 1:
Optimization basics – conditions for local and global optimality
Complexity theory
Why some problems are difficult to solve:
– Size of the search space
– Fidelity of the model
– Time-changing model
– Constraints

12 Review for Test 1
Basic complexity classes for optimization
Standard O-notation description of a problem's time complexity, e.g. O(n²)
P vs. NP-hard; intractability
Types of optimization algorithms
How to choose the right method

13 Review for Test 1
Three basic concepts common to every algorithmic approach to problem solving:
1. A representation of the problem
2. The objective
3. The evaluation function
Concept of a neighborhood search
Hill climbing methods – how they work, advantages and disadvantages

14 Review for Test 1
Hill climbing methods – how they work, advantages and disadvantages
Balancing local and global search
Pure random search – how it works, advantages and disadvantages

15 Review for Test 1
Exhaustive search methods – how they work, advantages and disadvantages; enumerating the NLP and TSP
Greedy search methods – how they work, advantages and disadvantages; implementing them for the NLP and TSP

16 Review for Test 1
Simulated annealing:
– The physical annealing process and the steps in the process
– The Metropolis Monte-Carlo algorithm

17 Review for Test 1
Parts of the SA:
1. An unambiguous description of the evaluation function (analogous to energy) and possible constraints.
2. A clear representation of the design vector (analogous to the configuration of a solid) over which an optimum is sought.
3. A 'cooling schedule' – this includes the starting value of the control parameter, T_0; rules to determine when the current value of the control parameter should be reduced and by how much (the 'decrement rule'); and a stopping criterion to determine when the optimization process should be terminated.

18 The SA Algorithm
4. A 'move set generator' which generates candidate points.
5. An 'acceptance criterion' which decides whether or not a new move is accepted.
Steps 4 and 5 together are called the 'transition mechanism', which transforms a current state into a subsequent one; a sketch of one transition follows below.
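
A minimal Python sketch of one such transition, assuming a problem-specific evaluate function and move generator (both hypothetical placeholders):

```python
import math
import random

def transition(x_c, T, evaluate, move):
    """One generate-and-accept cycle (steps 4 and 5 above)."""
    x_n = move(x_c)                         # step 4: candidate from the move set
    delta = evaluate(x_n) - evaluate(x_c)   # change in the evaluation function
    # Step 5: Metropolis criterion - always accept improvements,
    # accept uphill moves with probability exp(-delta / T).
    if delta <= 0 or random.random() < math.exp(-delta / T):
        return x_n
    return x_c
```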

19 The SA Algorithm How the SA escapes local optima.

